https://github.com/longxingtan/time-series-prediction

tfts: Time Series Deep Learning Models in TensorFlow


Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.7%) to scientific vocabulary

Keywords

deep-learning forecasting keras-forecasting keras-prediction keras-time-series machine-learning neural-network prediction seq2seq tensorflow tf2 time-series time-series-forecast time-series-forecasting timeseries transformer wavenet
Last synced: 5 months ago

Repository

tfts: Time Series Deep Learning Models in TensorFlow

Basic Info
Statistics
  • Stars: 857
  • Watchers: 22
  • Forks: 169
  • Open Issues: 12
  • Releases: 14
Topics
deep-learning forecasting keras-forecasting keras-prediction keras-time-series machine-learning neural-network prediction seq2seq tensorflow tf2 time-series time-series-forecast time-series-forecasting timeseries transformer wavenet
Created almost 8 years ago · Last pushed 7 months ago
Metadata Files
Readme Changelog Contributing License

README.md



Documentation | Tutorials | Release Notes | 中文 (Chinese)

TFTS (TensorFlow Time Series) is an easy-to-use time series package supporting classical and state-of-the-art deep learning methods in TensorFlow or Keras.

- Supports SOTA models for time series tasks (prediction, classification, anomaly detection)
- Provides advanced deep learning models for industry, research, and competitions
- Documentation lives at time-series-prediction.readthedocs.io

Tutorial

Installation

  • python >= 3.7
  • tensorflow >= 2.4

```shell
pip install tfts
```
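As a quick sanity check after installing, the resolved version can be printed with the standard library (a minimal sketch; it avoids assuming a `tfts.__version__` attribute):

```python
# Verify the installation: import the package and query its installed version
# via importlib.metadata from the Python standard library.
from importlib.metadata import version

import tfts  # raises ImportError if the install failed

print(version("tfts"))
```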

Quick start

Open In Colab Open in Kaggle

```python
import matplotlib.pyplot as plt
import tensorflow as tf
import tfts
from tfts import AutoModel, AutoConfig, KerasTrainer

train_length = 24
predict_sequence_length = 8
(x_train, y_train), (x_valid, y_valid) = tfts.get_data("sine", train_length, predict_sequence_length, test_size=0.2)

model_name_or_path = 'seq2seq'  # 'wavenet', 'transformer', 'rnn', 'tcn', 'bert', 'dlinear', 'nbeats', 'informer', 'autoformer'
config = AutoConfig.for_model(model_name_or_path)
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model, optimizer=tf.keras.optimizers.Adam(0.0007))
trainer.train((x_train, y_train), (x_valid, y_valid), epochs=30)

pred = trainer.predict(x_valid)
trainer.plot(history=x_valid, true=y_valid, pred=pred)
plt.show()
```
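Beyond the plot, a simple numeric check of forecast quality can be computed directly on the returned arrays. A minimal sketch, assuming `pred` and `y_valid` share the shape `(batch, predict_sequence_length, 1)`:

```python
import numpy as np

# Root-mean-square error over all forecast steps; assumes pred and y_valid
# are array-like objects of identical shape.
rmse = np.sqrt(np.mean((np.asarray(pred) - np.asarray(y_valid)) ** 2))
print(f"validation RMSE: {rmse:.4f}")
```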

Prepare your own data

You can train on your own data by preparing 3D arrays for both inputs and targets:

- Option 1: np.ndarray
- Option 2: tf.data.Dataset

A sliding-window sketch for turning a raw series into such arrays follows below.
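The shapes expected in the examples below are `(batch, train_length, feature)` for inputs and `(batch, predict_sequence_length, 1)` for targets. One common way to obtain them from a single raw series is a sliding window; this is a minimal NumPy sketch (the helper here is illustrative and not part of the tfts API):

```python
import numpy as np

def make_windows(series, train_length, predict_sequence_length):
    """Slice a 1D series into (inputs, targets) windows for supervised training.

    Illustrative helper only; tfts itself does not ship this function.
    """
    x, y = [], []
    total = train_length + predict_sequence_length
    for start in range(len(series) - total + 1):
        x.append(series[start : start + train_length])
        y.append(series[start + train_length : start + total])
    x = np.array(x)[..., np.newaxis]  # (batch, train_length, 1)
    y = np.array(y)[..., np.newaxis]  # (batch, predict_sequence_length, 1)
    return x, y

series = np.sin(np.linspace(0, 20 * np.pi, 500))
x_all, y_all = make_windows(series, train_length=24, predict_sequence_length=8)
print(x_all.shape, y_all.shape)  # (469, 24, 1) (469, 8, 1)
```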

Encoder only model inputs

```python
import numpy as np
from tfts import AutoConfig, AutoModel, KerasTrainer

train_length = 24
predict_sequence_length = 8
n_feature = 2

x_train = np.random.rand(1, train_length, n_feature)  # inputs: (batch, train_length, feature)
y_train = np.random.rand(1, predict_sequence_length, 1)  # target: (batch, predict_sequence_length, 1)
x_valid = np.random.rand(1, train_length, n_feature)
y_valid = np.random.rand(1, predict_sequence_length, 1)

config = AutoConfig.for_model('rnn')
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model)
trainer.train(train_dataset=(x_train, y_train), valid_dataset=(x_valid, y_valid), epochs=1)
```

Encoder-decoder model inputs

```python
# option1: np.ndarray
import numpy as np
from tfts import AutoConfig, AutoModel, KerasTrainer

train_length = 24
predict_sequence_length = 8
n_encoder_feature = 2
n_decoder_feature = 3

x_train = (
    np.random.rand(1, train_length, 1),  # inputs: (batch, train_length, 1)
    np.random.rand(1, train_length, n_encoder_feature),  # encoder_feature: (batch, train_length, encoder_features)
    np.random.rand(1, predict_sequence_length, n_decoder_feature),  # decoder_feature: (batch, predict_sequence_length, decoder_features)
)
y_train = np.random.rand(1, predict_sequence_length, 1)  # target: (batch, predict_sequence_length, 1)

x_valid = (
    np.random.rand(1, train_length, 1),
    np.random.rand(1, train_length, n_encoder_feature),
    np.random.rand(1, predict_sequence_length, n_decoder_feature),
)
y_valid = np.random.rand(1, predict_sequence_length, 1)

config = AutoConfig.for_model("seq2seq")
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model)
trainer.train((x_train, y_train), (x_valid, y_valid), epochs=1)
```

```python
# option2: tf.data.Dataset
import numpy as np
import tensorflow as tf
from tfts import AutoConfig, AutoModel, KerasTrainer

class FakeReader(object):
    def __init__(self, predict_sequence_length):
        train_length = 24
        n_encoder_feature = 2
        n_decoder_feature = 3
        self.x = np.random.rand(15, train_length, 1)
        self.encoder_feature = np.random.rand(15, train_length, n_encoder_feature)
        self.decoder_feature = np.random.rand(15, predict_sequence_length, n_decoder_feature)
        self.target = np.random.rand(15, predict_sequence_length, 1)

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return {
            "x": self.x[idx],
            "encoder_feature": self.encoder_feature[idx],
            "decoder_feature": self.decoder_feature[idx],
        }, self.target[idx]

    def iter(self):
        for i in range(len(self.x)):
            yield self[i]

predict_sequence_length = 10
train_reader = FakeReader(predict_sequence_length=predict_sequence_length)
train_loader = tf.data.Dataset.from_generator(
    train_reader.iter,
    ({"x": tf.float32, "encoder_feature": tf.float32, "decoder_feature": tf.float32}, tf.float32),
)
train_loader = train_loader.batch(batch_size=1)
valid_reader = FakeReader(predict_sequence_length=predict_sequence_length)
valid_loader = tf.data.Dataset.from_generator(
    valid_reader.iter,
    ({"x": tf.float32, "encoder_feature": tf.float32, "decoder_feature": tf.float32}, tf.float32),
)
valid_loader = valid_loader.batch(batch_size=1)

config = AutoConfig.for_model("seq2seq")
model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
trainer = KerasTrainer(model)
trainer.train(train_dataset=train_loader, valid_dataset=valid_loader, epochs=1)
```

Prepare custom model config

```python
from tfts import AutoModel, AutoConfig

config = AutoConfig.for_model('rnn')
print(config)
config.rnn_hidden_size = 128

model = AutoModel.from_config(config, predict_sequence_length=7)
```

Build your own model

Full list of models supported by tfts AutoModel (a comparison sketch follows below):

- rnn
- tcn
- bert
- nbeats
- dlinear
- seq2seq
- wavenet
- transformer
- informer
- autoformer
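Since these backbones share the AutoConfig/AutoModel interface used in the quick start, a small loop can train several of them on the bundled sine data for a rough comparison. A minimal sketch under that assumption (training budget and model choices are illustrative):

```python
import tensorflow as tf
import tfts
from tfts import AutoConfig, AutoModel, KerasTrainer

train_length, predict_sequence_length = 24, 8
(x_train, y_train), (x_valid, y_valid) = tfts.get_data("sine", train_length, predict_sequence_length, test_size=0.2)

# Try a few backbones with identical data and training budget.
for name in ["rnn", "seq2seq", "wavenet"]:
    config = AutoConfig.for_model(name)
    model = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
    trainer = KerasTrainer(model, optimizer=tf.keras.optimizers.Adam(0.003))
    trainer.train((x_train, y_train), (x_valid, y_valid), epochs=5)
    pred = trainer.predict(x_valid)
    print(name, float(tf.reduce_mean(tf.square(pred - y_valid))))  # validation MSE
```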

You can build a custom model on top of tfts, for example:

- add custom-defined embeddings for categorical variables
- add custom-defined head layers for classification or anomaly detection tasks

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tfts import AutoModel, AutoConfig

train_length = 24
num_train_features = 15
predict_sequence_length = 8

def build_model():
    inputs = Input([train_length, num_train_features])
    config = AutoConfig.for_model("seq2seq")
    backbone = AutoModel.from_config(config, predict_sequence_length=predict_sequence_length)
    outputs = backbone(inputs)
    outputs = Dense(1, activation="sigmoid")(outputs)
    model = tf.keras.Model(inputs=inputs, outputs=outputs)
    model.compile(loss="mse", optimizer="rmsprop")
    return model
```
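A usage sketch for the function above, fitting the compiled model on random data just to confirm the shapes line up (purely illustrative; it assumes the backbone maps the encoder input to `predict_sequence_length` output steps):

```python
import numpy as np

model = build_model()
x = np.random.rand(4, train_length, num_train_features)  # (batch, train_length, features)
y = np.random.rand(4, predict_sequence_length, 1)        # (batch, predict_sequence_length, 1)
model.fit(x, y, epochs=1, batch_size=2)
model.summary()
```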

Examples

Citation

If you find the tfts project useful in your research, please consider citing it:

@misc{tfts2020,
  author = {Longxing Tan},
  title = {Time series prediction},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/longxingtan/time-series-prediction}},
}

Owner

  • Login: LongxingTan
  • Kind: user
  • Location: China

A slow researcher

GitHub Events

Total
  • Create event: 21
  • Issues event: 1
  • Release event: 5
  • Watch event: 45
  • Delete event: 18
  • Issue comment event: 21
  • Push event: 228
  • Pull request event: 34
  • Fork event: 6
Last Year
  • Create event: 21
  • Issues event: 1
  • Release event: 5
  • Watch event: 45
  • Delete event: 18
  • Issue comment event: 21
  • Push event: 228
  • Pull request event: 34
  • Fork event: 6

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 66
  • Total Committers: 4
  • Avg Commits per committer: 16.5
  • Development Distribution Score (DDS): 0.091
Past Year
  • Commits: 21
  • Committers: 2
  • Avg Commits per committer: 10.5
  • Development Distribution Score (DDS): 0.048
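The Development Distribution Score (DDS) reported here appears to follow the usual definition of one minus the top committer's share of commits; treating that definition as an assumption, the all-time figure can be reproduced from the committer table below:

```python
# Assumed definition: DDS = 1 - (commits by top committer / total commits).
def dds(top_committer_commits: int, total_commits: int) -> float:
    return 1 - top_committer_commits / total_commits

# All time: 60 of 66 commits come from the top committer (see the table below).
print(round(dds(60, 66), 3))  # 0.091
```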
Top Committers
Name Email Commits
LongxingTan t****8@1****m 60
Longxing Tan l****n@L****l 4
Hongying Yue 4****e 1
Longxing Tan l****n@t****l 1
Committer Domains (Top 20 + Academic)
163.com: 1

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 20
  • Total pull requests: 60
  • Average time to close issues: 9 months
  • Average time to close pull requests: 4 days
  • Total issue authors: 19
  • Total pull request authors: 4
  • Average comments per issue: 2.6
  • Average comments per pull request: 0.7
  • Merged pull requests: 49
  • Bot issues: 0
  • Bot pull requests: 5
Past Year
  • Issues: 1
  • Pull requests: 29
  • Average time to close issues: N/A
  • Average time to close pull requests: about 3 hours
  • Issue authors: 1
  • Pull request authors: 3
  • Average comments per issue: 0.0
  • Average comments per pull request: 0.69
  • Merged pull requests: 27
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • forestbat (2)
  • raphaelcp (1)
  • sifuhr (1)
  • XionZi (1)
  • chamberSccc (1)
  • zhangxjohn (1)
  • MichaelRinger (1)
  • Dylan-Dyb (1)
  • pavelxx1 (1)
  • swift88-clone (1)
  • gitgud (1)
  • Albert0sans (1)
  • Nodon447 (1)
  • helonin (1)
  • dingchaoyue (1)
Pull Request Authors
  • LongxingTan (51)
  • dependabot[bot] (5)
  • hongyingyue (2)
  • imu1984 (2)
Top Labels
Issue Labels
question (9) enhancement (1) bug (1)
Pull Request Labels
dependencies (5) duplicate (1)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 157 last-month
  • Total docker downloads: 343
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 20
  • Total maintainers: 1
pypi.org: tfts

Deep learning time series with TensorFlow

  • Versions: 20
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 157 Last month
  • Docker Downloads: 343
Rankings
Stargazers count: 2.4%
Forks count: 4.1%
Dependent packages count: 6.6%
Average: 12.8%
Downloads: 20.2%
Dependent repos count: 30.6%
Maintainers (1)
Last synced: 6 months ago

Dependencies

.github/workflows/codeql-analysis.yml actions
  • actions/checkout v3 composite
  • github/codeql-action/analyze v2 composite
  • github/codeql-action/autobuild v2 composite
  • github/codeql-action/init v2 composite
.github/workflows/lint.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/pypi_release.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/test.yml actions
  • actions/cache v2 composite
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
  • actions/setup-python v1 composite
  • actions/upload-artifact v2 composite
  • codecov/codecov-action v3 composite
docker/Dockerfile docker
  • tensorflow/tensorflow 2.8.3-gpu build
docs/requirements_docs.txt pypi
  • cloudpickle *
  • docutils *
  • ipython *
  • matplotlib *
  • nbconvert >=6.3.0
  • nbsphinx *
  • optuna >=2.0
  • pandas ==1.1.5
  • pandoc *
  • pydata_sphinx_theme *
  • recommonmark >=0.7.1
  • scikit-learn >0.23
  • sphinx >3.2
  • sphinx-autobuild *
  • sphinx_markdown_tables *
  • tensorflow ==2.7.1
poetry.lock pypi
  • 165 dependencies
pyproject.toml pypi
  • black * develop
  • coverage * develop
  • flake8 * develop
  • invoke * develop
  • ipykernel * develop
  • ipywidgets ^8.0.1 develop
  • isort * develop
  • mypy * develop
  • nbsphinx * develop
  • pre-commit ^2.20.0 develop
  • pydata-sphinx-theme * develop
  • pylint * develop
  • recommonmark * develop
  • sphinx * develop
  • tensorflow ^2.3.1 develop
  • matplotlib *
  • numpy *
  • optuna ^2.3.0
  • pandas ^1.2.0
  • python >=3.7.1,<3.11
requirements.txt pypi
  • joblib *
  • matplotlib *
  • optuna >=2.0
  • pandas >=1.0
  • scikit-learn >0.23
  • tensorflow >=2.3.1