https://github.com/timeseriesai/tsai

Time series Timeseries Deep Learning Machine Learning Python Pytorch fastai | State-of-the-art Deep Learning library for Time Series and Sequences in Pytorch / fastai

https://github.com/timeseriesai/tsai

Science Score: 46.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
    Found 2 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, sciencedirect.com, ieee.org, acm.org, zenodo.org
  • Committers with academic emails
    3 of 23 committers (13.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (18.2%) to scientific vocabulary

Keywords

classification cnn deep-learning fastai forecasting inceptiontime machine-learning python pytorch regression rnn rocket self-supervised sequential state-of-the-art time-series time-series-analysis time-series-classification timeseries transformer

Keywords from Contributors

pypi embedded interactive projection sequences data-profilers datacleaner pipeline-testing genomics observability
Last synced: 5 months ago

Repository

Time series Timeseries Deep Learning Machine Learning Python Pytorch fastai | State-of-the-art Deep Learning library for Time Series and Sequences in Pytorch / fastai

Basic Info
Statistics
  • Stars: 5,777
  • Watchers: 66
  • Forks: 700
  • Open Issues: 119
  • Releases: 15
Topics
classification cnn deep-learning fastai forecasting inceptiontime machine-learning python pytorch regression rnn rocket self-supervised sequential state-of-the-art time-series time-series-analysis time-series-classification timeseries transformer
Created over 6 years ago · Last pushed 7 months ago
Metadata Files
Readme Changelog Contributing License Code of conduct

README.md

tsai




Description

State-of-the-art Deep Learning library for Time Series and Sequences.

tsai is an open-source deep learning package built on top of Pytorch & fastai focused on state-of-the-art techniques for time series tasks like classification, regression, forecasting, imputation…

tsai is currently under active development by timeseriesAI.

What’s new:

Here are some of the most significant additions to tsai over the last few releases:

  • New models: PatchTST (Accepted by ICLR 2023), RNN with Attention (RNNAttention, LSTMAttention, GRUAttention), TabFusionTransformer, …
  • New datasets: we have increased the number of datasets you can download using tsai:
    • 128 univariate classification datasets
    • 30 multivariate classification datasets
    • 15 regression datasets
    • 62 forecasting datasets
    • 9 long term forecasting datasets
  • New tutorials: PatchTST. Based on some of your requests, we are planning to release additional tutorials on data preparation and forecasting.
  • New functionality: sklearn-type pipeline transforms, walk-forward cross validation (see the sketch after this list), reduced RAM requirements, and many other additions that help produce more accurate time series forecasts.
  • Pytorch 2.0 support.
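
As a rough sketch of what walk-forward cross validation means (plain NumPy for illustration only, not the tsai API; the window sizes are made-up), the training set grows over time and each test window immediately follows the data used to train for it:

```python
import numpy as np

def walk_forward_splits(n_samples, n_splits, test_size):
    """Yield (train_idx, test_idx) pairs where each test window
    starts right after all the data available to train on."""
    for i in range(n_splits):
        test_end = n_samples - (n_splits - 1 - i) * test_size
        test_start = test_end - test_size
        yield np.arange(test_start), np.arange(test_start, test_end)

# Example: 1,000 time steps, 5 folds, 50-step test windows
for train_idx, test_idx in walk_forward_splits(1000, 5, 50):
    print(len(train_idx), len(test_idx))
```

Refer to the tsai documentation for the library's own splitting utilities.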

Installation

Pip install

You can install the latest stable version from pip using:

```bash
pip install tsai
```

If you plan to develop tsai yourself, or want to be on the cutting edge, you can use an editable install. First install PyTorch, and then:

```bash
git clone https://github.com/timeseriesAI/tsai
pip install -e "tsai[dev]"
```

Note: starting with tsai 0.3.0, tsai will only install hard dependencies. Other soft dependencies (which are only required for selected tasks) will not be installed by default. This is the recommended approach: if you require a dependency that is not installed, tsai will ask you to install it when it is needed. If you still want to install tsai with all its dependencies, you can do so by running:

```bash
pip install tsai[extras]
```

Conda install

You can also install tsai using conda (note that if you replace conda with mamba the install process will be much faster and more reliable):

```bash
conda install -c timeseriesai tsai
```

Documentation

Here’s the link to the documentation.

Available models:

Here’s a list of some of the state-of-the-art models available in tsai:

plus other custom models like: TransformerModel, LSTMAttention, GRUAttention, …

How to start using tsai?

To get to know the tsai package, we’d suggest you start with this notebook in Google Colab: 01_Intro_to_Time_Series_Classification. It provides an overview of a time series classification task.

We have also developed many other tutorial notebooks.

To use tsai in your own notebooks, the only thing you need to do after you have installed the package is to run this:

```python
from tsai.all import *
```

Examples

These are just a few examples of how you can use tsai:

Binary, univariate classification

Training:

```python
from tsai.basics import *

X, y, splits = get_classification_data('ECG200', split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize()
clf = TSClassifier(X, y, splits=splits, path='models', arch="InceptionTimePlus",
                   tfms=tfms, batch_tfms=batch_tfms, metrics=accuracy, cbs=ShowGraph())
clf.fit_one_cycle(100, 3e-4)
clf.export("clf.pkl")
```

Inference:

```python
from tsai.inference import load_learner

clf = load_learner("models/clf.pkl")
probas, target, preds = clf.get_X_preds(X[splits[1]], y[splits[1]])
```

Multi-class, multivariate classification

Training:

```python
from tsai.basics import *

X, y, splits = get_classification_data('LSST', split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize(by_sample=True)
mv_clf = TSClassifier(X, y, splits=splits, path='models', arch="InceptionTimePlus",
                      tfms=tfms, batch_tfms=batch_tfms, metrics=accuracy, cbs=ShowGraph())
mv_clf.fit_one_cycle(10, 1e-2)
mv_clf.export("mv_clf.pkl")
```

Inference:

```python
from tsai.inference import load_learner

mv_clf = load_learner("models/mv_clf.pkl")
probas, target, preds = mv_clf.get_X_preds(X[splits[1]], y[splits[1]])
```

Multivariate Regression

Training:

```python
from tsai.basics import *

X, y, splits = get_regression_data('AppliancesEnergy', split_data=False)
tfms = [None, TSRegression()]
batch_tfms = TSStandardize(by_sample=True)
reg = TSRegressor(X, y, splits=splits, path='models', arch="TSTPlus",
                  tfms=tfms, batch_tfms=batch_tfms, metrics=rmse, cbs=ShowGraph(), verbose=True)
reg.fit_one_cycle(100, 3e-4)
reg.export("reg.pkl")
```

Inference:

```python
from tsai.inference import load_learner

reg = load_learner("models/reg.pkl")
raw_preds, target, preds = reg.get_X_preds(X[splits[1]], y[splits[1]])
```

The ROCKETs (RocketClassifier, RocketRegressor, MiniRocketClassifier, MiniRocketRegressor, MiniRocketVotingClassifier or MiniRocketVotingRegressor) are somewhat different models. They are not actually deep learning models (although they use convolutions) and are used in a different way.

⚠️ You’ll also need to install sktime to be able to use them. You can install it separately:

```bash
pip install sktime
```

or use:

```bash
pip install tsai[extras]
```

Training:

```python
from sklearn.metrics import mean_squared_error, make_scorer
from tsai.data.external import get_Monash_regression_data
from tsai.models.MINIROCKET import MiniRocketRegressor

X_train, y_train, *_ = get_Monash_regression_data('AppliancesEnergy')
rmse_scorer = make_scorer(mean_squared_error, greater_is_better=False)
reg = MiniRocketRegressor(scoring=rmse_scorer)
reg.fit(X_train, y_train)
reg.save('MiniRocketRegressor')
```

Inference:

```python
from sklearn.metrics import mean_squared_error
from tsai.data.external import get_Monash_regression_data
from tsai.models.MINIROCKET import load_minirocket

*_, X_test, y_test = get_Monash_regression_data('AppliancesEnergy')
reg = load_minirocket('MiniRocketRegressor')
y_pred = reg.predict(X_test)
mean_squared_error(y_test, y_pred, squared=False)
```

Forecasting

You can use tsai for forecasting in the following scenarios:

  • univariate or multivariate time series input
  • univariate or multivariate time series output
  • single or multi-step ahead

You’ll need to:

  • prepare X (time series input) and the target y (see documentation)
  • select PatchTST or one of tsai’s models ending in Plus (TSTPlus, InceptionTimePlus, TSiTPlus, etc). The model will auto-configure a head to yield an output with the same shape as the target input y.

Single step

Training:

```python
from tsai.basics import *

ts = get_forecasting_time_series("Sunspots").values
X, y = SlidingWindow(60, horizon=1)(ts)
splits = TimeSplitter(235)(y)
tfms = [None, TSForecasting()]
batch_tfms = TSStandardize()
fcst = TSForecaster(X, y, splits=splits, path='models', tfms=tfms, batch_tfms=batch_tfms,
                    bs=512, arch="TSTPlus", metrics=mae, cbs=ShowGraph())
fcst.fit_one_cycle(50, 1e-3)
fcst.export("fcst.pkl")
```

Inference:

```python
from tsai.inference import load_learner

fcst = load_learner("models/fcst.pkl", cpu=False)
raw_preds, target, preds = fcst.get_X_preds(X[splits[1]], y[splits[1]])
raw_preds.shape
# torch.Size([235, 1])
```

Multi-step

This example shows how to build a 3-step ahead univariate forecast.

Training:

```python
from tsai.basics import *

ts = get_forecasting_time_series("Sunspots").values
X, y = SlidingWindow(60, horizon=3)(ts)
splits = TimeSplitter(235, fcst_horizon=3)(y)
tfms = [None, TSForecasting()]
batch_tfms = TSStandardize()
fcst = TSForecaster(X, y, splits=splits, path='models', tfms=tfms, batch_tfms=batch_tfms,
                    bs=512, arch="TSTPlus", metrics=mae, cbs=ShowGraph())
fcst.fit_one_cycle(50, 1e-3)
fcst.export("fcst.pkl")
```

Inference:

```python
from tsai.inference import load_learner

fcst = load_learner("models/fcst.pkl", cpu=False)
raw_preds, target, preds = fcst.get_X_preds(X[splits[1]], y[splits[1]])
raw_preds.shape
# torch.Size([235, 3])
```

Input data format

The input format for all time series models and image models in tsai is the same. An np.ndarray (or array-like object like zarr, etc) with 3 dimensions:

[# samples x # variables x sequence length]
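
For example, here is a minimal sketch of a dummy input in that format (the sample count, number of variables, and sequence length below are arbitrary):

```python
import numpy as np

# 100 samples, each with 3 variables observed over 60 time steps
X = np.random.rand(100, 3, 60).astype(np.float32)
print(X.shape)  # (100, 3, 60)
```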

The input format for tabular models in tsai (like TabModel, TabTransformer and TabFusionTransformer) is a pandas dataframe. See example.

How to contribute to tsai?

We welcome contributions of all kinds. Development of enhancements, bug fixes, documentation, tutorial notebooks, …

We have created a guide to help you start contributing to tsai. You can read it here.

Enterprise support and consulting services:

Want to make the most out of timeseriesAI/tsai in a professional setting? Let us help. Send us an email to learn more: info@timeseriesai.co

Citing tsai

If you use tsai in your research, please use the following BibTeX entry:

```bibtex
@Misc{tsai,
  author = {Ignacio Oguiza},
  title = {tsai - A state-of-the-art deep learning library for time series and sequential data},
  howpublished = {Github},
  year = {2023},
  url = {https://github.com/timeseriesAI/tsai}
}
```

Owner

  • Name: timeseriesAI
  • Login: timeseriesAI
  • Kind: organization
  • Email: timeseriesAI@gmail.com
  • Location: Spain

State-of-the-art deep learning applied to time series data

GitHub Events

Total
  • Create event: 5
  • Release event: 1
  • Issues event: 23
  • Watch event: 567
  • Delete event: 3
  • Issue comment event: 33
  • Push event: 16
  • Pull request event: 8
  • Fork event: 65
Last Year
  • Create event: 5
  • Release event: 1
  • Issues event: 23
  • Watch event: 567
  • Delete event: 3
  • Issue comment event: 33
  • Push event: 16
  • Pull request event: 8
  • Fork event: 65

Committers

Last synced: 11 months ago

All Time
  • Total Commits: 1,153
  • Total Committers: 23
  • Avg Commits per committer: 50.13
  • Development Distribution Score (DDS): 0.393
Past Year
  • Commits: 8
  • Committers: 1
  • Avg Commits per committer: 8.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
“oguiza” “****a@g****” 700
Ignacio Oguiza 1****a 383
filipj8 f****j@u****u 8
dnth d****h@g****m 7
J-M j****d@c****u 6
dependabot[bot] 4****] 6
Victor Rodriguez-Fernandez v****z@u****s 6
Adam Golinski a****i@a****m 4
Georg Heiler g****r@g****m 4
Radi Cho r****3@g****m 4
williamsdoug d****i@g****m 3
Sam s****m@i****o 3
Ideapad-310 m****n@g****m 3
Craig Versek c****b@g****m 3
deven-gqc d****n@g****m 2
Raghu Kainkaryam r****m@e****m 2
imilas a****i@u****a 2
yangtzech y****h@q****m 2
David Muhr m****d@g****m 1
Jeff H j****s@g****m 1
Michael Li m@v****n 1
Sachdeva, Kapil k****7@g****m 1
Thomas Capelle t****e@p****e 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 222
  • Total pull requests: 22
  • Average time to close issues: about 1 month
  • Average time to close pull requests: 10 days
  • Total issue authors: 153
  • Total pull request authors: 11
  • Average comments per issue: 2.06
  • Average comments per pull request: 1.45
  • Merged pull requests: 13
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 23
  • Pull requests: 8
  • Average time to close issues: 27 minutes
  • Average time to close pull requests: 22 minutes
  • Issue authors: 21
  • Pull request authors: 3
  • Average comments per issue: 0.74
  • Average comments per pull request: 0.5
  • Merged pull requests: 4
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • oguiza (29)
  • alitirmizi23 (4)
  • E-Penguin (4)
  • ramdhan1989 (4)
  • YYKKKKXX (4)
  • strakehyr (3)
  • langslike (3)
  • Loccret (3)
  • liyy2 (2)
  • yifeiSunny (2)
  • MichaelCarrik (2)
  • lo-zed (2)
  • connormeaton (2)
  • Willtl (2)
  • TDL77 (2)
Pull Request Authors
  • oguiza (7)
  • vrodriguezf (3)
  • TarikVon (3)
  • bankeiyotaku (2)
  • andersgb (2)
  • tbohne (2)
  • cversek (2)
  • yangtzech (1)
  • shostykovich21 (1)
  • eltociear (1)
  • talesa (1)
  • deven367 (1)
Top Labels
Issue Labels
question (35) bug (30) enhancement (24) under review (24) answered? (18) ideas (6) fixed? (6) duplicate (5) dependencies (5) documentation (4) help wanted (3) tutorial nb (2) not an issue (2) high-priority (2) PR (2) wontfix (2)
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 14,598 last-month
  • Total dependent packages: 5
  • Total dependent repositories: 18
  • Total versions: 50
  • Total maintainers: 2
pypi.org: tsai

Practical Deep Learning for Time Series / Sequential Data library based on fastai & Pytorch

  • Versions: 50
  • Dependent Packages: 5
  • Dependent Repositories: 18
  • Downloads: 14,598 Last month
  • Docker Downloads: 0
Rankings
Stargazers count: 1.1%
Forks count: 2.3%
Dependent packages count: 2.4%
Average: 2.7%
Downloads: 2.7%
Dependent repos count: 3.4%
Docker downloads count: 4.3%
Maintainers (2)
Last synced: 6 months ago