https://github.com/unit8co/darts
A python library for user-friendly forecasting and anomaly detection on time series.
Science Score: 46.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ✓ Committers with academic emails (1 of 154 committers, 0.6%, from academic institutions)
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 14.9%, to scientific vocabulary)
Repository
A python library for user-friendly forecasting and anomaly detection on time series.
Basic Info
- Host: GitHub
- Owner: unit8co
- License: apache-2.0
- Language: Python
- Default Branch: master
- Homepage: https://unit8co.github.io/darts/
- Size: 186 MB
Statistics
- Stars: 8,874
- Watchers: 62
- Forks: 958
- Open Issues: 250
- Releases: 50
Metadata Files
README.md
Time Series Made Easy in Python

Darts is a Python library for user-friendly forecasting and anomaly detection
on time series. It contains a variety of models, from classics such as ARIMA to
deep neural networks. The forecasting models can all be used in the same way,
using fit() and predict() functions, similar to scikit-learn.
The library also makes it easy to backtest models,
combine the predictions of several models, and take external data into account.
Darts supports both univariate and multivariate time series and models.
The ML-based models can be trained on potentially large datasets containing multiple time
series, and some of the models offer a rich support for probabilistic forecasting.
Darts also offers extensive anomaly detection capabilities. For instance, it is trivial to apply PyOD models on time series to obtain anomaly scores, or to wrap any of Darts forecasting or filtering models to obtain fully fledged anomaly detection models.
Documentation
High Level Introductions
Articles on Selected Topics
- Training Models on Multiple Time Series
- Using Past and Future Covariates
- Temporal Convolutional Networks and Forecasting
- Probabilistic Forecasting
- Transfer Learning for Time Series Forecasting
- Hierarchical Forecast Reconciliation
Quick Install
We recommend first setting up a clean Python environment for your project with Python 3.9+ using your favorite tool (conda, venv, virtualenv with or without virtualenvwrapper).
Once your environment is set up you can install darts using pip:
pip install darts
For more details you can refer to our installation instructions.
Example Usage
Forecasting
Create a TimeSeries object from a Pandas DataFrame, and split it in train/validation series:
```python
import pandas as pd
from darts import TimeSeries

# Read a pandas DataFrame
df = pd.read_csv("AirPassengers.csv", delimiter=",")

# Create a TimeSeries, specifying the time and value columns
series = TimeSeries.from_dataframe(df, "Month", "#Passengers")

# Set aside the last 36 months as a validation series
train, val = series[:-36], series[-36:]
```
Fit an exponential smoothing model, and make a (probabilistic) prediction over the validation series' duration:

```python
from darts.models import ExponentialSmoothing

model = ExponentialSmoothing()
model.fit(train)
prediction = model.predict(len(val), num_samples=1000)
```
Plot the median, 5th and 95th percentiles:

```python
import matplotlib.pyplot as plt

series.plot()
prediction.plot(label="forecast", low_quantile=0.05, high_quantile=0.95)
plt.legend()
```
Anomaly Detection
Load a multivariate series, trim it, keep 2 components, split train and validation sets:
```python
from darts.datasets import ETTh2Dataset

series = ETTh2Dataset().load()[:10000][["MUFL", "LULL"]]
train, val = series.split_before(0.6)
```
Build a k-means anomaly scorer, train it on the train set and use it on the validation set to get anomaly scores:
```python
from darts.ad import KMeansScorer

scorer = KMeansScorer(k=2, window=5)
scorer.fit(train)
anom_score = scorer.score(val)
```
Build a binary anomaly detector and train it over train scores, then use it over validation scores to get binary anomaly classification:
```python
from darts.ad import QuantileDetector

detector = QuantileDetector(high_quantile=0.99)
detector.fit(scorer.score(train))
binary_anom = detector.detect(anom_score)
```
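Conceptually, a quantile detector of this kind boils down to learning an empirical quantile of the training scores and thresholding new scores against it. A minimal numpy sketch of that idea (an illustration only, not Darts' implementation):

```python
import numpy as np

# Deterministic stand-in for training anomaly scores
train_scores = np.linspace(0.0, 1.0, 101)

# "Fit": learn the threshold as the 99th percentile of the training scores
threshold = np.quantile(train_scores, 0.99)

# "Detect": flag validation scores above the threshold as anomalous
val_scores = np.array([0.5, 0.995, 1.2])
flags = (val_scores > threshold).astype(int)
print(flags)  # → [0 1 1]
```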
Plot (shifting and scaling some of the series to make everything appear on the same figure):
```python
import matplotlib.pyplot as plt

series.plot()
(anom_score / 2. - 100).plot(label="computed anomaly score", c="orangered", lw=3)
(binary_anom * 45 - 150).plot(label="detected binary anomaly", lw=4)
```
Features
Forecasting Models: A large collection of forecasting models for regression as well as classification tasks; from statistical models (such as ARIMA) to deep learning models (such as N-BEATS). See the forecasting models below.
Anomaly Detection: The `darts.ad` module contains a collection of anomaly scorers, detectors and aggregators, which can all be combined to detect anomalies in time series. It is easy to wrap any of Darts forecasting or filtering models to build a fully fledged anomaly detection model that compares predictions with actuals. The `PyODScorer` makes it trivial to use PyOD detectors on time series.
Multivariate Support: `TimeSeries` can be multivariate - i.e., contain multiple time-varying dimensions/columns instead of a single scalar value. Many models can consume and produce multivariate series.
Multiple Series Training (Global Models): All machine learning based models (incl. all neural networks) support being trained on multiple (potentially multivariate) series. This can scale to large datasets too.
Probabilistic Support: `TimeSeries` objects can (optionally) represent stochastic time series; this can for instance be used to get confidence intervals, and many models support different flavours of probabilistic forecasting (such as estimating parametric distributions or quantiles). Some anomaly detection scorers are also able to exploit these predictive distributions.
Conformal Prediction Support: Our conformal prediction models allow generating probabilistic forecasts with calibrated quantile intervals for any pre-trained global forecasting model.
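The idea behind naive conformal intervals can be sketched outside of Darts: collect the absolute residuals of a point forecaster on a calibration window, and use their empirical quantile as a symmetric interval half-width. A simplified illustration with a toy last-value forecaster (not Darts' actual conformal models, which offer considerably more):

```python
import numpy as np

rng = np.random.default_rng(0)
y = np.arange(200, dtype=float) + rng.normal(0.0, 5.0, 200)

# A trivial point forecaster: repeat the last observed value
def naive_forecast(history):
    return history[-1]

# Calibration: absolute residuals of one-step-ahead forecasts
cal_residuals = np.array(
    [abs(y[t] - naive_forecast(y[:t])) for t in range(100, 150)]
)

# The 90% interval half-width is the 90th percentile of the residuals
half_width = np.quantile(cal_residuals, 0.9)

# Point forecast with a calibrated interval for the next step
point = naive_forecast(y[:150])
lower, upper = point - half_width, point + half_width
```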
Past and Future Covariates Support: Many models in Darts support past-observed and/or future-known covariate (external data) time series as inputs for producing forecasts.
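The distinction matters for how inputs are aligned: past covariates are consumed over the input window, while future covariates are consumed over the forecast window. A rough sketch with hypothetical toy arrays (illustrative variable names, not Darts' API):

```python
import numpy as np

y = np.arange(8.0)          # target series
past_cov = y * 10           # only observed up to the forecast start
future_cov = y * 100        # known into the future (e.g. calendar features)

input_len, horizon, t = 3, 2, 2   # forecast made after index t + input_len

x_target = y[t : t + input_len]                                 # model input
x_past = past_cov[t : t + input_len]                            # aligned with input
x_future = future_cov[t + input_len : t + input_len + horizon]  # aligned with output
print(x_past, x_future)  # → [20. 30. 40.] [500. 600.]
```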
Static Covariates Support: In addition to time-dependent data, `TimeSeries` can also contain static data for each dimension, which can be exploited by some models.
Hierarchical Reconciliation: Darts offers transformers to perform reconciliation. These can make the forecasts add up in a way that respects the underlying hierarchy.
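The simplest reconciliation scheme, bottom-up, can be sketched in a few lines with toy numbers (a conceptual illustration, not Darts' transformer API):

```python
import numpy as np

# Independent forecasts: two bottom-level series plus their total,
# which do not add up (14 != 10 + 5 and 19 != 12 + 6)
bottom = np.array([[10.0, 12.0],
                   [5.0, 6.0]])      # shape: (series, time steps)
total = np.array([14.0, 19.0])

# Bottom-up reconciliation: overwrite the total with the sum of the
# bottom-level forecasts, so the hierarchy is respected by construction
reconciled_total = bottom.sum(axis=0)
print(reconciled_total)  # → [15. 18.]
```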
Regression Models: It is possible to plug in any scikit-learn compatible model to obtain forecasts as functions of lagged values of the target series and covariates.
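The wiring such models do internally amounts to tabularizing the series into lagged features. A sketch of the idea in plain numpy, with ordinary least squares standing in for the scikit-learn regressor:

```python
import numpy as np

rng = np.random.default_rng(42)
y = np.sin(np.linspace(0, 20, 300)) + rng.normal(0.0, 0.1, 300)

# Tabularize: each row holds the previous `n_lags` values of the target
n_lags = 4
X = np.column_stack([y[i : len(y) - n_lags + i] for i in range(n_lags)])
target = y[n_lags:]

# Ordinary least squares as a stand-in for any scikit-learn regressor
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

# One-step forecast from the most recent lags
forecast = y[-n_lags:] @ coef
```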
Training with Sample Weights: All global models support being trained with sample weights. They can be applied to each observation, forecasted time step and target column.
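As a toy illustration of what sample weighting does to a fit: a weighted least-squares fit of a constant model reduces to a weighted mean, so up-weighting recent observations pulls the fit toward them (hypothetical numbers):

```python
import numpy as np

# Two observations of a constant signal; the newer one is weighted higher
y_obs = np.array([1.0, 3.0])
weights = np.array([0.2, 0.8])

# Weighted least squares for a constant model is just the weighted mean
fit = np.average(y_obs, weights=weights)
print(fit)  # ≈ 2.6, pulled toward the heavily weighted observation
```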
Forecast Start Shifting: All global models support training and prediction on a shifted output window. This is useful for example for Day-Ahead Market forecasts, or when the covariates (or target series) are reported with a delay.
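In terms of training samples, a shifted output window simply offsets the target window relative to the input window. A sketch over a toy series (`shift` here is an illustrative name, not a Darts parameter):

```python
import numpy as np

y = np.arange(10.0)

input_len, horizon, shift = 3, 2, 1  # predict a window starting `shift`
                                     # steps after the input window ends
windows = [
    (y[t : t + input_len],
     y[t + input_len + shift : t + input_len + shift + horizon])
    for t in range(len(y) - input_len - shift - horizon + 1)
]
x0, t0 = windows[0]
print(x0, t0)  # → [0. 1. 2.] [4. 5.]  (index 3 is skipped by the shift)
```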
Explainability: Darts has the ability to explain some forecasting models using Shap values.
Data Processing: Tools to easily apply (and revert) common transformations on time series data (scaling, filling missing values, differencing, Box-Cox, ...).
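For instance, min-max scaling (a common choice for neural-network inputs) is invertible, so forecasts produced in scaled space can be mapped back to the original units. A plain numpy sketch of the round trip, not Darts' transformer API:

```python
import numpy as np

values = np.array([12.0, 18.0, 30.0, 24.0])

# Min-max scaling to [0, 1]
vmin, vmax = values.min(), values.max()
scaled = (values - vmin) / (vmax - vmin)

# The transform is invertible: map scaled values back to original units
restored = scaled * (vmax - vmin) + vmin
print(np.allclose(restored, values))  # → True
```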
Metrics: A variety of metrics for evaluating time series' goodness of fit; from R2-scores to Mean Absolute Scaled Error.
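As an example, MAPE (one of the simpler metrics in this family) can be computed as below; this is a standalone sketch, whereas in Darts the metric functions operate on pairs of `TimeSeries`:

```python
import numpy as np

def mape(actual, pred):
    """Mean Absolute Percentage Error, in percent."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return 100.0 * np.mean(np.abs((actual - pred) / actual))

print(mape([100, 200], [110, 190]))  # ≈ 7.5
```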
Backtesting: Utilities for simulating historical forecasts, using moving time windows.
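The mechanics can be sketched in plain numpy: repeatedly pretend the series stopped at time t, forecast ahead, and score against what actually happened (here with a naive last-value forecaster; Darts automates this over real models):

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])

# Simulate historical forecasts: at each step t >= start, "fit" on y[:t]
# and forecast one step ahead with a naive last-value model
start = 4
preds = np.array([y[t - 1] for t in range(start, len(y))])
actuals = y[start:]

# Backtest error: MAE over the simulated forecasts
mae = np.mean(np.abs(actuals - preds))
print(mae)  # → 1.0
```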
PyTorch Lightning Support: All deep learning models are implemented using PyTorch Lightning, supporting among other things custom callbacks, GPUs/TPUs training and custom trainers.
Filtering Models: Darts offers three filtering models: `KalmanFilter`, `GaussianProcessFilter`, and `MovingAverageFilter`, which allow filtering time series and, in some cases, obtaining probabilistic inferences of the underlying states/values.
Datasets: The `darts.datasets` submodule contains some popular time series datasets for rapid and reproducible experimentation.
darts.datasetssubmodule contains some popular time series datasets for rapid and reproducible experimentation.Compatibility with Multiple Backends:
TimeSeriesobjects can be created from and exported to various backends such as pandas, polars, numpy, pyarrow, xarray, and more, facilitating seamless integration with different data processing libraries.
Forecasting Models
Here's a breakdown of the forecasting models currently implemented in Darts. Our suite includes both regression and classification models, each tailored for specific forecasting tasks. We are committed to expanding our offerings with new models and features to enhance your forecasting capabilities.
Regression Models: Our regression models are designed to predict continuous numerical values, making them ideal for forecasting future trends and patterns in time series data. Utilize these models to gain insights into potential future outcomes based on historical data.
| Model | Sources | Target Series Support: Univariate / Multivariate | Covariates Support: Past-observed / Future-known / Static | Probabilistic Forecasting: Sampled / Distribution Parameters | Training & Forecasting on Multiple Series |
|---|---|---|---|---|---|
| Baseline Models (LocalForecastingModel) | | | | | |
| NaiveMean | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | 🔴 |
| NaiveSeasonal | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | 🔴 |
| NaiveDrift | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | 🔴 |
| NaiveMovingAverage | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | 🔴 |
| Statistical / Classic Models (LocalForecastingModel) | | | | | |
| ARIMA | | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ 🔴 | 🔴 |
| VARIMA | | 🔴 ✅ | 🔴 ✅ 🔴 | ✅ 🔴 | 🔴 |
| ExponentialSmoothing | | ✅ 🔴 | 🔴 🔴 🔴 | ✅ 🔴 | 🔴 |
| Theta and FourTheta | Theta & 4 Theta | ✅ 🔴 | 🔴 🔴 🔴 | 🔴 🔴 | 🔴 |
| Prophet | Prophet repo | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ 🔴 | 🔴 |
| FFT (Fast Fourier Transform) | | ✅ 🔴 | 🔴 🔴 🔴 | 🔴 🔴 | 🔴 |
| KalmanForecaster using the Kalman filter and N4SID for system identification | N4SID paper | ✅ ✅ | 🔴 ✅ 🔴 | ✅ 🔴 | 🔴 |
| TBATS | TBATS paper | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| Croston method | | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| StatsForecastModel wrapper around any StatsForecast model | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| AutoARIMA | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| AutoETS | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| AutoCES | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| AutoMFLES | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| AutoTBATS | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| AutoTheta | Nixtla's statsforecast | ✅ 🔴 | 🔴 ✅ 🔴 | ✅ ✅ | 🔴 |
| Global Baseline Models (GlobalForecastingModel) | | | | | |
| GlobalNaiveAggregate | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | ✅ |
| GlobalNaiveDrift | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | ✅ |
| GlobalNaiveSeasonal | | ✅ ✅ | 🔴 🔴 🔴 | 🔴 🔴 | ✅ |
| Regression Models (GlobalForecastingModel) | | | | | |
| SKLearnModel: wrapper around any scikit-learn-like regression model | | ✅ ✅ | ✅ ✅ ✅ | 🔴 🔴 | ✅ |
| LinearRegressionModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| RandomForestModel | | ✅ ✅ | ✅ ✅ ✅ | 🔴 🔴 | ✅ |
| CatBoostModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| LightGBMModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| XGBModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| PyTorch (Lightning)-based Models (GlobalForecastingModel) | | | | | |
| RNNModel (incl. LSTM and GRU); equivalent to DeepAR in its probabilistic version | DeepAR paper | ✅ ✅ | 🔴 ✅ 🔴 | ✅ ✅ | ✅ |
| BlockRNNModel (incl. LSTM and GRU) | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| NBEATSModel | N-BEATS paper | ✅ ✅ | ✅ 🔴 🔴 | ✅ ✅ | ✅ |
| NHiTSModel | N-HiTS paper | ✅ ✅ | ✅ 🔴 🔴 | ✅ ✅ | ✅ |
| TCNModel | TCN paper, DeepTCN paper, blog post | ✅ ✅ | ✅ 🔴 🔴 | ✅ ✅ | ✅ |
| TransformerModel | | ✅ ✅ | ✅ 🔴 🔴 | ✅ ✅ | ✅ |
| TFTModel (Temporal Fusion Transformer) | TFT paper, PyTorch Forecasting | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| DLinearModel | DLinear paper | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| NLinearModel | NLinear paper | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| TiDEModel | TiDE paper | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| TSMixerModel | TSMixer paper, PyTorch Implementation | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| Ensemble Models (GlobalForecastingModel): model support depends on the ensembled forecasting models and the ensemble model itself | | | | | |
| NaiveEnsembleModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| RegressionEnsembleModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| Conformal Models (GlobalForecastingModel): model support depends on the forecasting model used | | | | | |
| ConformalNaiveModel | Conformalized Prediction | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| ConformalQRModel | Conformalized Quantile Regression | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
Classification Models: Classification models in Darts are designed to predict categorical class labels, enabling effective time series labeling and future class prediction. These models are perfect for scenarios where identifying distinct categories or states over time is crucial.
| Model | Sources | Target Series Support: Univariate / Multivariate | Covariates Support: Past-observed / Future-known / Static | Probabilistic Forecasting: Sampled / Distribution Parameters | Training & Forecasting on Multiple Series |
|---|---|---|---|---|---|
| Regression Models (GlobalForecastingModel) | | | | | |
| SKLearnClassifierModel: wrapper around any scikit-learn-like classification model | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| CatBoostClassifierModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| LightGBMClassifierModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
| XGBClassifierModel | | ✅ ✅ | ✅ ✅ ✅ | ✅ ✅ | ✅ |
Community & Contact
Anyone is welcome to join our Gitter room to ask questions, make proposals, discuss use cases, and more. If you spot a bug or have suggestions, GitHub issues are also welcome.
If what you want to tell us is not suitable for Gitter or GitHub, feel free to send us an email at darts@unit8.co for Darts-related matters or info@unit8.co for any other inquiries.
Contribute
The development is ongoing, and we welcome suggestions, pull requests and issues on GitHub. All contributors will be acknowledged on the change log page.
Before working on a contribution (a new feature or a fix), check our contribution guidelines.
Citation
If you are using Darts in your scientific work, we would appreciate citations to the following JMLR paper.
Darts: User-Friendly Modern Machine Learning for Time Series
BibTeX entry:
@article{JMLR:v23:21-1177,
author = {Julien Herzen and Francesco Lässig and Samuele Giuliano Piazzetta and Thomas Neuer and Léo Tafti and Guillaume Raille and Tomas Van Pottelbergh and Marek Pasieka and Andrzej Skrodzki and Nicolas Huguenin and Maxime Dumonal and Jan Kościsz and Dennis Bader and Frédérick Gusset and Mounir Benheddi and Camila Williamson and Michal Kosinski and Matej Petrik and Gaël Grosch},
title = {Darts: User-Friendly Modern Machine Learning for Time Series},
journal = {Journal of Machine Learning Research},
year = {2022},
volume = {23},
number = {124},
pages = {1-6},
url = {http://jmlr.org/papers/v23/21-1177.html}
}
Owner
- Name: Unit8 SA
- Login: unit8co
- Kind: organization
- Email: contact@unit8.com
- Website: https://unit8.com/
- Repositories: 12
- Profile: https://github.com/unit8co
Solving your most impactful problems via Big Data & AI
Committers
Last synced: 5 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Dennis Bader | d****r@g****h | 303 |
| Julien Herzen | j****n@u****o | 188 |
| Julien Herzen | j****n@u****o | 123 |
| madtoinou | 3****u@u****m | 71 |
| Francesco Lässig | 4****c@u****m | 65 |
| Andrzej Skrodzki | e****r@u****m | 28 |
| hrzn | h****n@u****m | 27 |
| Tomas Van Pottelbergh | 8****h@u****m | 22 |
| Tafti Léo | 4****i@u****m | 20 |
| dennisbader | d****r@u****m | 17 |
| Greg DeVos | g****0@g****m | 14 |
| Invernizzi Hakim | h****i@g****m | 12 |
| Dustin Brunner | 9****u@u****m | 11 |
| Samuele Giuliano Piazzetta | s****a@g****m | 10 |
| cnhwl | h****2@1****m | 10 |
| camilaagw | c****w@g****m | 10 |
| Maxime Dumonal | d****x@g****m | 10 |
| Timon Erhart | 5****n@u****m | 10 |
| Kamil | 3****5@u****m | 9 |
| Guillaume | 6****e@u****m | 9 |
| Marek Pasieka | m****a@g****m | 8 |
| gian | 9****r@u****m | 8 |
| dependabot[bot] | 4****]@u****m | 8 |
| Michal Kosinski | m****i@u****o | 8 |
| Jan | 5****i@u****m | 8 |
| Felix Divo | 4****o@u****m | 8 |
| DavidKleindienst | 7****t@u****m | 8 |
| Thomas Neuer | n****m@g****m | 7 |
| eliane-maalouf | 1****f@u****m | 7 |
| Rijk van der Meulen | 8****n@u****m | 7 |
| and 124 more... | ||
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 5 months ago
All Time
- Total issues: 815
- Total pull requests: 641
- Average time to close issues: 4 months
- Average time to close pull requests: 17 days
- Total issue authors: 454
- Total pull request authors: 86
- Average comments per issue: 2.21
- Average comments per pull request: 1.55
- Merged pull requests: 502
- Bot issues: 0
- Bot pull requests: 14
Past Year
- Issues: 205
- Pull requests: 261
- Average time to close issues: 11 days
- Average time to close pull requests: 9 days
- Issue authors: 129
- Pull request authors: 26
- Average comments per issue: 1.41
- Average comments per pull request: 1.2
- Merged pull requests: 196
- Bot issues: 0
- Bot pull requests: 6
Top Authors
Issue Authors
- ETTAN93 (31)
- dennisbader (26)
- hrzn (23)
- Allena101 (17)
- eschibli (12)
- turbotimon (11)
- andrew20012656 (11)
- dwolffram (10)
- guilhermeparreira (9)
- tRosenflanz (7)
- Jonathan-87 (7)
- NQevxvEtg (6)
- giacomoguiduzzi (6)
- SaltedfishLZX (6)
- hberande (5)
Pull Request Authors
- dennisbader (276)
- madtoinou (75)
- cnhwl (24)
- jonasblanc (18)
- quant12345 (16)
- turbotimon (15)
- authierj (15)
- dependabot[bot] (14)
- Borda (14)
- felixdivo (7)
- JanFidor (7)
- ymatzkevich (6)
- BohdanBilonoh (5)
- SimTheGreat (5)
- hrzn (5)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 5
- Total downloads: 221,149 last month (pypi)
- Total docker downloads: 1,220
- Total dependent packages: 22 (may contain duplicates)
- Total dependent repositories: 79 (may contain duplicates)
- Total versions: 146
- Total maintainers: 1
pypi.org: darts
A python library for easy manipulation and forecasting of time series.
- Homepage: https://unit8co.github.io/darts/
- Documentation: https://unit8co.github.io/darts/
- License: Apache License 2.0
- Latest release: 0.37.1 (published 6 months ago)
Rankings
Maintainers (1)
pypi.org: u8darts
A python library for easy manipulation and forecasting of time series.
- Homepage: https://unit8co.github.io/darts/
- Documentation: https://unit8co.github.io/darts/
- License: Apache License 2.0
- Latest release: 0.37.1 (published 6 months ago)
Rankings
Maintainers (1)
conda-forge.org: u8darts
- Homepage: https://unit8co.github.io/darts/
- License: Apache-2.0
- Latest release: 0.22.0 (published over 3 years ago)
Rankings
conda-forge.org: u8darts-all
- Homepage: https://unit8co.github.io/darts/
- License: Apache-2.0
- Latest release: 0.22.0 (published over 3 years ago)
Rankings
conda-forge.org: u8darts-torch
- Homepage: https://unit8co.github.io/darts/
- License: Apache-2.0
- Latest release: 0.22.0 (published over 3 years ago)
Rankings
Dependencies
- conda-build
- conda-verify
- python >=3.7
- actions/cache v1 composite
- actions/cache v2 composite
- actions/checkout v2 composite
- actions/setup-python v1 composite
- codecov/codecov-action v2 composite
- actions/cache v1 composite
- actions/cache v2 composite
- actions/checkout v2 composite
- actions/setup-python v1 composite
- s0/git-publish-subdir-action v2.2.0 composite
- actions/cache v1 composite
- actions/cache v2 composite
- actions/checkout v2 composite
- actions/setup-python v1 composite
- codecov/codecov-action v2 composite
- actions/cache v2 composite
- actions/cache v1 composite
- actions/checkout v2 composite
- actions/create-release latest composite
- actions/setup-python v1 composite
- hrzn/github-tag-action master composite
- s0/git-publish-subdir-action v2.2.0 composite
- stefanzweifel/git-auto-commit-action v4.1.6 composite
- jupyter/base-notebook python-3.9.5 build
- catboost >=1.0.6,<1.2.0
- holidays >=0.11.1
- joblib >=0.16.0
- lightgbm >=3.2.0
- matplotlib >=3.3.0
- nfoursid >=1.0.0
- numpy >=1.19.0
- pandas >=1.0.5,<2.0.0
- pandas >=1.0.5
- pmdarima >=1.8.0
- prophet >=1.1.1
- pyod >=0.9.5
- requests >=2.22.0
- scikit-learn >=1.0.1
- scipy >=1.3.2
- shap >=0.40.0
- statsforecast >=1.4
- statsmodels >=0.14.0
- tbats >=1.1.0
- tqdm >=4.60.0
- typing-extensions *
- xarray >=0.17.0
- xgboost >=1.6.0
- black ==22.3.0 development
- flake8 ==4.0.1 development
- isort ==5.11.5 development
- pre-commit * development
- pytest-cov * development
- pyupgrade ==2.31.0 development
- testfixtures * development
- bump2version ==1.0.1
- docutils ==0.17.1
- ipykernel ==5.3.4
- ipython ==8.10.0
- ipywidgets ==7.5.1
- jinja2 ==3.0.3
- m2r2 ==0.3.2
- nbsphinx ==0.8.7
- numpydoc ==1.1.0
- papermill ==2.2.2
- pydata-sphinx-theme ==0.7.2
- recommonmark ==0.7.1
- sphinx ==4.3.2
- sphinx-automodapi ==0.14.0
- sphinx_autodoc_typehints ==1.12.0
- twine ==3.3.0
- pytorch-lightning >=1.5.0
- tensorboardX >=2.1
- torch >=1.8.0