Recent Releases of https://github.com/sktime/pytorch-forecasting
https://github.com/sktime/pytorch-forecasting - v1.4.0
What's Changed
Feature and maintenance update.
New Contributors
- @gbilleyPeco made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1750
- @pietsjoh made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1399
- @MartinoMensio made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1579
- @phoeenniixx made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1811
- @cngmid made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1827
- @Marcrb2 made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1518
- @jobs-git made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1864
All Contributors
@agobbifbk, @Borda, @cngmid, @fkiraly, @fnhirwa, @gbilleyPeco, @jobs-git, @Marcrb2, @MartinoMensio, @phoeenniixx, @pietsjoh, @PranavBhatP
Full Changelog: https://github.com/sktime/pytorch-forecasting/compare/v1.3.0...v1.4.0
Published by fkiraly 9 months ago
https://github.com/sktime/pytorch-forecasting - v1.3.0
What's Changed
Feature and maintenance update.
- python 3.13 support
- TiDE model
- bugfixes for TFT
New Contributors
- @xiaokongkong made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1719
- @madprogramer made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1720
- @julian-fong made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1705
- @Sohaib-Ahmed21 made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1734
- @d-schmitt made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1580
- @Luke-Chesley made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1516
- @PranavBhatP made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1762
All Contributors
@d-schmitt, @fkiraly, @fnhirwa, @julian-fong, @Luke-Chesley, @madprogramer, @PranavBhatP, @Sohaib-Ahmed21, @xiaokongkong, @XinyuWuu
Full Changelog: https://github.com/sktime/pytorch-forecasting/compare/v1.2.0...v1.3.0
Published by fkiraly about 1 year ago
https://github.com/sktime/pytorch-forecasting - v1.2.0
What's Changed
Maintenance update, minor feature additions and bugfixes.
- support for numpy 2.X
- end of life for python 3.8
- fixed documentation build
- bugfixes
New Contributors
- @ewth made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1696
- @airookie17 made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1692
- @benHeid made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1704
- @eugenio-mercuriali made their first contribution in https://github.com/sktime/pytorch-forecasting/pull/1699
All Contributors
@airookie17, @benHeid, @eugenio-mercuriali, @ewth, @fkiraly, @fnhirwa, @XinyuWuu, @yarnabrina
Full Changelog: https://github.com/sktime/pytorch-forecasting/compare/v1.1.1...v1.2.0
Published by fkiraly over 1 year ago
https://github.com/sktime/pytorch-forecasting - v1.1.1
What's Changed
Hotfix release correcting a typo in the package name in pyproject.toml, so that pytorch-forecasting has a correct PEP 440 identifier.
Otherwise identical to 1.1.0.
Full Changelog: https://github.com/jdb78/pytorch-forecasting/compare/v1.1.0...v1.1.1
Published by fkiraly over 1 year ago
https://github.com/sktime/pytorch-forecasting - v1.1.0
What's Changed
Maintenance update widening compatibility ranges and consolidating dependencies:
- support for python 3.11 and 3.12, added CI testing
- support for MacOS, added CI testing
- core dependencies have been minimized to numpy, torch, lightning, scipy, pandas, and scikit-learn.
- soft dependencies are available in soft dependency sets: all_extras for all soft dependencies, and tuning for optuna-based optimization.
Dependency changes
- the following are no longer core dependencies and have been changed to optional dependencies: optuna, statsmodels, pytorch-optimize, matplotlib. Environments relying on functionality requiring these dependencies need to be updated to install these explicitly.
- optuna bounds have been updated to optuna >=3.1.0,<4.0.0
- optuna-integrate is now an additional soft dependency, in case of optuna >=3.3.0
Deprecations and removals
- from 1.2.0, the default optimizer will be changed from "ranger" to "adam" to avoid non-torch dependencies in defaults. pytorch-optimize optimizers can still be used. Users should set the optimizer explicitly to continue using "ranger".
- from 1.1.0, the loggers do not log figures if the soft dependency matplotlib is not present, but will raise no exceptions in this case. To log figures, ensure that matplotlib is installed.
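The skip-instead-of-raise behaviour for figure logging can be sketched as a plain guard. This is an illustrative stand-in, not pytorch-forecasting's actual API; `maybe_log_figure` and `has_matplotlib` are hypothetical names:

```python
# Illustrative sketch of the behaviour above: when the matplotlib soft
# dependency is absent, figure logging is skipped silently instead of
# raising. Hypothetical helper, not pytorch-forecasting API.
def maybe_log_figure(make_fig, log, has_matplotlib):
    """Append a figure to `log` only if matplotlib is available."""
    if not has_matplotlib:
        return False  # soft dependency missing: skip quietly, no exception
    log.append(make_fig())
    return True
```

With `has_matplotlib=False` the call returns False and leaves the log untouched, mirroring the "raise no exceptions" guarantee above.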
All Contributors
@andre-marcos-perez, @avirsaha, @bendavidsteel, @benheid, @bohdan-safoniuk, @Borda, @CahidArda, @fkiraly, @fnhirwa, @germanKoch, @jacktang, @jdb78, @jurgispods, @maartensukel, @MBelniak, @orangehe, @pavelzw, @sfalkena, @tmct, @XinyuWuu, @yarnabrina
New Contributors
- @jurgispods made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1366
- @jacktang made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1353
- @andre-marcos-perez made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1346
- @tmct made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1340
- @bohdan-safoniuk made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1318
- @MBelniak made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1230
- @CahidArda made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1175
- @bendavidsteel made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1359
- @Borda made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1498
- @fkiraly made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1598
- @XinyuWuu made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1599
- @pavelzw made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1407
- @yarnabrina made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1630
- @fnhirwa made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1646
- @avirsaha made their first contribution in https://github.com/jdb78/pytorch-forecasting/pull/1649
Full Changelog: https://github.com/jdb78/pytorch-forecasting/compare/v1.0.0...v1.1.0
Published by fkiraly over 1 year ago
https://github.com/sktime/pytorch-forecasting - Update to pytorch 2.0
Breaking Changes
- Upgraded to pytorch 2.0 and lightning 2.0. This brings a couple of changes, such as the configuration of trainers. See the lightning upgrade guide. For PyTorch Forecasting, this particularly means: if you are developing your own models, the class method epoch_end has been renamed to on_epoch_end; model.summarize() is replaced with ModelSummary(model, max_depth=-1); and Tuner(trainer) is its own class, so trainer.tuner needs replacing. (#1280)
- Changed the predict() interface to return a named tuple - see tutorials.
Changes
- The predict method is now using the lightning predict functionality and allows writing results to disk (#1280).
Fixed
- Fixed robust scaler when quantiles are 0.0, and 1.0, i.e. minimum and maximum (#1142)
Published by jdb78 almost 3 years ago
https://github.com/sktime/pytorch-forecasting - Poetry update
Fixed
- Removed pandoc from dependencies as issue with poetry install (#1126)
- Added metric attributes for torchmetric resulting in better multi-GPU performance (#1126)
Added
- "robust" encoder method can be customized by setting "center", "lower" and "upper" quantiles (#1126)
Published by jdb78 over 3 years ago
https://github.com/sktime/pytorch-forecasting - Multivariate networks
Added
- DeepVar network (#923)
- Enable quantile loss for N-HiTS (#926)
- MQF2 loss (multivariate quantile loss) (#949)
- Non-causal attention for TFT (#949)
- Tweedie loss (#949)
- ImplicitQuantileNetworkDistributionLoss (#995)
Fixed
- Fix learning scale schedule (#912)
- Fix TFT list/tuple issue at interpretation (#924)
- Allowed encoder length down to zero for EncoderNormalizer if transformation is not needed (#949)
- Fix Aggregation and CompositeMetric resets (#949)
Changed
- Dropping Python 3.6 support, adding 3.10 support (#479)
- Refactored dataloader sampling - moved samplers to pytorch_forecasting.data.samplers module (#479)
- Changed transformation format for Encoders to dict from tuple (#949)
Contributors
- jdb78
Published by jdb78 almost 4 years ago
https://github.com/sktime/pytorch-forecasting - Bugfixes
Fixed
- Fix with creating tensors on correct devices (#908)
- Fix with MultiLoss when calculating gradient (#908)
Contributors
- jdb78
Published by jdb78 almost 4 years ago
https://github.com/sktime/pytorch-forecasting - Adding N-HiTS network (N-BEATS successor)
Added
- Added new N-HiTS network that has consistently beaten N-BEATS (#890)
- Allow using torchmetrics as loss metrics (#776)
- Enable fitting EncoderNormalizer() with limited data history using the max_length argument (#782)
- More flexible MultiEmbedding() with convenience output_size and input_size properties (#829)
- Fix concatenation of attention (#902)
Fixed
- Fix pip install via github (#798)
Contributors
- jdb78
- christy
- lukemerrick
- Seon82
Published by jdb78 almost 4 years ago
https://github.com/sktime/pytorch-forecasting - Maintenance Release
Added
- Added support for running pytorch_lightning.trainer.test (#759)
Fixed
- Fix inattention mutation to x_cont (#732)
- Compatibility with pytorch-lightning 1.5 (#758)
Contributors
- eavae
- danielgafni
- jdb78
Published by jdb78 over 4 years ago
https://github.com/sktime/pytorch-forecasting - Maintenance Release (26/09/2021)
Added
- Use target name instead of target number for logging metrics (#588)
- Optimizer can be initialized by passing string, class or function (#602)
- Add support for multiple outputs in Baseline model (#603)
- Added Optuna pruner as optional parameter in TemporalFusionTransformer.optimize_hyperparameters (#619)
- Dropping support for Python 3.6 and starting support for Python 3.9 (#639)
Fixed
- Initialization of TemporalFusionTransformer with multiple targets but loss for only one target (#550)
- Added missing transformation of prediction for MLP (#602)
- Fixed logging hyperparameters (#688)
- Ensure MultiNormalizer fit state is detected (#681)
- Fix infinite loop in TimeDistributedEmbeddingBag (#672)
Contributors
- jdb78
- TKlerx
- chefPony
- eavae
- L0Z1K
Published by jdb78 over 4 years ago
https://github.com/sktime/pytorch-forecasting - Simplified API
Breaking changes
- Removed dropout_categoricals parameter from TimeSeriesDataSet. Use categorical_encoders=dict(<variable_name>=NaNLabelEncoder(add_nan=True)) instead (#518)
- Renamed parameter allow_missings for TimeSeriesDataSet to allow_missing_timesteps (#518)
- Transparent handling of transformations. Forward methods should now call two new methods (#518):
  - transform_output to explicitly rescale the network outputs into the de-normalized space
  - to_network_output to create a dict-like named tuple. This allows tracing the modules with PyTorch's JIT. Only prediction is still required, which is the main network output.
Example:

```python
def forward(self, x):
    normalized_prediction = self.module(x)
    prediction = self.transform_output(prediction=normalized_prediction, target_scale=x["target_scale"])
    return self.to_network_output(prediction=prediction)
```
Added
- Improved validation of input parameters of TimeSeriesDataSet (#518)
Fixed
- Fix quantile prediction for tensors on GPUs for distribution losses (#491)
- Fix hyperparameter update for RecurrentNetwork.from_dataset method (#497)
Published by jdb78 over 4 years ago
https://github.com/sktime/pytorch-forecasting - Generic distribution loss(es)
Added
- Allow lists for multiple losses and normalizers (#405)
- Warn if normalization is with scale < 1e-7 (#429)
- Allow usage of distribution losses in all settings (#434)
Fixed
- Fix issue when predicting and data is on different devices (#402)
- Fix non-iterable output (#404)
- Fix problem with moving data to CPU for multiple targets (#434)
Contributors
- jdb78
- domplexity
Published by jdb78 almost 5 years ago
https://github.com/sktime/pytorch-forecasting - Simple models
Added
- Adding a filter functionality to the timeseries dataset (#329)
- Add simple models such as LSTM, GRU and an MLP on the decoder (#380)
- Allow usage of any torch optimizer such as SGD (#380)
Fixed
- Moving predictions to CPU to avoid running out of memory (#329)
- Correct determination of output_size for multi-target forecasting with the TemporalFusionTransformer (#328)
- Tqdm autonotebook fix to work outside of Jupyter (#338)
- Fix issue with yaml serialization for TensorboardLogger (#379)
Contributors
- jdb78
- JakeForsey
- vakker
Published by jdb78 almost 5 years ago
https://github.com/sktime/pytorch-forecasting - Bugfix release
Added
- Make tuning trainer kwargs overwritable (#300)
- Allow adding categories to NaNEncoder (#303)
Fixed
- Underlying data is copied if modified. Original data is not modified inplace (#263)
- Allow plotting of interpretation on passed figure for NBEATS (#280)
- Fix memory leak for plotting and logging interpretation (#311)
- Correct shape of predict() method output for multi-targets (#268)
- Remove cloudpickle to allow GPU trained models to be loaded on CPU devices from checkpoints (#314)
Contributors
- jdb78
- kigawas
- snumumrik
Published by jdb78 about 5 years ago
https://github.com/sktime/pytorch-forecasting - Fix for output transformer
- Added missing output transformation which was switched off by default (#260)
Published by jdb78 about 5 years ago
https://github.com/sktime/pytorch-forecasting - Adding support for lag variables
Added
- Add "Release Notes" section to docs (#237)
- Enable usage of lag variables for any model (#252)
Changed
- Require PyTorch>=1.7 (#245)
Fixed
- Fix issue for multi-target forecasting when decoder length varies in single batch (#249)
- Enable longer subsequences for min_prediction_idx that were previously wrongfully excluded (#250)
Contributors
- jdb78
Published by jdb78 about 5 years ago
https://github.com/sktime/pytorch-forecasting - Adding multi-target support
Added
- Adding support for multiple targets in the TimeSeriesDataSet (#199) and amended tutorials.
- Temporal fusion transformer and DeepAR with support for multiple targets (#199)
- Check for non-finite values in TimeSeriesDataSet and better validate scaler argument (#220)
- LSTM and GRU implementations that can handle zero-length sequences (#235)
- Helpers for implementing auto-regressive models (#236)
Changed
- TimeSeriesDataSet's y of the dataloader is a tuple of (target(s), weight) - potentially breaking for model or metric implementations. Most implementations will not be affected, as hooks in BaseModel and MultiHorizonMetric were modified.
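The new format can be sketched as follows (placeholder values, not real TimeSeriesDataSet output); code that previously consumed y directly now unpacks the tuple:

```python
# Sketch of the dataloader item format described above: y is now a
# (target(s), weight) tuple instead of a bare target. Values are
# made-up placeholders, not actual TimeSeriesDataSet output.
x = {"encoder_cont": [[0.1, 0.2], [0.3, 0.4]]}  # hypothetical network inputs
y = ([1.0, 2.0], None)  # (target, weight); weight is None when unweighted

# downstream code must unpack the tuple rather than use y directly
target, weight = y
```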
Fixed
- Fixed autocorrelation for pytorch 1.7 (#220)
- Ensure reproducibility by replacing python set() with dict.fromkeys() (mostly TimeSeriesDataSet) (#221)
- Ensure BetaDistributionLoss does not lead to infinite loss if actuals are 0 or 1 (#233)
- Fix for GroupNormalizer if scaling by group (#223)
- Fix for TimeSeriesDataSet when using min_prediction_idx (#226)
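The set() to dict.fromkeys() swap above works because dict.fromkeys() deduplicates while preserving insertion order, making iteration deterministic across runs; a minimal stdlib illustration:

```python
# dict.fromkeys() keeps first-appearance order while deduplicating,
# unlike set(), whose iteration order for strings is not stable across
# runs (e.g. under hash randomization).
group_ids = ["b", "a", "b", "c", "a"]
unique_ordered = list(dict.fromkeys(group_ids))
print(unique_ordered)  # ['b', 'a', 'c']
```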
Contributors
- jdb78
- JustinNeumann
- reumar
- rustyconover
Published by jdb78 about 5 years ago
https://github.com/sktime/pytorch-forecasting - Tutorial on how to implement a new architecture
Added
- Tutorial on how to implement a new architecture covering basic and advanced use cases (#188)
- Additional and improved documentation - particularly of implementation details (#188)
Changed (breaking for new model implementations)
- Moved multiple private methods to public methods (particularly logging) (#188)
- Moved get_mask method from BaseModel into utils module (#188)
- Instead of using a label to communicate if the model is training or validating, use the self.training attribute (#188)
- Use sample((n,)) of pytorch distributions instead of the deprecated sample_n(n) method (#188)
Published by jdb78 about 5 years ago
https://github.com/sktime/pytorch-forecasting - New API for transforming inputs and outputs with encoders
Added
- Beta distribution loss for probabilistic models such as DeepAR (#160)
Changed
- BREAKING: Simplified how to apply transforms (such as logit or log) before and after applying the encoder. Some transformations are included by default, but a tuple of a forward and a reverse transform function can be passed for arbitrary transformations. This requires using a transformation keyword in target normalizers instead of, e.g., log_scale (#185)
Fixed
- Incorrect target position if len(static_reals) > 0, leading to leakage (#184)
- Fix predicting completely unseen series (#172)
Contributors
- jdb78
- JakeForsey
Published by jdb78 about 5 years ago
https://github.com/sktime/pytorch-forecasting - Bugfixes and DeepAR improvements
Added
- Using GRU cells with DeepAR (#153)
Fixed
- GPU fix for variable sequence length (#169)
- Fix incorrect syntax for warning when removing series (#167)
- Fix issue when using unknown group ids in validation or test dataset (#172)
- Run non-failing CI on PRs from forks (#166, #156)
Docs
- Improved model selection guidance and explanations on how TimeSeriesDataSet works (#148)
- Clarify how to use with conda (#168)
Contributors
- jdb78
- JakeForsey
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Adding DeepAR
Added
- DeepAR by Amazon (#115)
  - First autoregressive model in PyTorch Forecasting
  - Distribution loss: normal, negative binomial and log-normal distributions
  - Currently missing: handling lag variables and tutorial (planned for 0.6.1)
- Improved documentation on TimeSeriesDataSet and how to implement a new network (#145)
Changed
- Internals of encoders and how they store center and scale (#115)
Fixed
- Update to PyTorch 1.7 and PyTorch Lightning 1.0.5, which came with breaking changes for CUDA handling and with optimizers (PyTorch Forecasting Ranger version) (#143, #137, #115)
Contributors
- jdb78
- JakeForesey
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Bug fixes
Fixes
- Fix issue where hyperparameter verbosity controlled only part of output (#118)
- Fix occasional error when .get_parameters() from TimeSeriesDataSet failed (#117)
- Remove redundant double pass through LSTM for temporal fusion transformer (#125)
- Prevent installation of pytorch-lightning 1.0.4 as it breaks the code (#127)
- Prevent modification of model defaults in-place (#112)
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Fixes to interpretation and more control over hyperparameter verbosity
Added
- Hyperparameter tuning with optuna added to the tutorial
- Control over verbosity of hyperparameter tuning
Fixes
- Interpretation error when different batches had different maximum decoder lengths
- Fix some typos (no changes to user API)
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - PyTorch Lightning 1.0 compatibility
This release has only one purpose: Allow usage of PyTorch Lightning 1.0 - all tests have passed.
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - PyTorch Lightning 0.10 compatibility and classification
Added
- Additional checks for TimeSeriesDataSet inputs - now flagging if series are lost due to a high min_encoder_length, and ensuring parameters are integers
- Enable classification - simply change the target in the TimeSeriesDataSet to a non-float variable, use the CrossEntropy metric to optimize, and output as many classes as you want to predict
Changed
- Ensured PyTorch Lightning 0.10 compatibility
- Using LearningRateMonitor instead of LearningRateLogger
- Use the EarlyStopping callback in trainer callbacks instead of the early_stopping argument
- Update metric system update() and compute() methods
- Use trainer.tuner.lr_find() instead of trainer.lr_find() in tutorials and examples
- Update poetry to 1.1.0
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Various fixes models and data
Fixes
Model
- Removed attention to current datapoint in TFT decoder to generalise better over various sequence lengths
- Allow resuming optuna hyperparameter tuning study
Data
- Fixed inconsistent naming and calculation of encoder_length in TimeSeriesDataSet when added as a feature
Contributors
- jdb78
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Metrics, performance, and subsequence detection
Added
Models
- Backcast loss for N-BEATS network for better regularisation
- logging_metrics as explicit arguments to models
Metrics
- MASE (Mean absolute scaled error) metric for training and reporting
- Metrics can be composed, e.g. 0.3 * metric1 + 0.7 * metric2
- Aggregation metric that is computed on the mean prediction over all samples to reduce mean-bias
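How weighted composition like 0.3 * metric1 + 0.7 * metric2 can work is sketched below via operator overloading. This is a hypothetical stand-in, not the library's actual Metric/CompositeMetric implementation:

```python
# Minimal sketch of weighted metric composition: __mul__ and __add__
# build a combined metric. Hypothetical classes, not pytorch-forecasting's
# actual Metric implementation.
class Metric:
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, y_pred, y_true):
        return self.fn(y_pred, y_true)

    def __mul__(self, weight):
        return Metric(lambda p, t: weight * self.fn(p, t))

    __rmul__ = __mul__  # allows 0.3 * metric as well as metric * 0.3

    def __add__(self, other):
        return Metric(lambda p, t: self.fn(p, t) + other.fn(p, t))


mae = Metric(lambda p, t: sum(abs(a - b) for a, b in zip(p, t)) / len(p))
mse = Metric(lambda p, t: sum((a - b) ** 2 for a, b in zip(p, t)) / len(p))
combined = 0.3 * mae + 0.7 * mse
print(combined([1.0, 2.0], [1.0, 4.0]))  # ≈ 1.7 (0.3*MAE + 0.7*MSE)
```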
Data
- Increased speed of parsing data with missing datapoints: about 2s for 1M data points; if numba is installed, 0.2s for 1M data points
- Time-synchronize samples in batches: ensure that all samples in each batch have the same time index in the decoder
Breaking changes
- Improved subsequence detection in TimeSeriesDataSet ensures that there exists a subsequence starting and ending on each point in time.
- Fix min_encoder_length = 0 being ignored and processed as min_encoder_length = max_encoder_length
Contributors
- jdb78
- dehoyosb
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - More tests and better docs
- More tests driving coverage to ~90%
- Performance tweaks for temporal fusion transformer
- Reformatting with sort
- Improve documentation - particularly expand on hyperparameter tuning
Fixes
- Fix PoissonLoss quantiles calculation
- Fix N-Beats visualisations
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - More testing and interpretation features
Added
- Calculating partial dependency for a variable
- Improved documentation - in particular added FAQ section and improved tutorial
- Data for examples and tutorials can now be downloaded. Cloning the repo is not a requirement anymore
- Added Ranger Optimizer from the pytorch_ranger package and fixed its warnings (part of preparations for conda package release)
- Use GPU for tests if available, as part of preparation for GPU tests in CI
Changes
- BREAKING: Fix typo "add_decoder_length" to "add_encoder_length" in TimeSeriesDataSet
Bugfixes
- Fixing plotting predictions vs actuals by slicing variables
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Fix edge case in prediction logging
Fixes
Fix bug where predictions were not correctly logged in case of decoder_length == 1.
Additions
Add favicon to docs page
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Make pip installable from master branch
Update build system requirements to be parsed correctly when installing with pip install https://github.com/jdb78/pytorch-forecasting/
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Improving tests
- Add tests for MacOS
- Automatic releases
- Coverage reporting
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Patch release
This release improves robustness of the code.
Fixing bugs across the code, in particular:
- Ensuring that code works on GPUs
- Adding tests for models, dataset and normalisers
- Test using GitHub Actions (tests on GPU are still missing)
Extend documentation by improving docstrings and adding two tutorials.
Improving default arguments for TimeSeriesDataSet to avoid surprises
Published by jdb78 over 5 years ago
https://github.com/sktime/pytorch-forecasting - Minor release
Added
- Basic tests for data and model (mostly integration tests)
- Automatic target normalization
- Improved visualization and logging of temporal fusion transformer
- Model bugfixes and performance improvements for temporal fusion transformer
Modified
- Metrics are reduced to calculating loss. Target transformations are done by new target transformer
Published by jdb78 over 5 years ago