https://github.com/dmetivie/FluxArchitectures.jl
Complex neural network examples for Flux.jl
Science Score: 10.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 9.9%, to scientific vocabulary)
Last synced: 5 months ago
Repository
Complex neural network examples for Flux.jl
Basic Info
- Host: GitHub
- Owner: dmetivie
- License: MIT
- Language: Julia
- Default Branch: master
- Size: 44.3 MB
Statistics
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Releases: 0
Fork of sdobber/FluxArchitectures.jl
Created almost 2 years ago · Last pushed almost 2 years ago
https://github.com/dmetivie/FluxArchitectures.jl/blob/master/
# FluxArchitectures

[Documentation](https://sdobber.github.io/FluxArchitectures.jl/dev) · [Build status](https://github.com/sdobber/FluxArchitectures.jl/actions) · [Coverage](https://codecov.io/gh/sdobber/FluxArchitectures.jl)

Complex neural network examples for Flux.jl. This package contains a loose collection of (slightly) more advanced neural network architectures, mostly centered around time series forecasting.

## Installation

To install FluxArchitectures, type `]` to activate the package manager, and type

```julia
add FluxArchitectures
```

for installation. After `using FluxArchitectures`, the following functions are exported:

* `prepare_data`
* `get_data`
* `DARNN`
* `DSANet`
* `LSTnet`
* `TPALSTM`

See their docstrings, the [documentation](https://sdobber.github.io/FluxArchitectures.jl/stable), and the `examples` folder for details.

## Models

* **LSTnet**: This "Long- and Short-term Time-series Network" follows the paper by [Lai et al.](https://arxiv.org/abs/1703.07015).
* **DARNN**: The "Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction" is based on the paper by [Qin et al.](https://arxiv.org/abs/1704.02971).
* **TPA-LSTM**: The Temporal Pattern Attention LSTM network is based on the paper "Temporal Pattern Attention for Multivariate Time Series Forecasting" by [Shih et al.](https://arxiv.org/pdf/1809.04206v2.pdf).
* **DSANet**: The "Dual Self-Attention Network for Multivariate Time Series Forecasting" is based on the paper by [Siteng Huang et al.](https://kyonhuang.top/files/Huang-DSANet.pdf)

## Quickstart

Activate the package and load some sample data:

```julia
using FluxArchitectures
poollength = 10; horizon = 15; datalength = 1000;
input, target = get_data(:exchange_rate, poollength, datalength, horizon)
```

Define a model and a loss function:

```julia
model = LSTnet(size(input, 1), 2, 3, poollength, 120)
loss(x, y) = Flux.mse(model(x), y')
```

Train the model:

```julia
Flux.train!(loss, Flux.params(model), Iterators.repeated((input, target), 20), Adam(0.01))
```
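After training, the model can be evaluated by calling it on the input, just as the loss function above does. The following is a minimal sketch, assuming the Quickstart has already been run so that `model`, `input`, and `target` are in scope (the variable name `prediction` is illustrative, not part of the package API):

```julia
# Sketch: evaluate the trained model from the Quickstart above.
# Assumes `model`, `input`, and `target` are already defined.

# Calling the model on the pooled input produces the forecast;
# as in the loss function, it is compared against the transposed
# target vector.
prediction = model(input)

# Report the mean squared error after training.
println("MSE: ", Flux.mse(prediction, target'))
```

The transpose in `target'` matches the orientation used in the Quickstart's loss definition, so the same comparison works for reporting the final error.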
Owner
- Name: David Métivier
- Login: dmetivie
- Kind: user
- Location: Montpellier, France
- Company: INRAe, MISTEA
- Website: http://www.cmap.polytechnique.fr/~david.metivier/
- Repositories: 5
- Profile: https://github.com/dmetivie
I am a research scientist with a physics background. Now, I do statistics to tackle environmental and climate-change problems. Julia enthusiast!