experimental-deepriver
Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 7 DOI reference(s) in README
- ✓ Academic publication links: links to joss.theoj.org, zenodo.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (15.4%) to scientific vocabulary
Repository
Basic Info
- Host: GitHub
- Owner: joseEnrique
- License: bsd-3-clause
- Language: Python
- Default Branch: main
- Size: 894 KB
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
deep-river is a Python library for online deep learning. Its ambition is to enable online machine learning for neural networks by combining the river API with the capability of designing neural networks based on PyTorch.
📚 Documentation
The documentation provides an overview of the repository's full feature list, along with examples showing how each feature is used. As we are always looking for further use cases and examples, feel free to contribute to the documentation or to the repository itself via a pull request.
💈 Installation
```shell
pip install deep-river
```
or
```shell
pip install "river[deep]"
```
You can install the latest development version from GitHub like so:
```shell
pip install https://github.com/online-ml/deep-river/archive/refs/heads/master.zip
```
🍫 Quickstart
We build the development of neural networks on top of the river API and adhere to river's design principles. The following example creates a simple MLP architecture based on PyTorch and incrementally predicts and trains on the website phishing dataset. For further examples, check out the documentation.
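Before the full PyTorch examples, the predict-then-train loop itself is worth spelling out: in the river protocol, a model sees each sample once, predicts first, and only then learns from it (progressive validation). The following is a minimal sketch of that protocol using a toy running-mean regressor in plain Python; it is purely illustrative and is not part of the deep-river API.

```python
# Minimal illustration of the river-style online learning protocol that
# deep-river follows: models expose predict_one / learn_one and are
# updated one sample at a time. This toy regressor (a running mean of
# the targets) is NOT a deep-river class, just a stand-in for the pattern.
class RunningMeanRegressor:
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def predict_one(self, x):
        # Predict before learning, as in progressive validation.
        return self.mean

    def learn_one(self, x, y):
        # Incremental (one-pass) update of the running mean.
        self.n += 1
        self.mean += (y - self.mean) / self.n


model = RunningMeanRegressor()
stream = [({"f": i}, float(i)) for i in range(5)]  # toy data stream

preds = []
for x, y in stream:
    preds.append(model.predict_one(x))  # predict first...
    model.learn_one(x, y)               # ...then learn from the sample

print(preds)  # → [0.0, 0.0, 0.5, 1.0, 1.5]
```

The same predict-first, learn-second ordering appears in the classification example below; it is what makes the reported accuracy an honest out-of-sample estimate.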
Classification
```python
>>> from river import metrics, datasets, preprocessing, compose
>>> from deep_river import classification
>>> from torch import nn
>>> from torch import optim
>>> from torch import manual_seed

>>> _ = manual_seed(42)

>>> class MyModule(nn.Module):
...     def __init__(self, n_features):
...         super(MyModule, self).__init__()
...         self.dense0 = nn.Linear(n_features, 5)
...         self.nonlin = nn.ReLU()
...         self.dense1 = nn.Linear(5, 2)
...         self.softmax = nn.Softmax(dim=-1)
...
...     def forward(self, X, **kwargs):
...         X = self.nonlin(self.dense0(X))
...         X = self.nonlin(self.dense1(X))
...         X = self.softmax(X)
...         return X

>>> model_pipeline = compose.Pipeline(
...     preprocessing.StandardScaler(),
...     classification.ClassifierInitialized(
...         module=MyModule(10),
...         loss_fn='binary_cross_entropy',
...         optimizer_fn='adam'
...     )
... )

>>> dataset = datasets.Phishing()
>>> metric = metrics.Accuracy()

>>> for x, y in dataset:
...     y_pred = model_pipeline.predict_one(x)  # make a prediction
...     metric.update(y, y_pred)  # update the metric
...     model_pipeline.learn_one(x, y)  # make the model learn

>>> print(f"Accuracy: {metric.get():.4f}")
Accuracy: 0.7264
```
Multi Target Regression
```python
>>> from river import evaluate, compose
>>> from river import metrics
>>> from river import preprocessing
>>> from river import stream
>>> from sklearn import datasets
>>> from torch import nn
>>> from deep_river.regression.multioutput import MultiTargetRegressorInitialized

>>> class MyModule(nn.Module):
...     def __init__(self, n_features):
...         super(MyModule, self).__init__()
...         self.dense0 = nn.Linear(n_features, 3)
...
...     def forward(self, X, **kwargs):
...         X = self.dense0(X)
...         return X

>>> dataset = stream.iter_sklearn_dataset(
...     dataset=datasets.load_linnerud(),
...     shuffle=True,
...     seed=42
... )
>>> model = compose.Pipeline(
...     preprocessing.StandardScaler(),
...     MultiTargetRegressorInitialized(
...         module=MyModule(10),
...         loss_fn='mse',
...         lr=0.3,
...         optimizer_fn='sgd',
...     ))
>>> metric = metrics.multioutput.MicroAverage(metrics.MAE())
>>> ev = evaluate.progressive_val_score(dataset, model, metric)
>>> print(f"MicroAverage(MAE): {metric.get():.2f}")
MicroAverage(MAE): 34.31
```
Anomaly Detection
```python
>>> from deep_river.anomaly import AutoencoderInitialized
>>> from river import metrics
>>> from river.datasets import CreditCard
>>> from torch import nn
>>> from river.compose import Pipeline
>>> from river.preprocessing import MinMaxScaler

>>> dataset = CreditCard().take(5000)
>>> metric = metrics.RollingROCAUC(window_size=5000)

>>> class MyAutoEncoder(nn.Module):
...     def __init__(self, n_features, latent_dim=3):
...         super(MyAutoEncoder, self).__init__()
...         self.linear1 = nn.Linear(n_features, latent_dim)
...         self.nonlin = nn.LeakyReLU()
...         self.linear2 = nn.Linear(latent_dim, n_features)
...         self.sigmoid = nn.Sigmoid()
...
...     def forward(self, X, **kwargs):
...         X = self.linear1(X)
...         X = self.nonlin(X)
...         X = self.linear2(X)
...         return self.sigmoid(X)

>>> ae = AutoencoderInitialized(module=MyAutoEncoder(10), lr=0.005)
>>> scaler = MinMaxScaler()
>>> model = Pipeline(scaler, ae)

>>> for x, y in dataset:
...     score = model.score_one(x)
...     model.learn_one(x=x)
...     metric.update(y, score)

>>> print(f"Rolling ROCAUC: {metric.get():.4f}")
Rolling ROCAUC: 0.8901
```
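Autoencoder-based detectors like the one above typically score samples by reconstruction error: the model learns to reproduce normal traffic, so inputs it reconstructs poorly receive high anomaly scores. A minimal sketch of that scoring idea in plain Python (this is the general technique, not deep-river's internal implementation):

```python
def reconstruction_error(x, x_hat):
    # Mean squared error between an input vector and its reconstruction;
    # higher values indicate the sample is less like the data the
    # autoencoder was trained on, i.e. more anomalous.
    return sum((a - b) ** 2 for a, b in zip(x, x_hat)) / len(x)

# Toy comparison: a near-perfect reconstruction vs. a poor one.
normal = reconstruction_error([0.1, 0.2, 0.3], [0.11, 0.19, 0.31])
anomalous = reconstruction_error([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])
print(normal < anomalous)  # → True: the poorly reconstructed sample scores higher
```

Because the score is unsupervised, the labels `y` in the example above are used only to evaluate the detector via ROC AUC, never to train it.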
💬 Citation
To acknowledge the use of the DeepRiver library in your research, please refer to our paper published in the Journal of Open Source Software (JOSS):
```bibtex
@article{Kulbach2025,
  doi = {10.21105/joss.07226},
  url = {https://doi.org/10.21105/joss.07226},
  year = {2025},
  publisher = {The Open Journal},
  volume = {10},
  number = {105},
  pages = {7226},
  author = {Cedric Kulbach and Lucas Cazzonelli and Hoang-Anh Ngo and Max Halford and Saulo Martiello Mastelini},
  title = {DeepRiver: A Deep Learning Library for Data Streams},
  journal = {Journal of Open Source Software}
}
```
🏫 Affiliations
Owner
- Name: Jose Enrique Ruiz Navarro
- Login: joseEnrique
- Kind: user
- Location: Seville
- Website: http://quiqueruiz.com
- Repositories: 8
- Profile: https://github.com/joseEnrique
Citation (CITATION.cff)
cff-version: "1.2.0"
authors:
  - family-names: Kulbach
    given-names: Cedric
    orcid: "https://orcid.org/0000-0002-9363-4728"
  - family-names: Cazzonelli
    given-names: Lucas
    orcid: "https://orcid.org/0000-0003-2886-1219"
  - family-names: Ngo
    given-names: Hoang-Anh
    orcid: "https://orcid.org/0000-0002-7583-753X"
  - family-names: Halford
    given-names: Max
    orcid: "https://orcid.org/0000-0003-1464-4520"
  - family-names: Mastelini
    given-names: Saulo Martiello
    orcid: "https://orcid.org/0000-0002-0092-3572"
contact:
  - family-names: Kulbach
    given-names: Cedric
    orcid: "https://orcid.org/0000-0002-9363-4728"
  - family-names: Ngo
    given-names: Hoang-Anh
    orcid: "https://orcid.org/0000-0002-7583-753X"
doi: 10.5281/zenodo.14601979
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
    - family-names: Kulbach
      given-names: Cedric
      orcid: "https://orcid.org/0000-0002-9363-4728"
    - family-names: Cazzonelli
      given-names: Lucas
      orcid: "https://orcid.org/0000-0003-2886-1219"
    - family-names: Ngo
      given-names: Hoang-Anh
      orcid: "https://orcid.org/0000-0002-7583-753X"
    - family-names: Halford
      given-names: Max
      orcid: "https://orcid.org/0000-0003-1464-4520"
    - family-names: Mastelini
      given-names: Saulo Martiello
      orcid: "https://orcid.org/0000-0002-0092-3572"
  date-published: 2025-01-06
  doi: 10.21105/joss.07226
  issn: 2475-9066
  issue: 105
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 7226
  title: "DeepRiver: A Deep Learning Library for Data Streams"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.07226"
  volume: 10
title: "DeepRiver: A Deep Learning Library for Data Streams"
GitHub Events
Total
- Push event: 1
- Create event: 2
Last Year
- Push event: 1
- Create event: 2
Dependencies
- actions/cache v3 composite
- actions/checkout v2 composite
- snok/install-poetry v1 composite
- actions/checkout v4 composite
- actions/setup-python v4 composite
- pypa/gh-action-pypi-publish 27b31702a0e7fc50959f5ad993c78deac1bdfc29 composite
- snok/install-poetry v1 composite
- actions/cache v3 composite
- actions/checkout v4 composite
- actions/setup-python v2 composite
- codecov/codecov-action v2 composite
- snok/install-poetry v1 composite
- 252 dependencies
- black >=24.8.0 develop
- codecov >=2.1.13 develop
- dominate * develop
- flake8 >=7.1.1 develop
- flask >=3.0.2 develop
- graphviz >=0.20.3 develop
- ipykernel >=6.9.0 develop
- ipython_genutils >=0.1.0 develop
- isort >=5.13.2 develop
- jinja2 >=3.0.3 develop
- jupyter >=1.0.0 develop
- jupyter-client * develop
- jupyter_contrib_nbextensions 0.7.0 develop
- matplotlib >=3.9.2 develop
- mike >=0.5.3 develop
- mkdocs >=1.2.3 develop
- mkdocs-awesome-pages-plugin >=2.7.0 develop
- mkdocs-charts-plugin >=0.0.8 develop
- mkdocs-gen-files >=0.3.5 develop
- mkdocs-jupyter >=0.20.0 develop
- mkdocs-literate-nav >=0.4.1 develop
- mkdocs-material >=8.1.11 develop
- mypy >=1.11.1 develop
- nbconvert >=6.4.2 develop
- notebook ==6.4.3 develop
- numpydoc >=1.2 develop
- pre-commit >=3.8.0 develop
- pytest >=8.3.2 develop
- pytest-cov >=5.0.0 develop
- python-slugify * develop
- pyupgrade ==3.17.0 develop
- spacy >=3.2.2 develop
- watermark ==2.3.1 develop
- mkdocstrings >=0.19.0
- numpy ~1.26.4
- pandas ~2.2.2
- python >=3.10,<3.13
- pytkdocs >=0.5.0
- river *
- scikit-learn ~1.5.0
- sortedcontainers ^2.4.0
- torch ==2.2.2
- torchviz ~0.0.2
- tqdm ~4.66.5