Science Score: 54.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ○ DOI references
- ✓ Academic publication links: Links to: arxiv.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (13.0%) to scientific vocabulary
Repository
A C++ library for neural networks
Basic Info
- Host: GitHub
- Owner: wiegerw
- License: bsl-1.0
- Language: C++
- Default Branch: main
- Size: 2.02 MB
Statistics
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md

nerva-rowwise
nerva-rowwise is a C++ library for implementing and experimenting with neural networks. It is part of the broader Nerva library collection, which includes native Python bindings and tools. Originally developed for research in truly sparse neural networks, nerva-rowwise now also aims to provide a transparent and accessible implementation of core neural network components.
Features
| Feature | Status |
|-----------------------------------------|---------------|
| Row-wise dataset layout (like PyTorch)  | ✅ Supported  |
| Common layers, activations, and losses  | ✅ Supported  |
| Mini-batch training                     | ✅ Supported  |
| Sparse layers using CSR                 | ✅ Supported  |
| Python bindings                         | ✅ Supported  |
| CPU support (Intel MKL backend)         | ✅ Supported  |
| GPU support                             | ❌ Not yet    |
| Convolutional / Transformer layers      | ❌ Not yet    |
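The table mentions sparse layers stored in CSR (Compressed Sparse Row) format. As a quick illustration of what that scheme stores, here is a minimal NumPy sketch of CSR and a sparse matrix-vector product; this is the standard textbook layout, not Nerva's actual internal data structures.

```python
import numpy as np

# A mostly-zero dense weight matrix (3 rows, 4 columns).
W = np.array([
    [0.0, 2.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 3.0],
    [0.0, 0.0, 0.0, 0.0],
])

# CSR stores only the nonzeros: their values, their column
# indices, and row pointers delimiting each row's slice.
values, col_idx, row_ptr = [], [], [0]
for row in W:
    for j, v in enumerate(row):
        if v != 0.0:
            values.append(v)
            col_idx.append(j)
    row_ptr.append(len(values))

def csr_matvec(values, col_idx, row_ptr, x):
    """Compute y = W @ x touching only the stored nonzeros."""
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

x = np.ones(4)
y = csr_matvec(values, col_idx, row_ptr, x)  # agrees with W @ x
```

At 5% density, a layer's weight matrix stores roughly one value in twenty, which is why CSR pays off both in memory and in multiply cost.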
Documentation
Detailed documentation is available for both the C++ and Python interfaces:
- C++ Manual – build instructions, tools, and API.
- Python Manual – usage of the nerva Python module.
- Mathematical Specifications (PDF)
Relevant papers:
- Nerva: a Truly Sparse Implementation of Neural Networks
- Batch Matrix-form Equations and Implementation of Multilayer Perceptrons (🔗 TODO)
Getting Started
C++ users
Install using CMake or B2. See the C++ manual for details.
Python users
Install the Python bindings via pip. See the Python manual for instructions.
Example: Training with the command line tool mlp
```sh
../install/bin/mlp \
    --layers="ReLU;ReLU;Linear" \
    --layer-sizes="3072;1024;1024;10" \
    --layer-weights=Xavier \
    --optimizers="Nesterov(0.9)" \
    --loss=SoftmaxCrossEntropy \
    --learning-rate=0.01 \
    --epochs=100 \
    --batch-size=100 \
    --threads=12 \
    --overall-density=0.05 \
    --dataset=../data/cifar10-flattened.npz \
    --seed=123
```
For full CLI documentation, see the manual section on mlp.
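To get a feel for what `--overall-density=0.05` means for the layer sizes in the example, the following back-of-the-envelope sketch counts dense weights between consecutive layers and the overall budget kept at 5% density. How Nerva distributes that budget across individual layers is not specified here; this only shows the total.

```python
# Layer sizes from --layer-sizes="3072;1024;1024;10"
sizes = [3072, 1024, 1024, 10]

# Dense weight count of each layer is (inputs x outputs).
dense_per_layer = [m * n for m, n in zip(sizes, sizes[1:])]
total_dense = sum(dense_per_layer)      # total dense weights
total_kept = round(0.05 * total_dense)  # weights kept at 5% density

print(dense_per_layer, total_dense, total_kept)
```

So the network trains with roughly 210k weights instead of the 4.2M a dense MLP of the same shape would carry.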
Design Philosophy
The library is built for:
- Research in sparse training (e.g., pruning/growth algorithms)
- Transparency: backpropagation is implemented explicitly (no autograd)
- Modularity: the core operations rely on a small set of primitive matrix operations
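The "no autograd" point means gradients are written out by hand rather than traced. As an illustration of that style (a generic sketch in the row-wise layout the library uses, not Nerva's actual API), here is a linear layer with squared-error loss, its explicit backward pass, and a finite-difference check on one weight:

```python
import numpy as np

rng = np.random.default_rng(0)

# Row-wise layout: each row of X is one example.
N, D_in, D_out = 4, 3, 2
X = rng.standard_normal((N, D_in))
W = rng.standard_normal((D_in, D_out))
b = np.zeros(D_out)
T = rng.standard_normal((N, D_out))   # targets

# Forward: linear layer + squared-error loss.
Y = X @ W + b
loss = 0.5 * np.sum((Y - T) ** 2)

# Backward, written out explicitly (no autograd):
dY = Y - T            # dL/dY
dW = X.T @ dY         # dL/dW
db = dY.sum(axis=0)   # dL/db
dX = dY @ W.T         # dL/dX (passed to the previous layer)

# Sanity check: finite difference on W[0, 0].
eps = 1e-6
W[0, 0] += eps
loss2 = 0.5 * np.sum((X @ W + b - T) ** 2)
assert abs((loss2 - loss) / eps - dW[0, 0]) < 1e-3
```

Keeping every gradient formula visible like this is what makes the implementation easy to inspect and to modify for sparse-training experiments.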
Performance
- Focused on CPU performance via Intel MKL
- Implementation modularity makes it easy to experiment with other matrix backends
- Dynamic sparse training is fully supported
Future Work
- GPU implementation for dynamic sparse training
- Support for convolutional, pooling, and transformer layers
- Expanded layer/activation/loss library
Comparison with Other Frameworks
Nerva is not a drop-in replacement for PyTorch or TensorFlow. Instead, it is intended for:
- Research into sparsity and neural net structure
- Educational use to understand backpropagation and architecture
- Users who need explicit control over the neural network implementation
Contact
Questions or contributions welcome!
Contact: Wieger Wesselink (j.w.wesselink@tue.nl)
Owner
- Name: Wieger Wesselink
- Login: wiegerw
- Kind: user
- Repositories: 4
- Profile: https://github.com/wiegerw
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: Wesselink
    given-names: J.W.
    orcid: https://orcid.org/0009-0001-6746-5115
title: "Nerva library"
version: 0.1.0
date-released: 2024-08-28
url: "https://github.com/wiegerw/nerva-rowwise"
GitHub Events
Total
- Watch event: 1
- Push event: 102
Last Year
- Watch event: 1
- Push event: 102