anira
an architecture for neural network inference in real-time audio applications
Science Score: 57.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 5 DOI reference(s) in README
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (11.1%) to scientific vocabulary
Keywords
Repository
an architecture for neural network inference in real-time audio applications
Basic Info
- Host: GitHub
- Owner: anira-project
- License: apache-2.0
- Language: C++
- Default Branch: main
- Homepage: https://anira-project.github.io/anira/
- Size: 1.27 MB
Statistics
- Stars: 179
- Watchers: 4
- Forks: 7
- Open Issues: 6
- Releases: 23
Topics
Metadata Files
README.md
Anira is a high-performance library designed to enable easy, real-time safe integration of neural network inference within audio applications. Compatible with multiple inference backends (LibTorch, ONNXRuntime, and TensorFlow Lite), anira bridges the gap between advanced neural network architectures and real-time audio processing. The paper provides more information about anira's architecture and design decisions, as well as extensive performance evaluations using the built-in benchmarking capabilities.
Documentation
Extensive documentation of anira can be found at https://anira-project.github.io/anira/.
Features
- Real-time Safe Execution: Ensures deterministic runtimes suitable for real-time audio applications
- Thread Pool Management: Utilizes a static thread pool to avoid oversubscription and enables efficient parallel inference
- Minimal Latency: Designed to minimize latency while maintaining real-time safety
- Built-in Benchmarking: Includes tools for evaluating the real-time performance of neural networks
- Comprehensive Inference Engine Support: Integrates the common inference engines LibTorch, ONNXRuntime, and TensorFlow Lite
- Flexible Neural Network Integration: Supports a variety of neural network models, including stateful and stateless models
- Cross-Platform Compatibility: Works seamlessly on macOS, Linux, and Windows
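The real-time safety and thread-pool features above hinge on one idea: the audio callback must never lock, allocate, or wait on the inference thread. anira's actual mechanism lives inside the library, but the kind of lock-free handoff such a design requires can be sketched with a minimal single-producer single-consumer ring buffer (names and sizes here are illustrative, not anira's API):

```cpp
#include <array>
#include <atomic>
#include <cstddef>

// Minimal single-producer single-consumer ring buffer: the real-time audio
// thread pushes samples without locking or allocating, and an inference
// thread pops them. Illustrative only -- not anira's implementation.
template <typename T, std::size_t Capacity>
class SpscRingBuffer {
public:
    bool push(const T& value) {
        const auto head = m_head.load(std::memory_order_relaxed);
        const auto next = (head + 1) % Capacity;
        if (next == m_tail.load(std::memory_order_acquire))
            return false; // buffer full: the audio thread must not block
        m_data[head] = value;
        m_head.store(next, std::memory_order_release);
        return true;
    }

    bool pop(T& value) {
        const auto tail = m_tail.load(std::memory_order_relaxed);
        if (tail == m_head.load(std::memory_order_acquire))
            return false; // buffer empty
        value = m_data[tail];
        m_tail.store((tail + 1) % Capacity, std::memory_order_release);
        return true;
    }

private:
    std::array<T, Capacity> m_data{};
    std::atomic<std::size_t> m_head{0};
    std::atomic<std::size_t> m_tail{0};
};
```

Because both `push` and `pop` are wait-free for their single producer and consumer, the audio thread's worst-case cost stays bounded, which is the property "deterministic runtimes" refers to.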
Usage
The basic usage of anira is as follows:
```cpp
#include <anira/anira.h>

anira::InferenceConfig inference_config(
    {{"path/to/your/model.onnx", anira::InferenceBackend::ONNX}}, // Model path and backend
    {{{256, 1, 1}}, {{256, 1}}},                                  // Input and output shapes
    5.33f                                                         // Maximum inference time in ms
);

// Create a pre- and post-processor instance
anira::PrePostProcessor pp_processor(inference_config);

// Create an InferenceHandler instance
anira::InferenceHandler inference_handler(pp_processor, inference_config);

// Pass the host configuration and allocate memory for audio processing
inference_handler.prepare({buffer_size, sample_rate});

// Select the inference backend
inference_handler.set_inference_backend(anira::InferenceBackend::ONNX);

// Optionally get the latency of the inference process in samples
unsigned int latency_in_samples = inference_handler.get_latency();

// Real-time safe audio processing in the process callback of your application
void process(float** audio_data, int num_samples) {
    inference_handler.process(audio_data, num_samples);
}
// audio_data now contains the processed audio samples
```
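The 5.33 ms maximum inference time in the snippet matches the duration of one 256-sample buffer at a 48 kHz sample rate (the sample rate is our assumption; the snippet does not state one): 256 / 48000 × 1000 ≈ 5.33 ms. A quick sanity check of that arithmetic:

```cpp
// Duration of one host buffer in milliseconds: samples / sample_rate * 1000.
// The 48 kHz sample rate below is an assumed value for illustration.
double buffer_duration_ms(int buffer_size, double sample_rate) {
    return buffer_size / sample_rate * 1000.0;
}
// buffer_duration_ms(256, 48000.0) is approximately 5.33
```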
Installation
Anira can be easily integrated into your CMake project. You can either add anira as a submodule, download the pre-built binaries from the releases page, or build from source.
Option 1: Add as Git Submodule (Recommended)
```bash
# Add the anira repo as a submodule
git submodule add https://github.com/anira-project/anira.git modules/anira
```
In your CMakeLists.txt:
```cmake
# Set up your project and target
project(your_project)
add_executable(your_target main.cpp ...)

# Add anira as a subdirectory
add_subdirectory(modules/anira)

# Link your target to the anira library
target_link_libraries(your_target anira::anira)
```
Option 2: Use Pre-built Binaries
Download pre-built binaries from the releases page.
In your CMakeLists.txt:
```cmake
# Set up your project and target
project(your_project)
add_executable(your_target main.cpp ...)

# Add the path to the anira library to the CMake prefix path and find the package
list(APPEND CMAKE_PREFIX_PATH "path/to/anira")
find_package(anira REQUIRED)

# Link your target to the anira library
target_link_libraries(your_target anira::anira)
```
Option 3: Build from Source
```bash
git clone https://github.com/anira-project/anira.git
cd anira
cmake . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build --config Release --target anira
cmake --install build --prefix /path/to/install/directory
```
Build options
By default, all three inference engines are installed. You can disable specific backends as needed:
- LibTorch: `-DANIRA_WITH_LIBTORCH=OFF`
- ONNXRuntime: `-DANIRA_WITH_ONNXRUNTIME=OFF`
- TensorFlow Lite: `-DANIRA_WITH_TFLITE=OFF`

Moreover, the following options are available:
- Build anira with benchmark capabilities: `-DANIRA_WITH_BENCHMARK=ON`
- Build example applications and plugins, and populate example neural models: `-DANIRA_WITH_EXAMPLES=ON`
- Build anira with tests: `-DANIRA_WITH_TESTS=ON`
- Build anira with documentation: `-DANIRA_WITH_DOCS=ON`
- Disable the logging system: `-DANIRA_WITH_LOGGING=OFF`
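These options combine in a single configure step. As an illustration (the particular selection of flags here is our own, not a recommended configuration), a build that keeps only the ONNXRuntime backend and enables the benchmarks might look like:

```shell
# Configure: disable the LibTorch and TensorFlow Lite backends, keep
# ONNXRuntime (on by default), and enable the benchmarking tools.
cmake . -B build -DCMAKE_BUILD_TYPE=Release \
    -DANIRA_WITH_LIBTORCH=OFF \
    -DANIRA_WITH_TFLITE=OFF \
    -DANIRA_WITH_BENCHMARK=ON

# Build the library with that configuration
cmake --build build --config Release --target anira
```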
Examples
Built-in examples
- Simple JUCE Audio Plugin: Demonstrates how to use anira in a real-time audio JUCE / VST3-Plugin.
- CLAP Plugin Example: Demonstrates how to use anira in a real-time clap plugin.
- Benchmark: Demonstrates how to use anira for benchmarking of different neural network models, backends and audio configurations.
- Minimal Inference: Demonstrates how minimal inference applications can be implemented in all three backends.
Other examples
- nn-inference-template: Another JUCE / VST3 plugin that uses anira for real-time safe neural network inference. This plugin is more complex than the simple JUCE audio plugin example and has a more appealing GUI.
Real-time safety
anira's real-time safety is verified in this repository with the RealtimeSanitizer (rtsan).
Citation
If you use anira in your research or project, please cite either the paper or the software itself:
```bibtex
@inproceedings{ackvaschulz2024anira,
    author={Ackva, Valentin and Schulz, Fares},
    booktitle={2024 IEEE 5th International Symposium on the Internet of Sounds (IS2)},
    title={ANIRA: An Architecture for Neural Network Inference in Real-Time Audio Applications},
    year={2024},
    volume={},
    number={},
    pages={1-10},
    publisher={IEEE},
    doi={10.1109/IS262782.2024.10704099}
}

@software{ackvaschulz2024anira,
    author = {Valentin Ackva and Fares Schulz},
    title = {anira: an architecture for neural network inference in real-time audio application},
    url = {https://github.com/anira-project/anira},
    version = {x.x.x},
    year = {2024},
}
```
Contributors
License
This project is licensed under Apache-2.0.
Owner
- Name: anira-project
- Login: anira-project
- Kind: organization
- Repositories: 1
- Profile: https://github.com/anira-project
Citation (CITATION.cff)
cff-version: 1.2.0
message: If you use this software, please cite both the article from preferred-citation and the software itself.
authors:
  - family-names: Ackva
    given-names: Valentin
  - family-names: Schulz
    given-names: Fares
title: 'ANIRA: An Architecture for Neural Network Inference in Real-Time Audio Applications'
version: 1.0.0
doi: 10.1109/IS262782.2024.10704099
date-released: '2024-11-09'
preferred-citation:
  authors:
    - family-names: Ackva
      given-names: Valentin
    - family-names: Schulz
      given-names: Fares
  title: 'ANIRA: An Architecture for Neural Network Inference in Real-Time Audio Applications'
  doi: 10.1109/IS262782.2024.10704099
  type: conference-paper
  pages: 1-10
  year: '2024'
  collection-title: 2024 IEEE 5th International Symposium on the Internet of Sounds (IS2)
  conference: {}
  publisher:
    name: IEEE
GitHub Events
Total
- Create event: 31
- Release event: 11
- Issues event: 12
- Watch event: 63
- Delete event: 27
- Member event: 2
- Issue comment event: 42
- Push event: 247
- Pull request review comment event: 1
- Pull request review event: 3
- Pull request event: 41
- Fork event: 5
Last Year
- Create event: 31
- Release event: 11
- Issues event: 12
- Watch event: 63
- Delete event: 27
- Member event: 2
- Issue comment event: 42
- Push event: 247
- Pull request review comment event: 1
- Pull request review event: 3
- Pull request event: 41
- Fork event: 5
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 3
- Total pull requests: 13
- Average time to close issues: 13 days
- Average time to close pull requests: about 6 hours
- Total issue authors: 3
- Total pull request authors: 2
- Average comments per issue: 1.33
- Average comments per pull request: 0.08
- Merged pull requests: 6
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 3
- Pull requests: 13
- Average time to close issues: 13 days
- Average time to close pull requests: about 6 hours
- Issue authors: 3
- Pull request authors: 2
- Average comments per issue: 1.33
- Average comments per pull request: 0.08
- Merged pull requests: 6
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- mchagneux (6)
- leoauri (2)
- MarcoRavich (1)
- Andonvr (1)
- olilarkin (1)
- mrcelli (1)
Pull Request Authors
- faressc (12)
- vackva (7)
- themaxw (1)
- olilarkin (1)
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- actions/checkout v4 composite
- actions/download-artifact v4 composite
- actions/upload-artifact v4 composite
- mozilla-actions/sccache-action v0.0.3 composite
- softprops/action-gh-release v2 composite
