jaxsnn

jaxsnn is an event-based approach to machine-learning-inspired training and simulation of SNNs, including support for the BrainScaleS-2 neuromorphic backend.

https://github.com/electronicvisions/jaxsnn

Science Score: 75.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: frontiersin.org
  • Academic email domains
  • Institutional organization owner
    Organization electronicvisions has institutional domain (www.kip.uni-heidelberg.de)
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.6%) to scientific vocabulary
Last synced: 6 months ago

Repository

jaxsnn is an event-based approach to machine-learning-inspired training and simulation of SNNs, including support for the BrainScaleS-2 neuromorphic backend.

Basic Info
  • Host: GitHub
  • Owner: electronicvisions
  • License: lgpl-2.1
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 5.89 MB
Statistics
  • Stars: 21
  • Watchers: 3
  • Forks: 2
  • Open Issues: 0
  • Releases: 2
Created almost 3 years ago · Last pushed 7 months ago
Metadata Files
Readme License Citation

README.md


jaxsnn

jaxsnn (pronounced like Jackson /ˈdʒæksən/) is an event-based approach to machine-learning-inspired training and simulation of SNNs, including support for neuromorphic backends (BrainScaleS-2). We build upon jax, a Python library providing autograd and XLA functionality for high-performance machine learning research.

Installation

We provide a PyPI build of the software that lacks support for the BrainScaleS-2 neuromorphic hardware system. Installation via `pip install jaxsnn` should work in most environments, but your mileage may vary.

Building the Software

The software builds upon existing libraries, such as jax, optax, and tree-math. When using the neuromorphic BrainScaleS-2 backend, the software stack of the platform is required.

We provide a container image (based on the Apptainer format) including all build-time and runtime dependencies. Feel free to download the most recent version from here.

For all following steps, we assume that the most recent Apptainer container is located at /containers/stable/latest.

Github-based Build

To build this project from public resources, adhere to the following guide:

```shell
# 1) Most of the following steps will be executed within an apptainer container.
#    To keep the steps clutter-free, we start by defining an alias
shopt -s expand_aliases
alias c="apptainer exec --app dls /containers/stable/latest"

# 2) Prepare a fresh workspace and change directory into it
mkdir workspace && cd workspace

# 3) Fetch a current copy of the symwaf2ic build tool
git clone https://github.com/electronicvisions/waf -b symwaf2ic symwaf2ic

# 4) Build symwaf2ic
c make -C symwaf2ic
ln -s symwaf2ic/waf

# 5) Setup your workspace and clone all dependencies (--clone-depth=1 to skip history)
c ./waf setup --repo-db-url=https://github.com/electronicvisions/projects --project=jaxsnn

# 6) Load the PPU cross-compiler toolchain (or build https://github.com/electronicvisions/oppulance)
module load ppu-toolchain

# 7) Build the project
#    Adjust -j1 to your own needs, but beware that high parallelism increases memory consumption!
c ./waf configure
c ./waf build -j1

# 8) Install the project to ./bin and ./lib
c ./waf install

# 9) If you run programs outside waf, you'll need to add ./lib and ./bin to your path specifications
export APPTAINERENV_PREPEND_PATH=$(pwd)/bin:$APPTAINERENV_PREPEND_PATH
export APPTAINERENV_LD_LIBRARY_PATH=$(pwd)/lib:$APPTAINERENV_LD_LIBRARY_PATH
export PYTHONPATH=$(pwd)/lib:$PYTHONPATH
export PYTHONPATH=$(pwd)/lib/python3.10/site-packages:$PYTHONPATH

# 10) To validate that your build was successful, execute the following example
python -m jaxsnn.examples.event.yinyang_analytical
```

Structure

jaxsnn is split into two parts, a time-discrete and a time-continuous one. In both, training of SNNs is done in the init/apply style.
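
The init/apply style can be sketched as follows. This is a hedged illustration of the general pattern, not the actual jaxsnn API: a layer constructor returns a pair of pure functions, and parameters are passed around explicitly instead of being stored on objects. Plain Python is used here for brevity; jaxsnn builds the same pattern on jax arrays.

```python
import random

def linear(n_in, n_out):
    """Illustrative layer constructor in the init/apply style (names are ours)."""
    def init(seed):
        # Create the parameters; all state lives outside the functions.
        rng = random.Random(seed)
        return [[rng.gauss(0.0, 1.0) for _ in range(n_out)] for _ in range(n_in)]

    def apply(params, x):
        # Pure forward pass: the output depends only on the explicit inputs.
        return [sum(x[i] * params[i][j] for i in range(len(x)))
                for j in range(len(params[0]))]

    return init, apply

init_fn, apply_fn = linear(4, 2)
params = init_fn(seed=0)
out = apply_fn(params, [1.0, 1.0, 1.0, 1.0])  # a vector of length 2
```

Because both functions are pure, they compose well with functional transformations such as jax's `grad`, `jit`, and `vmap`.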

Time Discrete

jaxsnn.discrete simulates SNNs by treating time in a discrete way. It uses Euler steps of a fixed size to advance the network forward in time, drawing inspiration from norse.
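
The fixed-step idea can be sketched for a single leaky integrate-and-fire (LIF) neuron. All names and constants below are illustrative, not the jaxsnn.discrete API; plain Python stands in for the jax-based implementation.

```python
def simulate_lif(input_current, dt=1e-3, tau_mem=1e-2, v_th=1.0):
    """Advance one LIF neuron with forward-Euler steps; return spike times."""
    v, spikes = 0.0, []
    for step, i_in in enumerate(input_current):
        # dv/dt = (-v + i_in) / tau_mem, discretised with step size dt
        v += dt * (-v + i_in) / tau_mem
        if v >= v_th:              # threshold crossing detected on the grid
            spikes.append(step * dt)
            v = 0.0                # reset after the spike
    return spikes

spikes = simulate_lif([1.5] * 100)  # constant suprathreshold drive spikes
```

Note that spike times are only resolved to the grid spacing `dt`, which is exactly the limitation the event-based part of jaxsnn avoids.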

Time Continuous

jaxsnn.event treats time continuously and allows jumping from one event to the next. Its core functionality consists of the step function, which does three things:

  1. Find the next threshold crossing
  2. Integrate the neuron to this point in time
  3. Apply the discontinuity after the threshold crossing

jaxsnn.event.modules.leaky_integrate_and_fire provides multiple neuron types which can be used to build larger networks. Each neuron type defines the three functions mentioned above.
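
The three steps above can be sketched for a LIF neuron with constant input current, where the next threshold crossing has a closed-form solution. This is a hedged, simplified illustration of the event-driven idea; the function names and signatures are ours, not those of jaxsnn.event.

```python
import math

def next_crossing(v0, i, tau, v_th):
    """Step 1: solve v(t) = v_th analytically (None if the threshold is never reached)."""
    # Membrane trajectory for constant input: v(t) = i + (v0 - i) * exp(-t / tau)
    if i <= v_th:
        return None
    return -tau * math.log((v_th - i) / (v0 - i))

def integrate(v0, i, tau, t):
    """Step 2: closed-form membrane state at time t, with no Euler stepping."""
    return i + (v0 - i) * math.exp(-t / tau)

def step(v0, i, tau=1e-2, v_th=1.0):
    t = next_crossing(v0, i, tau, v_th)   # step 1: locate the next event
    if t is None:
        return None, v0                   # no spike will ever occur
    v = integrate(v0, i, tau, t)          # step 2: v equals v_th here
    return t, 0.0                         # step 3: apply the reset discontinuity

t_spike, v_after = step(0.0, 1.5)
```

Because the simulation jumps directly to the crossing, spike times are exact (up to floating point) rather than quantised to a step size.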

BSS-2 Connection

jaxsnn.event.hardware provides functionality to connect to the BSS-2 system and to conduct learning experiments on dedicated neuromorphic hardware.

First Steps

We provide multiple examples for usage of jaxsnn.

Time discrete learning using surrogate gradients on the Yin-Yang dataset:

```bash
python -m jaxsnn.examples.discrete.yinyang
```

Event-based two layer feed-forward network with analytical gradients:

```bash
python -m jaxsnn.examples.event.yinyang_analytical
```

Event-based two-layer feed-forward network with gradients computed using the EventProp algorithm:

```bash
python -m jaxsnn.examples.event.yinyang_layered_event_prop
```

Event-based recurrent network with gradients computed using the EventProp algorithm:

```bash
python -m jaxsnn.examples.event.yinyang_recurrent_event_prop
```

BSS-2

If you want to work with the BSS-2 system, a working example is provided:

```bash
python -m jaxsnn.examples.event.yinyang_bss2
```

The operation point calibration script is src/pyjaxsnn/jaxsnn/event/hardware/calib/neuron_calib.py. Example:

```bash
srun -p cube --wafer 69 --fpga-without-aout 0 --pty c python ./neuron_calib.py \
    --wafer W69F0 \
    --threshold 150 \
    --tau-syn 6e-6 \
    --tau-mem 12e-6 \
    --refractory-time 30e-6 \
    --synapse-dac-bias 1000 \
    --calib-dir src/pyjaxsnn/jaxsnn/event/hardware/calib
```

If you want to study the behaviour that different hardware artifacts (noise on the spike times) have on the performance of SNNs, check out this example:

```bash
python -m jaxsnn.examples.event.hardware.yinyang_mock
```

You can switch between actual execution on BSS-2 and a pure software mock mode, in which the hardware is emulated by a second software network. In mock mode you can add noise to the spike times produced by this network or limit their dynamic range, mimicking the behaviour of BSS-2.
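
A minimal sketch of such a mock, assuming (our assumption, not the documented jaxsnn behaviour) that the hardware artifacts are modelled as Gaussian jitter on spike times plus clipping to a limited dynamic range:

```python
import random

def mock_hardware(spike_times, jitter_std=1e-6, t_max=1e-4, seed=0):
    """Perturb ideal spike times the way noisy analogue hardware might."""
    rng = random.Random(seed)
    perturbed = []
    for t in spike_times:
        t = t + rng.gauss(0.0, jitter_std)   # temporal noise on each spike
        t = min(max(t, 0.0), t_max)          # limited dynamic range of the substrate
        perturbed.append(t)
    return sorted(perturbed)                 # jitter may reorder nearby spikes

observed = mock_hardware([1e-5, 5e-5, 2e-4])
```

Training against such a mock lets one study how robust a given SNN architecture is to these artifacts before moving to the real substrate.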

TODO

  • The mapping between the hardware neuron module HardwareRecurrentLIF (which can simulate multiple feed-forward layers) and the populations/projections is not yet implemented cleanly; it is currently hacked into the tasks: the experiment returns a list of spikes for two layers, which are merged together, and the projections are hardcoded.

Acknowledgements

The software in this repository has been developed by staff and students of Heidelberg University as part of the research carried out by the Electronic Vision(s) group at the Kirchhoff-Institute for Physics.

This work has received funding from the EC Horizon 2020 Framework Programme under grant agreements 785907 (HBP SGA2) and 945539 (HBP SGA3), the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1-390900948 (the Heidelberg STRUCTURES Excellence Cluster), the German Federal Ministry of Education and Research under grant number 16ES1127 as part of the Pilotinnovationswettbewerb Energieeffizientes KI-System, the Helmholtz Association Initiative and Networking Fund [Advanced Computing Architectures (ACA)] under Project SO-092, as well as from the Manfred Stärk Foundation, and the Lautenschläger-Forschungspreis 2018 for Karlheinz Meier.

Licensing

SPDX-License-Identifier: LGPL-2.1-or-later

Owner

  • Name: Electronic Vision(s) Group — BrainScaleS Neuromorphic Hardware
  • Login: electronicvisions
  • Kind: organization
  • Location: Heidelberg, Germany

Kirchhoff-Institute for Physics, Ruprecht-Karls-Universität Heidelberg

Citation (CITATION.cff)

cff-version: 1.2.0
title: jaxsnn
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - name: Electronic Vision(s)
    address: >-
      European Institute for Neuromorphic Computing (KIP),
      INF 225a
    city: Heidelberg
    post-code: '69120'
  - family-names: Althaus
    given-names: Moritz
    affiliation: "Kirchhoff-Institut für Physik, Ruprecht-Karls-Universität Heidelberg"
  - family-names: Pehle
    given-names: Christian
    affiliation: "Cold Spring Harbor Laboratory"
  - family-names: Müller
    given-names: Eric
    affiliation: "Kirchhoff-Institut für Physik, Ruprecht-Karls-Universität Heidelberg"
contact:
  - affiliation: "Cold Spring Harbor Laboratory"
    family-names: Pehle
    given-names: Christian
  - affiliation: "Kirchhoff-Institut für Physik, Ruprecht-Karls-Universität Heidelberg"
    family-names: Müller
    given-names: Eric
identifiers:
  - type: doi
    value: 10.5281/zenodo.10254666
    description: v0.1.0
repository-code: 'https://github.com/electronicvisions/jaxsnn'
abstract: >-
  jaxsnn is an event-based approach to
  machine-learning-inspired training and simulation of SNNs,
  including support for neuromorphic backends
  (BrainScaleS-2). We build upon jax, a Python library
  providing autograd and XLA functionality for
  high-performance machine learning research.
keywords:
  - spiking neural networks
  - machine learning
  - neuromorphic hardware
license: LGPL-2.1-or-later
version: 0.1.0
date-released: '2023-12-04'

GitHub Events

Total
  • Watch event: 9
  • Push event: 30
  • Fork event: 1
  • Create event: 4
Last Year
  • Watch event: 9
  • Push event: 30
  • Fork event: 1
  • Create event: 4

Issues and Pull Requests

Last synced: 9 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels

Dependencies

pyproject.toml pypi
  • jax ==0.4.13; python_version=='3.8'
  • jax >=0.4.13; python_version>'3.8'
  • jaxlib ==0.4.13; python_version=='3.8'
  • jaxlib >=0.4.13; python_version>'3.8'
  • matplotlib *
  • optax >=0.1.4
  • tree-math >0.1.0