https://github.com/danielathome19/engram-neural-network

tensorflow-engram: A Python package for Engram Neural Networks, adding biologically-inspired Hebbian memory and engram layers to TensorFlow/Keras models, supporting memory traces, plasticity, attention, and sparsity for neural sequence learning.


Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 5 DOI reference(s) in README
  • Academic publication links
    Links to: zenodo.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.8%) to scientific vocabulary

Keywords

artificial-intelligence biological-neural-networks computational-neuroscience deep-learning engram engram-neural-networks engrams enn hebbian-learning keras machine-learning memory memory-augmented-networks neuroscience python python-package sparsity tensorflow
Last synced: 5 months ago

Repository


Basic Info
Statistics
  • Stars: 1
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created 9 months ago · Last pushed 7 months ago
Metadata Files
  • Readme
  • License

README.md

tensorflow-engram


Engram Neural Networks (ENNs): Hebbian Memory-Augmented Recurrent Networks

Biologically-inspired memory for TensorFlow/Keras.

Add Hebbian/engram learning to your neural networks with just a few lines of code.


Overview

tensorflow-engram provides Keras layers, models, and utilities for building neural networks with biologically-inspired memory mechanisms, including Hebbian plasticity, engram-like trace formation, attention, and sparse memory recall. This enables powerful sequence modeling, few-shot learning, continual learning, and analysis of memory traces within modern deep learning pipelines.

  • Seamless TensorFlow/Keras integration
  • Engram layers: RNN cells and wrappers with memory banks, plastic synapses, and sparsity
  • Hebbian learning: Fast local updates + gradient learning
  • Attention and sparsity: Focuses on the most relevant memories
  • Trace monitoring: Visualize engram and memory trace evolution
  • Ready-to-use models for classification and regression

tensorflow-engram is currently in development and may not yet be ready for production use. We are actively seeking contributors to help improve the package and expand its capabilities. If you are interested in contributing, please see our contributing guide.

TODO:

  • Add unit tests
  • Generate doc pages with Sphinx
  • Possibly rename repo?

Installation

```bash
pip install tensorflow-engram
```

Or install using conda:

```bash
conda install -c danielathome19 tensorflow-engram
```


Requirements:

  • Python 3.12+
  • TensorFlow 2.19+
  • Keras 3.10+
  • numpy, seaborn, matplotlib, pandas (for utilities and plotting)
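
A quick way to confirm that an environment meets these version floors, using only standard TensorFlow/Keras attributes (nothing package-specific):

```python
import sys
import tensorflow as tf
import keras

# Print the versions relevant to the requirements above
print("Python:", sys.version.split()[0])
print("TensorFlow:", tf.__version__)
print("Keras:", keras.__version__)
```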

Quickstart

Example: MNIST Classification with Engram Memory

```python
import numpy as np
from tensorflow.keras.datasets import mnist
from tensorflow.keras.utils import to_categorical
from sklearn.model_selection import train_test_split
from tensorflow_engram.models import EngramClassifier
from tensorflow_engram.utils import HebbianTraceMonitor, plot_hebbian_trace

# Prepare data
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.astype('float32') / 255.0
x_test = x_test.astype('float32') / 255.0
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)
x_train = x_train.reshape(-1, 28, 28)
x_test = x_test.reshape(-1, 28, 28)
x_train, x_val, y_train, y_val = train_test_split(x_train, y_train, test_size=0.1)

# Build model
model = EngramClassifier(
    input_shape=(28, 28),
    num_classes=10,
    hidden_dim=128,
    memory_size=64,
    return_states=True,
    hebbian_lr=0.05,
)

# Monitor Hebbian trace during training
trace_callback = HebbianTraceMonitor(x_train[:32], log_dir=None)

model.compile(
    optimizer='adam',
    loss='categorical_crossentropy',
    metrics=['accuracy'],
)
model.fit(
    x_train, y_train,
    batch_size=128,
    epochs=10,
    validation_data=(x_val, y_val),
    callbacks=[trace_callback],
)

# Visualize trace evolution
plot_hebbian_trace(trace_callback)
```
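
After training, the classifier can be scored like any other Keras model. For example, using the `x_test`/`y_test` arrays prepared above (assuming the model's `evaluate` output follows the usual Keras loss-then-metrics convention):

```python
# Evaluate the trained classifier on the held-out MNIST test set
test_loss, test_acc = model.evaluate(x_test, y_test, batch_size=128)
print(f"Test accuracy: {test_acc:.4f}")
```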


Features

  • EngramCell: Biologically-inspired RNN cell with memory banks and Hebbian plasticity.
  • EngramNetwork: High-level Keras Model for sequence modeling.
  • Attention Layer: Optional attention mechanism for sequence summarization.
  • Trace Monitoring: Inspect and visualize memory trace evolution with built-in callbacks and plotting utilities.

API Highlights

Layers

  • EngramCell: Biologically-inspired RNN cell with memory banks, Hebbian trace, and sparsity regularization.

  • Engram: Wrapper for Keras models/networks using EngramCell.

  • EngramAttentionLayer: Optional attention over sequence outputs.

Models

  • EngramNetwork: General-purpose sequence model with configurable memory and plasticity.

  • EngramClassifier: Factory function for classification tasks.

  • EngramRegressor: Factory for regression tasks.

Utilities

  • HebbianTraceMonitor: Keras callback for logging and visualizing Hebbian traces.

  • plot_hebbian_trace: Quick plotting of trace evolution and statistics.


How It Works

  • Memory Bank: Persistent, learnable memory vectors (engrams), updated via gradient descent.

  • Hebbian Trace: Rapidly updated, plastic component reflecting short-term memory, updated via local Hebbian learning.

  • Attention/Recall & Sparsity: Memories are retrieved by attention (cosine similarity + softmax), with sparsity constraints so that only a few memories are activated per input, mimicking efficient biological memory recall (see the sketch after this list).

  • Trace Visualization: Built-in tools to monitor and understand the dynamics of memory during training.
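
To make these mechanics concrete, here is a minimal NumPy sketch of one recall-and-update step as described above. This is an illustration of the general idea, not the package's actual implementation; the names (`memory`, `trace`, `hebbian_lr`, `decay`, `top_k`) and the exact update rule are assumptions for the example.

```python
import numpy as np

def sparse_recall_and_update(h, memory, trace, hebbian_lr=0.05, decay=0.9, top_k=4):
    """One step of sparse engram recall plus a local Hebbian trace update.

    h:      (hidden_dim,)              current hidden state, used as the query
    memory: (memory_size, hidden_dim)  persistent engram bank (trained by SGD)
    trace:  (memory_size, hidden_dim)  fast Hebbian component (updated locally)
    """
    # Effective memory = slow, gradient-trained bank + fast Hebbian trace
    effective = memory + trace

    # Attention scores via cosine similarity between query and each memory slot
    sims = effective @ h
    sims = sims / (np.linalg.norm(effective, axis=1) * np.linalg.norm(h) + 1e-8)

    # Sparsity: keep only the top-k most similar slots, mask out the rest
    scores = np.full_like(sims, -np.inf)
    top = np.argsort(sims)[-top_k:]
    scores[top] = sims[top]

    # Softmax over the surviving slots gives sparse attention weights
    weights = np.exp(scores - sims[top].max())
    weights /= weights.sum()

    # Recall: attention-weighted sum of memory contents
    recalled = weights @ effective

    # Hebbian update: decay the old trace, strengthen slots that fired
    # together with the current state ("fire together, wire together")
    trace = decay * trace + hebbian_lr * np.outer(weights, h)
    return recalled, trace

# Toy usage: 8 memory slots, hidden dimension 16
rng = np.random.default_rng(0)
memory = rng.normal(size=(8, 16))
trace = np.zeros((8, 16))
recalled, trace = sparse_recall_and_update(rng.normal(size=16), memory, trace)
```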


Advanced Usage

You can customize the cell and models for your own tasks:

```python
from tensorflow_engram.layers import EngramCell
from tensorflow.keras.layers import RNN, Input
from tensorflow.keras.models import Model

cell = EngramCell(hidden_dim=64, memory_size=32)
inputs = Input(shape=(None, 16))
rnn_layer = RNN(cell, return_sequences=True)
outputs = rnn_layer(inputs)
model = Model(inputs, outputs)
```
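
Since the wrapped RNN returns a sequence of per-timestep outputs, a task head can be attached on top as usual. A minimal sketch continuing the snippet above (the pooling and Dense head are illustrative choices, not part of the package, and this assumes the cell's per-timestep output is its hidden state):

```python
from tensorflow.keras.layers import GlobalAveragePooling1D, Dense

# Pool the per-timestep outputs and attach a classification head
pooled = GlobalAveragePooling1D()(outputs)
preds = Dense(10, activation='softmax')(pooled)
clf = Model(inputs, preds)
clf.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
```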


License

tensorflow-engram is licensed under the BSD 3-Clause License. See the LICENSE file for more information.


Citation

If you use this code for your research, please cite this project as:

```bibtex
@software{Szelogowski_tensorflow_engram_2025,
  author  = {Szelogowski, Daniel},
  doi     = {10.48550/arXiv.2507.21474},
  license = {BSD-3-Clause},
  month   = {jul},
  title   = {{tensorflow-engram: A Python package for Engram Neural Networks, adding biologically-inspired Hebbian memory and engram layers to TensorFlow/Keras models, supporting memory traces, plasticity, attention, and sparsity for neural sequence learning}},
  url     = {https://github.com/danielathome19/Engram-Neural-Network},
  version = {0.1.0},
  year    = {2025}
}
```

or as the corresponding research paper:

```bibtex
@misc{Szelogowski_Simulation_of_Neural_Responses_Using_OI_2024,
  author = {Szelogowski, Daniel},
  doi    = {10.48550/arXiv.2507.21474},
  month  = {jul},
  title  = {{Hebbian Memory-Augmented Recurrent Networks: Engram Neurons in Deep Learning}},
  url    = {https://github.com/danielathome19/Engram-Neural-Network},
  year   = {2025}
}
```

Owner

  • Name: Daniel J. Szelogowski
  • Login: danielathome19
  • Kind: user
  • Location: Wisconsin
  • Company: @MECS-Research-Lab

Standing on the shoulders of giants.

GitHub Events

Total
  • Watch event: 1
  • Push event: 9
  • Public event: 1
Last Year
  • Watch event: 1
  • Push event: 9
  • Public event: 1

Packages

  • Total packages: 1
  • Total downloads: unknown
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 1
  • Total maintainers: 1
pypi.org: tensorflow-engram


  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
  • Dependent packages count: 8.7%
  • Dependent repos count: 49.2%
  • Average: 28.9%
Maintainers (1)
Last synced: 6 months ago

Dependencies

requirements.txt pypi
  • Markdown ==3.8
  • MarkupSafe ==3.0.2
  • PyYAML ==6.0.2
  • Pygments ==2.19.1
  • Werkzeug ==3.1.3
  • absl-py ==2.2.2
  • aiohappyeyeballs ==2.6.1
  • aiohttp ==3.11.18
  • aiosignal ==1.3.2
  • asttokens ==3.0.0
  • astunparse ==1.6.3
  • attrs ==25.3.0
  • certifi ==2025.4.26
  • charset-normalizer ==3.4.2
  • comm ==0.2.2
  • contourpy ==1.3.2
  • cycler ==0.12.1
  • datasets ==3.6.0
  • debugpy ==1.8.14
  • decorator ==5.2.1
  • dill ==0.3.8
  • executing ==2.2.0
  • filelock ==3.18.0
  • flatbuffers ==25.2.10
  • fonttools ==4.58.0
  • frozenlist ==1.6.0
  • fsspec ==2025.3.0
  • gast ==0.6.0
  • google-pasta ==0.2.0
  • graphviz ==0.20.3
  • grpcio ==1.71.0
  • h5py ==3.13.0
  • huggingface-hub ==0.31.4
  • idna ==3.10
  • ipykernel ==6.29.5
  • ipython ==9.2.0
  • ipython_pygments_lexers ==1.1.1
  • jedi ==0.19.2
  • joblib ==1.5.0
  • jupyter_client ==8.6.3
  • jupyter_core ==5.7.2
  • keras ==3.10.0
  • kiwisolver ==1.4.8
  • libclang ==18.1.1
  • markdown-it-py ==3.0.0
  • matplotlib ==3.8.4
  • matplotlib-inline ==0.1.7
  • mdurl ==0.1.2
  • ml_dtypes ==0.5.1
  • multidict ==6.4.4
  • multiprocess ==0.70.16
  • namex ==0.0.9
  • narwhals ==1.40.0
  • nest-asyncio ==1.6.0
  • numpy ==2.1.3
  • opt_einsum ==3.4.0
  • optree ==0.15.0
  • packaging ==25.0
  • pandas ==2.2.2
  • parso ==0.8.4
  • patsy ==1.0.1
  • pexpect ==4.9.0
  • pillow ==11.2.1
  • platformdirs ==4.3.8
  • plotly ==6.0.1
  • prompt_toolkit ==3.0.51
  • propcache ==0.3.1
  • protobuf ==5.29.4
  • psutil ==7.0.0
  • ptyprocess ==0.7.0
  • pure_eval ==0.2.3
  • pyarrow ==20.0.0
  • pydot ==4.0.0
  • pyparsing ==3.2.3
  • python-dateutil ==2.9.0.post0
  • pytz ==2025.2
  • pyzmq ==26.4.0
  • requests ==2.32.3
  • rich ==14.0.0
  • scikit-learn ==1.6.1
  • scipy ==1.15.2
  • seaborn ==0.13.2
  • setuptools ==80.8.0
  • six ==1.17.0
  • stack-data ==0.6.3
  • statsmodels ==0.14.4
  • tensorboard ==2.19.0
  • tensorboard-data-server ==0.7.2
  • tensorflow ==2.19.0
  • termcolor ==3.1.0
  • threadpoolctl ==3.6.0
  • tornado ==6.5
  • tqdm ==4.67.1
  • traitlets ==5.14.3
  • typing_extensions ==4.13.2
  • tzdata ==2025.2
  • urllib3 ==2.4.0
  • wcwidth ==0.2.13
  • wheel ==0.45.1
  • wrapt ==1.17.2
  • xxhash ==3.5.0
  • yarl ==1.20.0
setup.py pypi