https://github.com/computationalpsychiatry/pyhgf

PyHGF: A neural network library for predictive coding


Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 12 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (18.1%) to scientific vocabulary

Keywords

active-inference bayesian-filter bayesian-inference belief-propagation computational-psychiatry graph-neural-networks hierarchical-gaussian-filter jax neural-networks predictive-coding probabilistic-graphical-models reinforcement-learning state-space-model
Last synced: 5 months ago

Repository

PyHGF: A neural network library for predictive coding

Basic Info
Statistics
  • Stars: 93
  • Watchers: 7
  • Forks: 23
  • Open Issues: 21
  • Releases: 34
Topics
active-inference bayesian-filter bayesian-inference belief-propagation computational-psychiatry graph-neural-networks hierarchical-gaussian-filter jax neural-networks predictive-coding probabilistic-graphical-models reinforcement-learning state-space-model
Created over 5 years ago · Last pushed 6 months ago
Metadata Files
Readme License

README.md


PyHGF: A Neural Network Library for Predictive Coding


PyHGF is a Python library for creating and manipulating dynamic probabilistic networks for predictive coding. These networks approximate Bayesian inference by optimizing beliefs through the diffusion of predictions and precision-weighted prediction errors. The network structure remains flexible during message-passing steps, allowing for dynamic adjustments. Such networks can serve as biologically plausible cognitive models in computational neuroscience or as a generalization of Bayesian filtering for designing efficient, modular decision-making agents. The default implementation supports the generalized Hierarchical Gaussian Filter (gHGF; Weber et al., 2024), but the framework is designed to be adaptable to other algorithms. Built on top of JAX, the core functions are differentiable and JIT-compiled where applicable. The library is optimized for modularity and ease of use, allowing seamless integration with other libraries in the ecosystem for Bayesian inference and optimization. Additionally, a binding with a Rust implementation is under active development, which will further enhance flexibility during inference. You can find the method paper describing the toolbox here and the method paper describing the gHGF, the main framework currently supported by the toolbox, here.

Getting started

Installation

The latest official release can be installed from PyPI:

pip install pyhgf

The current development version can be installed from the master branch of the GitHub repository:

pip install "git+https://github.com/ComputationalPsychiatry/pyhgf.git"

How does it work?

Dynamic networks can be defined as a tuple containing the following variables:

  • The attributes (dictionary) that store each node's states and parameters (e.g. value, precision, learning rates, volatility coupling, ...).
  • The edges (tuple) that lists, for each node, the indexes of the parents and children.
  • A set of update functions. An update function receives a network tuple and returns an updated network tuple.
  • An update sequence (tuple) that defines the order and target of the update functions.
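The tuple structure above can be sketched with a toy two-node network. This is an illustrative assumption about the shape of the data structures, not PyHGF's actual internal types or API:

```python
# Hypothetical sketch of the network-as-tuple structure (illustrative only).

# Attributes: per-node states and parameters
attributes = {
    0: {"mean": 1.0, "precision": 1.0},  # child node holding a new observation
    1: {"mean": 0.0, "precision": 1.0},  # its value parent
}

# Edges: for each node, the indexes of its (parents, children)
edges = (
    ((1,), ()),  # node 0: parent is node 1, no children
    ((), (0,)),  # node 1: no parents, child is node 0
)

def prediction_error_update(attributes, edges, node_idx):
    """Toy update function: send a precision-weighted prediction error
    from a node to each of its parents."""
    parents, _ = edges[node_idx]
    for parent_idx in parents:
        delta = attributes[node_idx]["mean"] - attributes[parent_idx]["mean"]
        weight = attributes[node_idx]["precision"] / (
            attributes[node_idx]["precision"]
            + attributes[parent_idx]["precision"]
        )
        attributes[parent_idx]["mean"] += weight * delta
    return attributes

# Update sequence: (node index, update function) pairs applied in order
update_sequence = ((0, prediction_error_update),)

for node_idx, update_fn in update_sequence:
    attributes = update_fn(attributes, edges, node_idx)

# With equal precisions, the parent's mean moves halfway toward the child's.
```

Because the network is just data plus pure functions over that data, new node kinds or update rules only require new entries in these containers.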


You can find a deeper introduction to how to create and manipulate networks under the following link:

The Generalized Hierarchical Gaussian Filter

Generalized Hierarchical Gaussian Filters (gHGF) are specific instances of dynamic networks in which each node encodes a Gaussian distribution that can inherit its value (mean) and volatility (variance) from other nodes. The presentation of a new observation at the lowest level of the hierarchy (i.e., the input node) triggers a recursive update of the nodes' beliefs (i.e., posterior distributions) through top-down predictions and bottom-up precision-weighted prediction errors. The resulting probabilistic network operates as a Bayesian filter, and a response function can parametrize actions/decisions given the current beliefs. By comparing those behaviours with actual outcomes, a surprise function can be optimized over a set of free parameters. The Hierarchical Gaussian Filter for binary and continuous inputs was first described in Mathys et al. (2011, 2014) and later implemented in the Matlab HGF Toolbox (part of TAPAS; Frässle et al., 2021).
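At the level of a single node, this belief update has a simple closed form: the posterior precision is the sum of the prior and observation precisions, and the posterior mean shifts toward the observation by a precision-weighted prediction error. A standalone sketch of that one-node case (illustrative only, not PyHGF's API):

```python
def gaussian_node_update(mu_prior, pi_prior, u, pi_obs):
    """One precision-weighted belief update for a Gaussian node.

    Precisions add, and the mean moves toward the observation in
    proportion to how precise the observation is relative to the
    updated belief.
    """
    pi_post = pi_prior + pi_obs
    delta = u - mu_prior  # prediction error
    mu_post = mu_prior + (pi_obs / pi_post) * delta
    return mu_post, pi_post

# Filtering a short sequence of observations
mu, pi = 0.0, 1.0
for u in [1.0, 1.0, 0.5]:
    mu, pi = gaussian_node_update(mu, pi, u, pi_obs=1.0)
```

Each observation increases the belief's precision, so later observations shift the mean less; the hierarchy extends this by letting parent nodes modulate the effective precisions.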

You can find a deeper introduction on how the gHGF works under the following link:

Model fitting

Here we demonstrate how to run a two-level binary Hierarchical Gaussian Filter forward. The input time series are binary observations from an associative learning task (Iglesias et al., 2013).

```python
from pyhgf.model import Network
from pyhgf import load_data

# Load time series example data (observations, decisions)
u, y = load_data("binary")

# Create a two-level binary HGF from scratch
hgf = (
    Network()
    .add_nodes(kind="binary-state")
    .add_nodes(kind="continuous-state", value_children=0)
)

# Add new observations
hgf.input_data(input_data=u)

# Visualization of the belief trajectories
hgf.plot_trajectories();
```


```python
from pyhgf.response import binary_softmax_inverse_temperature

# Compute the model's surprise (-log(p)) using the binary softmax
# with inverse temperature as the response model
surprise = hgf.surprise(
    response_function=binary_softmax_inverse_temperature,
    response_function_inputs=y,
    response_function_parameters=4.0,
)
print(f"Sum of surprises = {surprise.sum()}")
```

Sum of surprises = 138.8992462158203
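The surprise is an ordinary scalar function of the response model's free parameters, so fitting amounts to minimizing it. A standalone toy sketch of this idea (plain NumPy with made-up beliefs and decisions; the data, function names, and grid search are illustrative assumptions, not PyHGF's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data standing in for a fitted model's beliefs and the
# participant's binary decisions
beliefs = rng.uniform(0.05, 0.95, size=200)  # p(y=1) at each trial
y = (rng.uniform(size=200) < beliefs).astype(float)

def surprise(beliefs, y, inverse_temperature):
    """Sum of -log p(y) under a binary softmax response model."""
    logits = inverse_temperature * (np.log(beliefs) - np.log(1.0 - beliefs))
    p = 1.0 / (1.0 + np.exp(-logits))
    return -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

# Grid search over the free parameter; in practice a gradient-based
# optimizer (e.g. via JAX or PyMC) would be used instead
grid = np.linspace(0.1, 10.0, 100)
best = min(grid, key=lambda b: surprise(beliefs, y, b))
```

Because PyHGF's core functions are differentiable under JAX, the same objective can also be handed to gradient-based samplers or optimizers rather than a grid search.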

Acknowledgments

This implementation of the Hierarchical Gaussian Filter was inspired by the original Matlab HGF Toolbox. A Julia implementation is also available here.

References

  1. Legrand, N., Weber, L., Waade, P. T., Daugaard, A. H. M., Khodadadi, M., Mikuš, N., & Mathys, C. (2024). pyhgf: A neural network library for predictive coding (Version 1). arXiv. https://doi.org/10.48550/ARXIV.2410.09206
  2. Mathys, C. (2011). A Bayesian foundation for individual learning under uncertainty. In Frontiers in Human Neuroscience (Vol. 5). Frontiers Media SA. https://doi.org/10.3389/fnhum.2011.00039
  3. Mathys, C. D., Lomakina, E. I., Daunizeau, J., Iglesias, S., Brodersen, K. H., Friston, K. J., & Stephan, K. E. (2014). Uncertainty in perception and the hierarchical Gaussian filter. Frontiers in Human Neuroscience, 8. https://doi.org/10.3389/fnhum.2014.00825
  4. Weber, L. A., Waade, P. T., Legrand, N., Møller, A. H., Stephan, K. E., & Mathys, C. (2023). The generalized Hierarchical Gaussian Filter (Version 2). arXiv. https://doi.org/10.48550/ARXIV.2305.10937
  5. Frässle, S., Aponte, E. A., Bollmann, S., Brodersen, K. H., Do, C. T., Harrison, O. K., Harrison, S. J., Heinzle, J., Iglesias, S., Kasper, L., Lomakina, E. I., Mathys, C., Müller-Schrader, M., Pereira, I., Petzschner, F. H., Raman, S., Schöbi, D., Toussaint, B., Weber, L. A., … Stephan, K. E. (2021). TAPAS: An Open-Source Software Package for Translational Neuromodeling and Computational Psychiatry. In Frontiers in Psychiatry (Vol. 12). Frontiers Media SA. https://doi.org/10.3389/fpsyt.2021.680811
  6. Iglesias, S., Kasper, L., Harrison, S. J., Manka, R., Mathys, C., & Stephan, K. E. (2021). Cholinergic and dopaminergic effects on prediction error and uncertainty responses during sensory associative learning. In NeuroImage (Vol. 226, p. 117590). Elsevier BV. https://doi.org/10.1016/j.neuroimage.2020.117590

Owner

  • Name: TAPAS
  • Login: ComputationalPsychiatry
  • Kind: organization

GitHub Events

Total
  • Create event: 49
  • Release event: 12
  • Issues event: 10
  • Watch event: 30
  • Delete event: 41
  • Member event: 1
  • Issue comment event: 38
  • Push event: 249
  • Pull request review event: 5
  • Pull request review comment event: 9
  • Pull request event: 85
  • Fork event: 2
Last Year
  • Create event: 49
  • Release event: 12
  • Issues event: 10
  • Watch event: 30
  • Delete event: 41
  • Member event: 1
  • Issue comment event: 38
  • Push event: 249
  • Pull request review event: 5
  • Pull request review comment event: 9
  • Pull request event: 85
  • Fork event: 2

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 8
  • Total pull requests: 33
  • Average time to close issues: 6 months
  • Average time to close pull requests: 11 days
  • Total issue authors: 4
  • Total pull request authors: 3
  • Average comments per issue: 1.0
  • Average comments per pull request: 0.67
  • Merged pull requests: 24
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 6
  • Pull requests: 33
  • Average time to close issues: 3 months
  • Average time to close pull requests: 11 days
  • Issue authors: 4
  • Pull request authors: 3
  • Average comments per issue: 1.17
  • Average comments per pull request: 0.67
  • Merged pull requests: 24
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • LegrandNico (4)
  • cgoemaere (1)
  • SylvainEstebe (1)
Pull Request Authors
  • LegrandNico (39)
  • LouieMH (4)
  • SylvainEstebe (4)
Top Labels
Issue Labels
networks (2) sampling (1) enhancement (1) nodes (1) good first issue (1) plotting (1)
Pull Request Labels
update functions (5) networks (5) documentation (5) rust (4) plotting (2) nodes (2) enhancement (2) bug (1) models (1)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 296 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 34
  • Total maintainers: 1
pypi.org: pyhgf

Dynamic neural networks for predictive coding

  • Versions: 34
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 296 Last month
Rankings
Dependent packages count: 6.6%
Downloads: 15.6%
Average: 17.6%
Dependent repos count: 30.6%
Maintainers (1)
Last synced: 6 months ago

Dependencies

.github/workflows/docs.yml actions
  • JamesIves/github-pages-deploy-action v4 composite
  • actions/checkout v3 composite
  • actions/setup-python v1 composite
.github/workflows/linting.yml actions
  • actions/cache v2 composite
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/test.yml actions
  • actions/cache v2 composite
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
  • codecov/codecov-action v3 composite
requirements-docs.txt pypi
  • bokeh ==2.3.2
  • graphviz *
  • jupyter_sphinx >=0.4.0
  • myst-nb >=0.16.0
  • myst-parser *
  • numpydoc >=1.4.0
  • pydata-sphinx-theme >=0.12.0
  • sphinx >=5.3.0
  • sphinx-bootstrap-theme >=0.8.1
  • sphinx-proof *
  • sphinxcontrib-bibtex >=2.4.2
  • systole *
  • watermark *
  • xarray >=2022.6.0
requirements-tests.txt pypi
  • seaborn >=0.11.2 test
requirements.txt pypi
  • arviz >=0.12.0
  • jax <=0.4.1
  • jaxlib <=0.4.1
  • matplotlib >=3.0.2
  • numba >=0.56.4
  • numpy >=1.18,<=1.23
  • packaging *
  • pymc >=5.0.0
  • seaborn >=0.9.0
  • setuptools >=38.4
.github/workflows/pypi.yml actions
  • actions/checkout master composite
  • actions/setup-python v3 composite
  • pypa/gh-action-pypi-publish release/v1 composite
setup.py pypi
environment.yml pypi
  • arviz >=0.12.0
  • graphviz *
  • jax >=0.4.1
  • jaxlib >=0.4.1
  • matplotlib >=3.0.2
  • numpy >=1.18,<=1.23
  • packaging *
  • seaborn >=0.9.0
  • setuptools >=38.4