chaogatenn
Code for gradient based optimization of chaogates paper
https://github.com/nonlinearartificialintelligencelab/chaogatenn
Science Score: 62.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file (found)
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ✓ Academic publication links (links to: sciencedirect.com)
- ○ Academic email domains
- ✓ Institutional organization owner (organization nonlinearartificialintelligencelab has institutional domain nail.sciences.ncsu.edu)
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 12.1%, to scientific vocabulary)
Basic Info
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
chaogatenn
Code for gradient based optimization of chaogates paper
This code base is uv-compatible and pip-installable.
Authors
Anil Radhakrishnan, Sudeshna Sinha, K. Murali, William L. Ditto
Link to paper
Key Results
- A gradient-based optimization framework for tuning chaotic systems to match predefined logic gate behavior.
- Extension of the framework to show simultaneous optimization of multiple logic gates for logic circuits like the half-adder.
- A demonstration and comparison of chaogate reconfigurability across nonlinear map configurations, showing that the same nonlinear system can perform multiple gate operations through parameter tuning alone.
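To make the first key result concrete, here is a minimal pure-Python sketch of the idea: encode the two logic inputs as a shift of a chaotic map's initial condition, smooth the output threshold with a sigmoid so the whole pipeline is differentiable, and tune the parameters by gradient descent until the gate matches a target truth table. The logistic map, the input encoding, the parameter names, and the finite-difference gradient are all illustrative assumptions here; the repository itself uses JAX with exact automatic differentiation and Optax optimizers.

```python
import math

def chaogate_output(params, i1, i2):
    """Soft chaogate: shift the initial condition by the logic inputs,
    iterate the logistic map x -> 4x(1-x) once, sigmoid-threshold the result."""
    x0, delta, thresh = params
    u = x0 + delta * (i1 + i2)          # input encoding (assumed form)
    x = 4.0 * u * (1.0 - u)             # fully chaotic logistic map
    return 1.0 / (1.0 + math.exp(-(x - thresh) / 0.05))  # smooth threshold

INPUTS = [(0, 0), (0, 1), (1, 0), (1, 1)]
AND_TARGETS = [0.0, 0.0, 0.0, 1.0]      # truth table for an AND gate

def loss(params, targets):
    """Squared error between soft gate outputs and the truth table."""
    return sum((chaogate_output(params, a, b) - t) ** 2
               for (a, b), t in zip(INPUTS, targets))

def grad(params, targets, h=1e-5):
    """Central finite-difference gradient (stand-in for autodiff)."""
    g = []
    for k in range(len(params)):
        hi = list(params); hi[k] += h
        lo = list(params); lo[k] -= h
        g.append((loss(hi, targets) - loss(lo, targets)) / (2 * h))
    return g

init = [0.05, 0.2, 0.7]                 # initial guess for (x0, delta, threshold)
params = list(init)
for _ in range(500):                    # plain gradient descent
    params = [p - 0.002 * g for p, g in zip(params, grad(params, AND_TARGETS))]

truth = [chaogate_output(params, a, b) > 0.5 for a, b in INPUTS]
```

After training, hard-thresholding the optimized chaogate reproduces the AND truth table. Fitting a different gate is just a matter of swapping the target vector, which is the reconfigurability the key results describe.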
Installation
We recommend using uv to manage Python and install the package.
Then, git clone the repository and run

```bash
uv pip install .
```

to install the package with all dependencies.
Usage
The notebooks in the nbs directory illustrate different extensions and tests of the chaogates framework.
The scripts in the scripts directory mirror the Diff_chao_config notebooks but add argument parsing for easy command-line use in batch processing.
To run them, use the uv run command; the bash scripts in the same directory run them in batch mode.
The statistical run results can be analyzed with the analysis and plotter notebooks in the nbs directory.
Owner
- Name: Non Linear Artificial Intelligence Lab
- Login: NonlinearArtificialIntelligenceLab
- Kind: organization
- Location: United States of America
- Website: nail.sciences.ncsu.edu
- Repositories: 1
- Profile: https://github.com/NonlinearArtificialIntelligenceLab
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
title: Gradient based Optimization of Chaogates
message: Please cite this software using these metadata.
type: software
authors:
  - given-names: Anil
    family-names: Radhakrishnan
    email: aradhak5@ncsu.edu
    affiliation: North Carolina State University
    orcid: 'https://orcid.org/0000-0002-8084-9527'
  - given-names: Sudeshna
    family-names: Sinha
    email: sudeshna@iisermohali.ac.in
    affiliation: >-
      Indian Institute of Science Education and Research
      Mohali
    orcid: 'https://orcid.org/0000-0002-1364-5276'
  - given-names: Krishna
    family-names: Murali
    email: kmurali@annauniv.edu
    affiliation: >-
      Department of Physics, Anna University,
      Chennai 600025, India
    orcid: 'https://orcid.org/0000-0001-8055-1117'
  - given-names: William L.
    family-names: Ditto
    email: wditto@ncsu.edu
    affiliation: North Carolina State University
    orcid: 'https://orcid.org/0000-0002-7416-8012'
repository-code: 'https://github.com/NonlinearArtificialIntelligenceLab/ChaoGateNN'
abstract: >-
  We present a method for configuring chaogates to replicate standard Boolean
  logic gate behavior using gradient-based optimization. By defining a
  differentiable formulation of the chaogate encoding, we optimize its tunable
  parameters to reconfigure the chaogate for standard logic gate functions.
  This novel approach allows us to bring the well-established tools of machine
  learning to optimizing chaogates without the cost of high parameter count
  neural networks. We further extend this approach to the simultaneous
  optimization of multiple gates for tuning logic circuits. Experimental
  results demonstrate the viability of this technique across different
  nonlinear systems and configurations, offering a pathway to automate
  parameter discovery for nonlinear computational devices.
keywords:
  - Machine Learning
  - Nonlinear
  - chaogate
  - optimization
license: MIT
```
GitHub Events
Total
- Public event: 1
- Push event: 2
Last Year
- Public event: 1
- Push event: 2
Dependencies
- beartype >=0.18.5
- diffrax >=0.6.0
- equinox >=0.11.5
- ipython >=8.26.0
- jax [cuda12]>=0.4.31
- jupyter >=1.1.0
- jupyterlab >=4.2.5
- matplotlib >=3.9.2
- optax >=0.2.3
- ruff >=0.6.3
- tqdm >=4.66.5
- (144 dependencies in total)