AdaptiveResonance.jl
AdaptiveResonance.jl: A Julia Implementation of Adaptive Resonance Theory (ART) Algorithms - Published in JOSS (2022)
Science Score: 100.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 12 DOI reference(s) in README and JOSS metadata
- ✓ Academic publication links: links to researchgate.net, joss.theoj.org, zenodo.org
- ✓ Committers with academic emails: 1 of 5 committers (20.0%) from academic institutions
- ○ Institutional organization owner
- ✓ JOSS paper metadata: published in Journal of Open Source Software
Keywords
Keywords from Contributors
Repository
A Julia package for Adaptive Resonance Theory (ART) algorithms.
Basic Info
- Host: GitHub
- Owner: AP6YC
- License: mit
- Language: Julia
- Default Branch: develop
- Homepage: https://AP6YC.github.io/AdaptiveResonance.jl
- Size: 42 MB
Statistics
- Stars: 28
- Watchers: 4
- Forks: 4
- Open Issues: 7
- Releases: 30
Topics
Metadata Files
README.md
A Julia package for Adaptive Resonance Theory (ART) algorithms.
(Badge tables omitted: Documentation, Testing Status, Coverage, Reference; Documentation Build, JuliaHub Status, Dependents, Release.)
Please read the documentation for detailed usage and tutorials.
Contents
Overview
Adaptive Resonance Theory (ART) is a neurocognitive theory of how recurrent cellular networks can learn distributed patterns without supervision. As a theory, it provides coherent and consistent explanations of how real neural networks learn patterns through competition, and it predicts the phenomena of attention and expectation as central to learning. In engineering, the theory has been applied to a myriad of algorithmic models for unsupervised machine learning, though it has been extended to supervised and reinforcement learning frameworks. This package provides implementations of many of these algorithms in Julia for both scientific research and engineering applications. Basic installation is outlined in Installation, while a quickstart is provided in Quickstart. Detailed usage and examples are provided in the documentation.
Usage
Installation
This project is distributed as a Julia package, available on JuliaHub, so you must first install Julia on your system. Installation follows the usual Julia package procedure, interactively:

```julia-repl
julia> ]
(@v1.10) pkg> add AdaptiveResonance
```

or programmatically:

```julia-repl
julia> using Pkg
julia> Pkg.add("AdaptiveResonance")
```

You may also add the package directly from GitHub to get the latest changes between releases:

```julia-repl
julia> ]
(@v1.10) pkg> add https://github.com/AP6YC/AdaptiveResonance.jl
```
Quickstart
Load the module with

```julia
using AdaptiveResonance
```

The stateful information of an ART module is kept in a struct with a default constructor, such as

```julia
art = DDVFA()
```

You can pass module-specific options during construction with keyword arguments, such as

```julia
art = DDVFA(rho_ub=0.75, rho_lb=0.4)
```

For more advanced users, options for the modules are contained in Parameters.jl structs.
These options can be passed as keyword arguments before instantiating the model:

```julia
opts = opts_DDVFA(rho_ub=0.75, rho_lb=0.4)
art = DDVFA(opts)
```
Train and test the models with train! and classify:

```julia
# Unsupervised ART module
art = DDVFA()

# Supervised ARTMAP module
artmap = SFAM()

# Load some data
train_x, train_y, test_x, test_y = load_your_data()

# Unsupervised training and testing
train!(art, train_x)
y_hat_art = classify(art, test_x)

# Supervised training and testing
train!(artmap, train_x, train_y)
y_hat_artmap = classify(artmap, test_x)
```
train! and classify can accept incremental or batch data, where rows are features and columns are samples.
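As a hedged sketch of the two presentation modes (the random placeholder data here is hypothetical; DDVFA, train!, and classify are the package's exported names):

```julia
using AdaptiveResonance

# Placeholder data: 2 features (rows) × 100 samples (columns).
train_x = rand(2, 100)

# Batch mode: present the whole matrix at once.
art_batch = DDVFA()
train!(art_batch, train_x)

# Incremental mode: present one sample (a column vector) at a time.
art_inc = DDVFA()
for j in axes(train_x, 2)
    train!(art_inc, train_x[:, j])
end
```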
Unsupervised ART modules can also accommodate simple supervised learning where internal categories are mapped to supervised labels with the keyword argument y:
```julia
# Unsupervised ART module trained with supervised labels
art = DDVFA()
train!(art, train_x, y=train_y)
```
These modules also support retrieving the "best-matching unit" in the case of complete mismatch (i.e., the next-best category if the presented sample is completely unrecognized) with the keyword argument get_bmu:
```julia
# Get the best-matching unit in the case of complete mismatch
y_hat_bmu = classify(art, test_x, get_bmu=true)
```
Implemented Modules
This project has implementations of the following ART (unsupervised) and ARTMAP (supervised) modules:
- ART
- ARTMAP
Because each of these modules is a framework for many variants in the literature, this project also implements these variants by changing their module options. Variants built upon these modules are:
- ART
GammaNormalizedFuzzyART: Gamma-Normalized FuzzyART (variant of FuzzyART).
- ARTMAP
DAM: Default ARTMAP (variant of SFAM).
In addition to these modules, this package contains the following accessory methods:
- ARTSCENE: the ARTSCENE algorithm's multiple-stage filtering process is implemented as artscene_filter. Each filter stage is implemented internally if further granularity is required.
- performance: classification accuracy is implemented as performance.
- complement_code: complement coding is implemented with complement_code. However, training and classification methods complement code their inputs unless they are passed preprocessed=true, indicating to the model that this step has already been done.
- linear_normalization: the first step of complement coding, linear_normalization normalizes input arrays within [0, 1].
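As a minimal plain-Julia sketch of what these two preprocessing steps compute (illustrative only, not the package's internal implementation; the helper names normalize01 and complement are hypothetical):

```julia
# Normalize each feature (row) of a data matrix into [0, 1],
# the role that linear_normalization plays in the package.
function normalize01(x::AbstractMatrix)
    lo = minimum(x, dims=2)
    hi = maximum(x, dims=2)
    return (x .- lo) ./ (hi .- lo)
end

# Complement coding: stack x with its elementwise complement 1 .- x,
# doubling the feature dimension (the role of complement_code).
complement(x::AbstractMatrix) = vcat(x, 1 .- x)

raw = [0.0 2.0 4.0; 1.0 3.0 5.0]  # 2 features × 3 samples
cc = complement(normalize01(raw)) # 4 features × 3 samples, entries in [0, 1]
```

Complement coding preserves both a feature's magnitude and its complement, which lets ART category boxes grow in both directions; this is why the input dimension doubles.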
Contributing
If you have a question or concern, please raise an issue. For more details on how to work with the project, propose changes, or even contribute code, please see the Developer Notes in the project's documentation.
In summary:
- Questions and requested changes should all be made in the issues page. These are preferred because they are publicly viewable and could assist or educate others with similar issues or questions.
- For changes, this project accepts pull requests (PRs) from feature/<my-feature> branches onto the develop branch using the GitFlow methodology. If unit tests pass and the changes are beneficial, these PRs are merged into develop and eventually folded into versioned releases through a release branch that is merged with the master branch.
- The project follows the Semantic Versioning convention of major.minor.patch incremental version numbers. Patch versions are for bug fixes, minor versions are for backward-compatible changes, and major versions are for new and incompatible usage changes.
Acknowledgements
Authors
This package is developed and maintained by Sasha Petrenko with sponsorship by the Applied Computational Intelligence Laboratory (ACIL). The users @aaronpeikert, @hayesall, and @markNZed have graciously contributed their time with reviews and feedback that have greatly improved the project.
Support
This project is supported by grants from the Night Vision Electronic Sensors Directorate, the DARPA Lifelong Learning Machines (L2M) program, Teledyne Technologies, and the National Science Foundation. The material, findings, and conclusions here do not necessarily reflect the views of these entities.
Research was sponsored by the Army Research Laboratory and was accomplished under Cooperative Agreement Number W911NF-22-2-0209. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Laboratory or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.
History
- 7/10/2020 - Begin project.
- 11/3/2020 - Complete baseline modules and tests.
- 2/8/2021 - Formalize usage documentation.
- 10/13/2021 - Initiate GitFlow contribution.
- 5/4/2022 - Acceptance to JOSS.
- 10/11/2022 - v0.6.0
- 12/15/2022 - v0.7.0
- 1/30/2023 - v0.8.0
- 3/21/2024 - v0.8.3
Software
Adaptive Resonance Theory has been developed in theory and in application by many research groups since the theory's conception, and so this project was not developed in a vacuum. This project is built upon the wisdom and precedent of decades of previous work in ART in a variety of programming languages. The code in this repository is inspired by the following repositories:
- ACIL Organization GitHub
- MATLAB
- DDVFA: Companion MATLAB implementation of distributed dual vigilance fuzzy ART.
- DVFA: Companion MATLAB code for Dual Vigilance Fuzzy ART
- iCVI-toolbox: A MATLAB toolbox for incremental/batch cluster validity indices
- CVIFA: Companion MATLAB implementation of validity index-based vigilance test fuzzy ART.
- VAT-FA: Companion MATLAB code for VAT + Fuzzy ART.
- BARTMAP-CF: Companion MATLAB code for BARTMAP-based collaborative filtering
- Python
- NuART-Py: An internal ACIL Python package for ART neural networks.
- DVHA: A Python implementation of dual vigilance hypersphere ART.
- Boston University's Cognitive and Neural Systems (CNS) Tech Lab
- Nanyang Technological University's Tan Ah Whee
- Bernabé Linares-Barranco
- Marko Tscherepanow's LibTopoART
- National University of Singapore's Lei Meng
- Daniel Tauritz's ART Clearinghouse
Datasets
Boilerplate clustering datasets are periodically used to test, verify, and provide examples of the functionality of the package.
- UCI machine learning repository
- Fundamental Clustering Problems Suite (FCPS)
- Nejc Ilc's unsupervised datasets package
- Clustering basic benchmark
License
This software is openly maintained by the ACIL of the Missouri University of Science and Technology under the MIT License.
Citation
This project has a citation file (CITATION.cff) that generates citation information for the package and corresponding JOSS paper, which can be accessed via the "Cite this repository" button under the "About" section of the GitHub page.
You may also cite this repository with the following BibTeX entry:
```bibtex
@article{Petrenko2022,
  doi = {10.21105/joss.03671},
  url = {https://doi.org/10.21105/joss.03671},
  year = {2022},
  publisher = {The Open Journal},
  volume = {7},
  number = {73},
  pages = {3671},
  author = {Sasha Petrenko and Donald C. Wunsch},
  title = {AdaptiveResonance.jl: A Julia Implementation of Adaptive Resonance Theory (ART) Algorithms},
  journal = {Journal of Open Source Software}
}
```
Owner
- Name: Sasha Petrenko
- Login: AP6YC
- Kind: user
- Website: https://ap6yc.github.io/
- Repositories: 48
- Profile: https://github.com/AP6YC
Graduate researcher of applied computational intelligence at the Missouri University of Science and Technology.
JOSS Publication
AdaptiveResonance.jl: A Julia Implementation of Adaptive Resonance Theory (ART) Algorithms
Authors
Tags
ART, Adaptive Resonance Theory, Machine Learning, Clustering, Neural Networks
Citation (CITATION.cff)
```yaml
# CFF version for the document
cff-version: 1.2.0
# Authors list
authors:
  - family-names: "Petrenko"
    given-names: "Sasha"
    orcid: "https://orcid.org/0000-0003-2442-8901"
    website: "https://ap6yc.github.io/"
    email: "sap625@mst.edu"
    alias: "AP6YC"
    affiliation: "Missouri University of Science and Technology"
# Repository title and descriptors
title: "AP6YC/AdaptiveResonance.jl"
abstract: "This software is a Julia package for Adaptive Resonance Theory (ART) algorithms."
keywords:
  - "ART"
  - "Adaptive Resonance Theory"
  - "Adaptive Resonance"
identifiers:
  - description: "The DOI of the latest AdaptiveResonance.jl Zenodo archive."
    type: "doi"
    value: "10.5281/zenodo.5748453"
url: "https://doi.org/10.5281/zenodo.5748453"
repository-code: "https://github.com/AP6YC/AdaptiveResonance.jl"
license: "MIT"
institution:
  name: "Missouri University of Science and Technology"
# Preferred citation of the JOSS paper
message: "Please cite this software using the metadata from 'preferred-citation'."
preferred-citation:
  # Authors list for the JOSS paper
  authors:
    - family-names: "Petrenko"
      given-names: "Sasha"
      orcid: "https://orcid.org/0000-0003-2442-8901"
      website: "https://ap6yc.github.io/"
      email: "sap625@mst.edu"
      alias: "AP6YC"
      affiliation: "Missouri University of Science and Technology"
    - family-names: "Wunsch"
      given-names: "Donald"
      name-suffix: "II"
      orcid: "https://orcid.org/0000-0002-9726-9051"
      website: "https://people.mst.edu/faculty/dwunsch/"
      email: "dwunsch@mst.edu"
      alias: "dwunsch"
      affiliation: "Missouri University of Science and Technology"
  # Title, DOI, and journal details for the JOSS paper
  title: "AdaptiveResonance.jl: A Julia Implementation of Adaptive Resonance Theory (ART) Algorithms"
  publisher: "The Open Journal"
  journal: "Journal of Open Source Software"
  year: 2022
  month: 4
  volume: 7
  number: 73
  pages: 3671
  type: "article"
  identifiers:
    - description: "The DOI of the AdaptiveResonance.jl JOSS paper."
      type: "doi"
      value: "10.21105/joss.03671"
  url: "https://doi.org/10.21105/joss.03671"
  institution:
    name: "Missouri University of Science and Technology"
```
GitHub Events
Total
- Watch event: 4
- Fork event: 1
Last Year
- Watch event: 4
- Fork event: 1
Committers
Last synced: 7 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Sasha Petrenko | s****5@u****u | 489 |
| github-actions[bot] | 4****] | 9 |
| CompatHelper Julia | c****y@j****g | 4 |
| Alexander L. Hayes | a****r@b****t | 2 |
| Aaron Peikert | a****t@p****e | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 45
- Total pull requests: 68
- Average time to close issues: about 1 month
- Average time to close pull requests: 4 days
- Total issue authors: 4
- Total pull request authors: 4
- Average comments per issue: 1.96
- Average comments per pull request: 0.99
- Merged pull requests: 66
- Bot issues: 0
- Bot pull requests: 6
Past Year
- Issues: 2
- Pull requests: 2
- Average time to close issues: 4 days
- Average time to close pull requests: 24 minutes
- Issue authors: 1
- Pull request authors: 1
- Average comments per issue: 0.0
- Average comments per pull request: 0.5
- Merged pull requests: 2
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- AP6YC (31)
- hayesall (8)
- markNZed (5)
- JuliaTagBot (1)
Pull Request Authors
- AP6YC (65)
- github-actions[bot] (7)
- aaronpeikert (1)
- hayesall (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 2 (julia)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 31
juliahub.com: AdaptiveResonance
A Julia package for Adaptive Resonance Theory (ART) algorithms.
- Homepage: https://AP6YC.github.io/AdaptiveResonance.jl
- Documentation: https://docs.juliahub.com/General/AdaptiveResonance/stable/
- License: MIT
- Latest release: 0.8.5 (published over 1 year ago)
Rankings
Dependencies
- actions/cache v3 composite
- actions/checkout v3 composite
- codecov/codecov-action v3 composite
- coverallsapp/github-action master composite
- julia-actions/julia-buildpkg latest composite
- julia-actions/julia-processcoverage v1 composite
- julia-actions/julia-runtest latest composite
- julia-actions/setup-julia v1 composite
- styfle/cancel-workflow-action 0.11.0 composite
- actions/checkout v2 composite
- julia-actions/setup-julia latest composite
- styfle/cancel-workflow-action 0.9.1 composite
- JuliaRegistries/TagBot v1 composite
- actions/checkout v2 composite
- actions/upload-artifact v1 composite
- openjournals/openjournals-draft-action master composite
- styfle/cancel-workflow-action 0.6.0 composite