ECabc
ECabc: A feature tuning program focused on Artificial Neural Network hyperparameters - Published in JOSS (2019)
Science Score: 98.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 4 DOI reference(s) in README and JOSS metadata
- ✓ Academic publication links: links to joss.theoj.org
- ✓ Committers with academic emails: 3 of 12 committers (25.0%) from academic institutions
- ✓ Institutional organization owner: organization ecrl has institutional domain (sites.uml.edu)
- ✓ JOSS paper metadata: published in the Journal of Open Source Software
Repository
Artificial Bee Colony for generic feature tuning
Basic Info
Statistics
- Stars: 12
- Watchers: 2
- Forks: 7
- Open Issues: 0
- Releases: 7
Topics
Metadata Files
README.md
ECabc: optimization algorithm for tuning user-defined parametric functions
ECabc is an open source Python package used to tune parameters for user-supplied functions, based on the Artificial Bee Colony (ABC) algorithm by D. Karaboğa. ECabc optimizes user-supplied functions, or fitness functions, using a set of variables that exist within a search space. The bee colony consists of three types of bees: employers, onlookers, and scouts. An employer bee exploits a solution composed of a permutation of the variables in the search space and evaluates the viability of that solution. An onlooker bee chooses an employer bee with an optimal solution and searches for new solutions near it. The scout bee, a variant of the employer bee, searches for a new solution if it has stayed too long at its current solution.
Research applications
While it has several applications, ECabc has been successfully used by the Energy and Combustion Research Laboratory (ECRL) at the University of Massachusetts Lowell to tune the hyperparameters of ECNet, an open source Python package tailored to predicting fuel properties. ECNet provides scientists with an open source tool for predicting key fuel properties of potential next-generation biofuels, reducing the need for costly fuel synthesis and experimentation. By efficiently increasing the accuracy of ECNet and similar models, ECabc helps provide a higher degree of confidence in discovering new, optimal fuels. A single run of ECabc on ECNet yielded a lower average root mean square error (RMSE) for cetane number (CN) and yield sooting index (YSI) than the RMSE obtained from a year of manual tuning: manual tuning produced an RMSE of 10.13, while a single 500-iteration run of ECabc yielded an RMSE of 8.06.
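For reference, RMSE here is the standard root mean square error between model predictions and experimental values; a minimal sketch of the calculation (the numbers below are illustrative only, not ECNet data):

```python
import math

def rmse(predicted, observed):
    """Root mean square error between two equal-length sequences."""
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)
    )

# Illustrative values only (not taken from the ECNet/ECabc study):
print(rmse([61.2, 48.5, 55.0], [60.0, 50.1, 54.2]))
```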
Installation
Prerequisites:
- Have Python 3.X installed
- Have the ability to install Python packages
Method 1: pip
If you are working in a Linux/Mac environment:
sudo pip install ecabc
Alternatively, in a Windows or virtualenv environment:
pip install ecabc
To update your version of ECabc to the latest release version, use
pip install --upgrade ecabc
Note: if multiple Python releases are installed on your system (e.g. 2.7 and 3.7), you may need to execute the correct version of pip. For Python 3.X, change "pip install ecabc" to "pip3 install ecabc".
Method 2: From source
- Download the ECabc repository, navigate to the download location on the command line/terminal, and execute:
pip install .
There are currently no additional dependencies for ECabc.
Usage
To start using ECabc, you need a couple of items:
- a fitness function (cost function) to optimize
- parameters used by the fitness function
For example, let's define a fitness function to minimize the sum of three integers:
```python
def minimize_integers(integers):
    return sum(integers)
```
Your fitness function must accept a list from ECabc. The list values represent the current "food source", i.e. parameter values, being exploited by a given bee.
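For example, a fitness function with several tunable values simply unpacks or indexes the list it receives; a minimal sketch (hypothetical function, assuming the values arrive in the order the parameters are added):

```python
def minimize_weighted_sum(params):
    # params is the list of current parameter values supplied by ECabc;
    # assumed here to be ordered the same way the parameters were added
    x, y, z = params
    return x + 2 * y + 3 * z
```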
Now that we have our fitness function, let's import the ABC object from ECabc, initialize the artificial bee colony, and add our parameters:
```python
from ecabc import ABC

def minimize_integers(integers):
    return sum(integers)

abc = ABC(10, minimize_integers)
abc.add_param(0, 10, name='Int_1')
abc.add_param(0, 10, name='Int_2')
abc.add_param(0, 10, name='Int_3')
```
Here we initialize the colony with 10 employer bees, supply our fitness function, and add our parameters. Parameters are added with minimum/maximum values for their search space and, optionally, a name. By default, parameter mutations (searching a neighboring food source) will not exceed the specified parameter bounds [min_val, max_val]; if this limitation is not desired, supply the "restrict=False" argument:
```python
abc.add_param(0, 10, restrict=False, name='Int_1')
```
Once we have created our colony and added our parameters, we then need to "initialize" the colony's bees:
```python
from ecabc import ABC

def minimize_integers(integers):
    return sum(integers)

abc = ABC(10, minimize_integers)
abc.add_param(0, 10, name='Int_1')
abc.add_param(0, 10, name='Int_2')
abc.add_param(0, 10, name='Int_3')
abc.initialize()
```
Initializing the colony's bees deploys employer bees (in this example, 10 bees) to random food sources (random parameter values are generated), their fitness is evaluated (in this example, lowest sum is better), and onlooker bees (equal to the number of employers) are deployed proportionally to neighboring food sources of well-performing bees.
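Conceptually, this first step amounts to sampling each parameter within its bounds for every employer bee and scoring the result; a rough sketch of the idea (illustration only, not ECabc's actual internals):

```python
import random

def minimize_integers(integers):
    return sum(integers)

bounds = [(0, 10), (0, 10), (0, 10)]  # one (min, max) pair per parameter

# 10 employer bees, each assigned a random integer food source within the bounds
employers = [[random.randint(lo, hi) for lo, hi in bounds] for _ in range(10)]
scores = [minimize_integers(source) for source in employers]  # lower sum is better here
```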
We then send the colony through a predetermined number of "search cycles":
```python
from ecabc import ABC

def minimize_integers(integers):
    return sum(integers)

abc = ABC(10, minimize_integers)
abc.add_param(0, 10, name='Int_1')
abc.add_param(0, 10, name='Int_2')
abc.add_param(0, 10, name='Int_3')
abc.initialize()

for _ in range(10):
    abc.search()
```
A search cycle consists of:
- each bee searches a neighboring food source (performs a mutation on one parameter)
  - if the food source produces a better fitness than the bee's current food source, move there
  - otherwise, the bee stays at its current food source
- if the bee has stayed for (NE * D) cycles (NE = number of employers, D = dimension of the function, 3 in our example), abandon the food source
  - if the bee is an employer, go to a new random food source
  - if the bee is an onlooker, go to a food source neighboring a well-performing bee
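The accept/stay/abandon logic above can be sketched for a single bee as follows (a conceptual illustration, not ECabc's internal code; the one-parameter mutation and the stay counter are simplified):

```python
import random

def search_step(position, fitness, bounds, stays, abandon_limit):
    """One conceptual search cycle for a single (employer-style) bee."""
    # mutate exactly one parameter within its bounds
    candidate = list(position)
    i = random.randrange(len(candidate))
    candidate[i] = random.randint(*bounds[i])

    if fitness(candidate) < fitness(position):  # lower is better in this example
        return candidate, 0                     # move to the better food source
    stays += 1
    if stays >= abandon_limit:                  # NE * D cycles without improvement
        return [random.randint(lo, hi) for lo, hi in bounds], 0  # abandon: new random source
    return position, stays                      # stay at the current food source

bounds = [(0, 10)] * 3
position, stays = [7, 5, 9], 0
for _ in range(10):
    position, stays = search_step(position, sum, bounds, stays, abandon_limit=30)
print(position)
```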
We can access the colony's average fitness score, average fitness function return value, best fitness score, best fitness function return value and best parameters at any time:
```python
print(abc.average_fitness)
print(abc.average_ret_val)
print(abc.best_fitness)
print(abc.best_ret_val)
print(abc.best_params)
```
ECabc can utilize multiple CPU cores for concurrent processing:
```python
abc = ABC(10, minimize_integers, num_processes=8)
```
Tying everything together, we have:
```python
from ecabc import ABC

def minimize_integers(integers):
    return sum(integers)

abc = ABC(10, minimize_integers)
abc.add_param(0, 10, name='Int_1')
abc.add_param(0, 10, name='Int_2')
abc.add_param(0, 10, name='Int_3')
abc.initialize()

for _ in range(10):
    abc.search()
    print('Average fitness: {}'.format(abc.average_fitness))
    print('Average obj. fn. return value: {}'.format(abc.average_ret_val))
    print('Best fitness score: {}'.format(abc.best_fitness))
    print('Best obj. fn. return value: {}'.format(abc.best_ret_val))
    print('Best parameters: {}\n'.format(abc.best_params))
```
Running this script produces:
```
Average fitness: 0.08244866244866243
Average obj. fn. return value: 11.65
Best fitness score: 0.125
Best obj. fn. return value: 7
Best parameters: {'Int_1': 4, 'Int_2': 3, 'Int_3': 0}

Average fitness: 0.0885855117105117
Average obj. fn. return value: 10.8
Best fitness score: 0.125
Best obj. fn. return value: 7
Best parameters: {'Int_1': 4, 'Int_2': 3, 'Int_3': 0}

Average fitness: 0.10361832611832611
Average obj. fn. return value: 9.4
Best fitness score: 0.16666666666666666
Best obj. fn. return value: 5
Best parameters: {'Int_1': 2, 'Int_2': 3, 'Int_3': 0}

Average fitness: 0.11173502151443326
Average obj. fn. return value: 8.8
Best fitness score: 0.2
Best obj. fn. return value: 4
Best parameters: {'Int_1': 0, 'Int_2': 0, 'Int_3': 4}

Average fitness: 0.12448879551820731
Average obj. fn. return value: 7.95
Best fitness score: 0.2
Best obj. fn. return value: 4
Best parameters: {'Int_1': 1, 'Int_2': 3, 'Int_3': 0}

Average fitness: 0.1767694805194805
Average obj. fn. return value: 6.7
Best fitness score: 1.0
Best obj. fn. return value: 0
Best parameters: {'Int_1': 0, 'Int_2': 0, 'Int_3': 0}

Average fitness: 0.183255772005772
Average obj. fn. return value: 6.3
Best fitness score: 1.0
Best obj. fn. return value: 0
Best parameters: {'Int_1': 0, 'Int_2': 0, 'Int_3': 0}

Average fitness: 0.20172799422799423
Average obj. fn. return value: 5.65
Best fitness score: 1.0
Best obj. fn. return value: 0
Best parameters: {'Int_1': 0, 'Int_2': 0, 'Int_3': 0}

Average fitness: 0.23827561327561328
Average obj. fn. return value: 4.95
Best fitness score: 1.0
Best obj. fn. return value: 0
Best parameters: {'Int_1': 0, 'Int_2': 0, 'Int_3': 0}

Average fitness: 0.28456349206349213
Average obj. fn. return value: 4.35
Best fitness score: 1.0
Best obj. fn. return value: 0
Best parameters: {'Int_1': 0, 'Int_2': 0, 'Int_3': 0}
```
To run this script yourself, head over to our examples directory.
Contributing, Reporting Issues and Other Support:
To contribute to ECabc, make a pull request. Contributions should include tests for new features added, as well as extensive documentation.
To report problems with the software or feature requests, file an issue. When reporting problems, include information such as error messages, your OS/environment and Python version.
For additional support/questions, contact Sanskriti Sharma (SanskritiSharma@student.uml.edu), Hernan Gelaf-Romer (HernanGelafromer@student.uml.edu), or Travis Kessler (Travis_Kessler@student.uml.edu).
Owner
- Name: UMass Lowell Energy and Combustion Research Laboratory
- Login: ecrl
- Kind: organization
- Email: hunter_mack@uml.edu
- Location: Lowell, MA
- Website: https://sites.uml.edu/hunter-mack/
- Repositories: 6
- Profile: https://github.com/ecrl
Open source software used to further alternative fuel research
JOSS Publication
ECabc: A feature tuning program focused on Artificial Neural Network hyperparameters
Authors
Energy and Combustion Research Laboratory, University of Massachusetts Lowell, Lowell, MA 01854, U.S.A.
Energy and Combustion Research Laboratory, University of Massachusetts Lowell, Lowell, MA 01854, U.S.A.
Tags
artificial bee colony, hyperparameter optimization, machine learning, artificial neural networks
GitHub Events
Total
Last Year
Committers
Last synced: 5 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Hernan Romer | n****3@g****m | 361 |
| Sanskriti Sharma | s****a@g****m | 38 |
| tjkessler | t****r@g****m | 20 |
| Hernan | h****r@s****u | 6 |
| Kyle Niemeyer | k****r@g****m | 3 |
| KOLANICH | K****H | 3 |
| atasever | u****r@g****m | 1 |
| Daniel S. Katz | d****z@i****g | 1 |
| Ariel Rokem | a****m@g****m | 1 |
| Arfon Smith | a****n | 1 |
| Hernan Romer | h****n@M****e | 1 |
| GelafRomer | H****r@s****u | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 4 months ago
All Time
- Total issues: 23
- Total pull requests: 32
- Average time to close issues: 29 days
- Average time to close pull requests: 2 months
- Total issue authors: 4
- Total pull request authors: 9
- Average comments per issue: 1.87
- Average comments per pull request: 0.59
- Merged pull requests: 23
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- hgromer (11)
- KOLANICH (5)
- sanskriti-s (4)
- tjkessler (3)
Pull Request Authors
- hgromer (8)
- KOLANICH (8)
- tjkessler (5)
- sanskriti-s (4)
- kyleniemeyer (3)
- arfon (1)
- atasever (1)
- danielskatz (1)
- arokem (1)
Packages
- Total packages: 1
- Total downloads: 90 last-month (pypi)
- Total dependent packages: 1
- Total dependent repositories: 1
- Total versions: 24
- Total maintainers: 2
pypi.org: ecabc
Artificial bee colony for function parameter optimization
- Homepage: https://github.com/ecrl/ecabc
- Documentation: https://ecabc.readthedocs.io/
- License: MIT License
- Latest release: 3.0.1 (published over 2 years ago)
Dependencies
- actions/checkout v3 composite
- actions/setup-python v3 composite
- pypa/gh-action-pypi-publish 27b31702a0e7fc50959f5ad993c78deac1bdfc29 composite
- actions/checkout v3 composite
- actions/setup-python v3 composite
- pavelzw/pytest-action v2 composite
