bboptpy
Powerful and scalable black-box optimization algorithms for Python and C++.
Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 18 DOI reference(s) in README
- ✓ Academic publication links: links to sciencedirect.com, springer.com, wiley.com, mdpi.com, ieee.org, acm.org, acs.org
- ○ Academic email domains: not found
- ○ Institutional organization owner: not found
- ○ JOSS paper metadata: not found
- ○ Scientific vocabulary similarity: low similarity (11.6%) to scientific vocabulary
Repository
Powerful and scalable black-box optimization algorithms for Python and C++.
Basic Info
- Host: GitHub
- Owner: mike-gimelfarb
- License: lgpl-2.1
- Language: C++
- Default Branch: main
- Homepage: https://bboptpy.readthedocs.io/en/latest/
- Size: 2.44 MB
Statistics
- Stars: 7
- Watchers: 2
- Forks: 1
- Open Issues: 0
- Releases: 1
Metadata Files
README.md
bboptpy
bboptpy is a library of algorithms for the optimization of black-box functions.
Main advantages:
- a single unified interface for Python with a user-friendly API
- faithful reproductions of classical and modern baselines (many of which are not publicly available elsewhere), with SOTA improvements
- a transparent implementation and reproducibility that make it easy to build upon.
The full documentation, including the list of supported algorithms and functions, can be found here.
Installation
This package can now be installed directly from pip:

    pip install bboptpy
Algorithms Supported
The following algorithms are currently fully supported with Python wrappers:
- Univariate
- Multivariate:
    - Unconstrained:
        - Adaptive Coordinate Descent (ACD)
        - AMaLGaM IDEA
        - Basin Hopping
        - Controlled Random Search (CRS)
        - Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
        - Differential Evolution (DE)
        - Differential Search (DSA)
        - Exponential Natural Evolution Strategy (xNES)
        - LIPO Search with Max Heuristic and Local Search (MAX-LIPO-TR)
        - Novel Self-Adaptive Harmony Search (NSHS)
        - Hessian Evolution Strategy (HE-ES)
        - Self-Adaptive Multi-Population JAYA
        - Adaptive Nelder-Mead Method
        - Particle Swarm Optimization (PSO)
        - Powell's Methods
        - PRAXIS
        - Rosenbrock Method
Usage
Univariate Optimization
Simple example to optimize a univariate function:
```python
import numpy as np
from bboptpy import Brent

# function to optimize
def fx(x):
    return np.sin(x) + np.sin(10 * x / 3)

alg = Brent(mfev=20000, atol=1e-6)
sol = alg.optimize(fx, lower=2.7, upper=7.5, guess=np.random.uniform(2.7, 7.5))
print(sol)
```
This will print the following output:
    x*: 5.1457349293974861
    calls to f: 10
    converged: 1
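As a sanity check (independent of bboptpy), the reported minimizer can be confirmed with a plain NumPy grid search over the same interval, reusing the objective `fx` from the example above:

```python
import numpy as np

# objective from the univariate example above
def fx(x):
    return np.sin(x) + np.sin(10 * x / 3)

# dense grid search over the interval [2.7, 7.5]
xs = np.linspace(2.7, 7.5, 200001)
x_star = xs[np.argmin(fx(xs))]
print(x_star)  # close to 5.14573, matching the Brent result above
```

A grid search needs far more function evaluations than the 10 calls Brent reports, which is the point of using a dedicated univariate optimizer.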
Multivariate Optimization
Simple example to optimize a multivariate function:
```python
import numpy as np
from bboptpy import ActiveCMAES

# function to optimize
def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2) for x1, x2 in zip(x[:-1], x[1:]))

n = 10  # dimension of problem
alg = ActiveCMAES(mfev=10000, tol=1e-4, np=20)
sol = alg.optimize(fx, lower=-10 * np.ones(n), upper=10 * np.ones(n), guess=np.random.uniform(-10, 10, size=n))
print(sol)
```
This will print the following output:
    x*: 0.999989 0.999999 1.000001 1.000007 1.000020 1.000029 1.000102 1.000183 1.000357 1.000689
    objective calls: 6980
    constraint calls: 0
    B/B constraint calls: 0
    converged: yes
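The objective in this example is the Rosenbrock function, whose global minimum is 0 at x = (1, …, 1), consistent with the solution printed above. A quick check in plain NumPy (independent of bboptpy):

```python
import numpy as np

# Rosenbrock objective from the example above
def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2)
               for x1, x2 in zip(x[:-1], x[1:]))

n = 10
print(fx(np.ones(n)))  # 0.0 at the global minimum

# the solution reported by ActiveCMAES above is near-optimal
x_found = np.array([0.999989, 0.999999, 1.000001, 1.000007, 1.000020,
                    1.000029, 1.000102, 1.000183, 1.000357, 1.000689])
print(fx(x_found))  # small positive value, close to 0
```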
Incremental Optimization
The following example illustrates how to run bboptpy optimizers incrementally, returning the control to the Python interpreter between iterations:
```python
import numpy as np
from bboptpy import ActiveCMAES

# function to optimize
def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2) for x1, x2 in zip(x[:-1], x[1:]))

n = 10  # dimension of problem
alg = ActiveCMAES(mfev=10000, tol=1e-4, np=20)
alg.initialize(f=fx, lower=-10 * np.ones(n), upper=10 * np.ones(n), guess=np.random.uniform(-10, 10, size=n))
while True:
    alg.iterate()
    print(alg.solution())
```
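Returning control to the interpreter between iterations makes it easy to add custom logging, budgets, or stopping rules around the loop. As an illustration of the pattern only, here is a toy stand-in with the same `initialize`/`iterate`/`solution` shape (the `RandomSearch` class and `sphere` objective are hypothetical, not part of bboptpy):

```python
import numpy as np

class RandomSearch:
    """Toy stand-in (hypothetical, NOT part of bboptpy) mimicking the
    initialize/iterate/solution interface: pure random search."""

    def initialize(self, f, lower, upper, guess):
        self.f, self.lower, self.upper = f, lower, upper
        self.best_x, self.best_f = np.asarray(guess), f(guess)
        self.rng = np.random.default_rng(0)

    def iterate(self):
        # propose one uniform random point and keep it if it improves
        x = self.rng.uniform(self.lower, self.upper)
        fx = self.f(x)
        if fx < self.best_f:
            self.best_x, self.best_f = x, fx

    def solution(self):
        return self.best_x, self.best_f

def sphere(x):
    return float(np.sum(np.asarray(x) ** 2))

alg = RandomSearch()
alg.initialize(f=sphere, lower=-np.ones(2), upper=np.ones(2), guess=np.ones(2))
for _ in range(2000):  # a fixed budget instead of `while True`
    alg.iterate()      # control returns here after every step
x_best, fbest = alg.solution()
print(fbest)  # improves on the initial guess sphere([1, 1]) = 2.0
```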
Citation
To cite this repository, either use the link in the sidebar or the following BibTeX entry:
    @software{gimelfarb2024bboptpy,
      author = {Gimelfarb, Michael},
      license = {LGPL-2.1+},
      title = {{bboptpy}},
      url = {https://github.com/mike-gimelfarb/bboptpy},
      year = {2024}
    }
Please also consider citing the original authors of the algorithms you use, whose papers are linked in the supported algorithms section above. References for the individual algorithms can also be found in the comment headers of the respective C++ source files.
Owner
- Name: Michael Gimelfarb
- Login: mike-gimelfarb
- Kind: user
- Location: Toronto
- Company: University of Toronto
- Website: https://mike-gimelfarb.github.io
- Repositories: 15
- Profile: https://github.com/mike-gimelfarb
Researcher in artificial intelligence and reinforcement learning.
Citation (citation.cff)
    # This CITATION.cff file was generated with cffinit.
    # Visit https://bit.ly/cffinit to generate yours today!
    cff-version: 1.2.0
    title: 'bboptpy'
    message: >-
      If you use this software, please cite it using the
      metadata from this file.
    type: software
    authors:
      - given-names: Michael
        family-names: Gimelfarb
    repository-code: 'https://github.com/mike-gimelfarb/bboptpy'
    license: LGPL-2.1+
    date-released: '2024-10-14'
GitHub Events
Total
- Release event: 1
- Watch event: 2
- Push event: 29
- Create event: 1
Last Year
- Release event: 1
- Watch event: 2
- Push event: 29
- Create event: 1
Packages
- Total packages: 1
- Total downloads: 12 last-month (pypi)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 1
- Total maintainers: 1
pypi.org: bboptpy
Powerful and scalable black-box optimization algorithms for Python and C++.
- Homepage: https://github.com/mike-gimelfarb/bboptpy
- Documentation: https://bboptpy.readthedocs.io/
- License: LGPL-2.1
- Latest release: 0.1 (published about 1 year ago)
Maintainers (1)
Dependencies
- pybind11 >=2.11.1
- setuptools >=63.4.1