Science Score: 54.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ○ DOI references
- ✓ Academic publication links: links to arxiv.org
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (14.4%) to scientific vocabulary
Keywords
automl
bayesian-optimization
hyperparameter-optimization
machine-learning
neural-networks
python
tensorflow
Last synced: 4 months ago
Repository
Bayesian Optimization with Density-Ratio Estimation
Basic Info
- Host: GitHub
- Owner: ltiao
- Language: Python
- Default Branch: master
- Homepage: https://tiao.io/publication/bore-2/
- Size: 9 MB
Statistics
- Stars: 23
- Watchers: 2
- Forks: 3
- Open Issues: 2
- Releases: 0
Topics
automl
bayesian-optimization
hyperparameter-optimization
machine-learning
neural-networks
python
tensorflow
Created over 5 years ago · Last pushed about 3 years ago
Metadata Files
- Readme
- Changelog
- Contributing
- License
- Citation
- Authors
README.rst
=======================================================
BORE: Bayesian Optimization as Density-Ratio Estimation
=======================================================
.. image:: https://img.shields.io/pypi/v/bore.svg
   :target: https://pypi.python.org/pypi/bore

.. image:: https://img.shields.io/travis/ltiao/bore.svg
   :target: https://travis-ci.org/ltiao/bore

.. image:: https://readthedocs.org/projects/bore/badge/?version=latest
   :target: https://bore.readthedocs.io/en/latest/?badge=latest
   :alt: Documentation Status

.. image:: https://pyup.io/repos/github/ltiao/bore/shield.svg
   :target: https://pyup.io/repos/github/ltiao/bore/
   :alt: Updates
A minimalistic implementation of BORE: Bayesian Optimization as Density-Ratio Estimation [1]_
in Python 3 and TensorFlow 2.
|featured|
Please note that this repository is no longer actively developed. For a more feature-complete and well-supported implementation, please check out the BORE Searcher in the `Syne Tune `_ framework from `AWS Labs `_, which supports variants based on numerous classifiers (XGBoost, Random Forests, etc.).
Getting Started
---------------
Install with ``pip``:

.. code-block:: bash

   $ pip install "bore[tf]"

With support for GPU acceleration:

.. code-block:: bash

   $ pip install "bore[tf-gpu]"

With support for the HpBandSter plugin:

.. code-block:: bash

   $ pip install "bore[tf,hpbandster]"
Usage/Examples
--------------
This example implements an instantiation of BORE based on a multi-layer perceptron (i.e. a fully-connected feed-forward neural network) classifier.
First we build and compile the classifier model using ``MaximizableSequential``:

.. code-block:: python

   from bore.models import MaximizableSequential
   from tensorflow.keras.layers import Dense

   # build model
   classifier = MaximizableSequential()
   classifier.add(Dense(16, activation="relu"))
   classifier.add(Dense(16, activation="relu"))
   classifier.add(Dense(1, activation="sigmoid"))

   # compile model
   classifier.compile(optimizer="adam", loss="binary_crossentropy")
This syntax should be familiar to anyone who has used a high-level neural network library such as Keras. In fact, ``MaximizableSequential`` is simply a subclass of the ``Sequential`` class from Keras. More specifically, in addition to inheriting the usual functionalities, it provides the ``argmax`` method which finds the input at which the network output is maximized.
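For intuition, the snippet below sketches one way such an ``argmax`` could work: treat the fitted network as an acquisition function and maximize it with multi-start L-BFGS-B, obtaining gradients with respect to the input via ``tf.GradientTape``. This is only an illustrative sketch, not the implementation used in ``bore``; the helper name ``approximate_argmax``, the uniform random restarts, and the bounds format (a list of ``(low, high)`` pairs, as expected by SciPy's L-BFGS-B) are assumptions.

.. code-block:: python

   # Illustrative sketch only -- not the actual implementation in ``bore``.
   import numpy as np
   import tensorflow as tf
   from scipy.optimize import minimize


   def approximate_argmax(model, bounds, num_start_points=3):
       """Maximize model(x) over x with multi-start L-BFGS-B (hypothetical helper)."""
       low, high = np.asarray(bounds, dtype=np.float64).T  # bounds: list of (low, high) pairs
       dim = len(bounds)

       def neg_value_and_grad(x):
           # SciPy minimizes, so return the negated network output and its gradient
           x_var = tf.Variable(x.reshape(1, -1), dtype=tf.float32)
           with tf.GradientTape() as tape:
               loss = -model(x_var)
           grad = tape.gradient(loss, x_var)
           return float(tf.squeeze(loss)), grad.numpy().ravel().astype(np.float64)

       best = None
       for _ in range(num_start_points):
           x0 = np.random.uniform(low, high, size=dim)  # random restart
           result = minimize(neg_value_and_grad, x0, jac=True,
                             method="L-BFGS-B", bounds=bounds)
           if best is None or result.fun < best.fun:
               best = result
       return best.x

In the optimization loop below, ``classifier.argmax(method="L-BFGS-B", num_start_points=3, bounds=bounds)`` plays this role.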
Using this method, the standard optimization loop can be implemented as follows:

.. code-block:: python

   import numpy as np

   features = []
   targets = []

   # initial design
   features.extend(features_init)
   targets.extend(targets_init)

   for i in range(num_iterations):

       # construct classification problem
       X = np.vstack(features)
       y = np.hstack(targets)

       tau = np.quantile(y, q=0.25)
       z = np.less(y, tau)

       # update classifier
       classifier.fit(X, z, epochs=200, batch_size=64)

       # suggest new candidate
       x_next = classifier.argmax(method="L-BFGS-B", num_start_points=3, bounds=bounds)

       # evaluate blackbox function
       y_next = blackbox.evaluate(x_next)

       # update dataset
       features.append(x_next)
       targets.append(y_next)
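The loop above assumes ``features_init``, ``targets_init``, ``bounds``, ``num_iterations``, and ``blackbox`` are already defined. One minimal way to supply them, using a toy quadratic objective and bounds given as ``(low, high)`` pairs, might look like the following; ``QuadraticBlackbox``, the random initial design, and the specific values are purely illustrative and not part of ``bore``:

.. code-block:: python

   # Illustrative setup only -- substitute your own blackbox and search bounds.
   import numpy as np


   class QuadraticBlackbox:
       """Toy objective for illustration: squared distance to a fixed optimum."""

       def __init__(self, optimum):
           self.optimum = np.asarray(optimum, dtype=np.float64)

       def evaluate(self, x):
           return float(np.sum((np.asarray(x, dtype=np.float64) - self.optimum) ** 2))


   bounds = [(-5.0, 5.0), (-5.0, 5.0)]   # one (low, high) pair per input dimension
   num_iterations = 50
   blackbox = QuadraticBlackbox(optimum=[1.0, -2.0])

   # random initial design of 10 points inside the bounds
   rng = np.random.default_rng(42)
   low = [b[0] for b in bounds]
   high = [b[1] for b in bounds]
   features_init = [rng.uniform(low, high) for _ in range(10)]
   targets_init = [blackbox.evaluate(x) for x in features_init]

With these definitions in place (and the ``classifier`` built earlier), the loop above can be run as written.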
For complete end-to-end scripts and to reproduce our results, take a look at the associated `experiments `_ repository.
Features
--------
* BORE-MLP: BORE based on a multi-layer perceptron (MLP) classifier
* Provides higher-order functions that leverage automatic differentiation to transform Keras models into functions that can easily be optimized by methods in SciPy, not least multi-started quasi-Newton hill-climbing methods such as L-BFGS.
Roadmap
-------
* Integration with the `Optuna `_ framework by implementing a `Sampler `_ plugin.
Authors
-------
Lead Developers:
++++++++++++++++
+------------------+----------------------------+
| |tiao| | |klein| |
+------------------+----------------------------+
| Louis Tiao | Aaron Klein |
+------------------+----------------------------+
| https://tiao.io/ | https://aaronkl.github.io/ |
+------------------+----------------------------+
Reference
---------
.. [1] L. Tiao, A. Klein, C. Archambeau, E. V. Bonilla, M. Seeger, and F. Ramos.
   `BORE: Bayesian Optimization by Density-Ratio Estimation `_.
   In Proceedings of the 38th International Conference on Machine Learning (ICML2021),
   Virtual (Online), July 2021.
Cite:
+++++
.. code-block:: bibtex

   @inproceedings{tiao2021-bore,
     title={{B}ayesian {O}ptimization by {D}ensity-{R}atio {E}stimation},
     author={Tiao, Louis and Klein, Aaron and Archambeau, C\'{e}dric and Bonilla, Edwin V and Seeger, Matthias and Ramos, Fabio},
     booktitle={Proceedings of the 38th International Conference on Machine Learning (ICML2021)},
     address={Virtual (Online)},
     year={2021},
     month={July}
   }
License
-------
MIT License
Copyright (c) 2021, Louis C. Tiao
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
.. |tiao| image:: http://gravatar.com/avatar/d8b59298191057fa164edf80f0743fcc?s=120
   :align: middle

.. |klein| image:: https://via.placeholder.com/120.png
   :align: middle

.. |featured| image:: docs/_static/header_1000x618.png
   :align: middle
Owner
- Name: Louis Tiao
- Login: ltiao
- Kind: user
- Location: New York, New York
- Company: @Facebook
- Website: https://tiao.io
- Twitter: louistiao
- Repositories: 114
- Profile: https://github.com/ltiao
- Bio: machine learning researcher
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Tiao"
    given-names: "Louis"
  - family-names: "Klein"
    given-names: "Aaron"
title: "BORE: Bayesian Optimization by Density-Ratio Estimation"
version: 2.0.4
date-released: 2020-12-18
url: "https://github.com/ltiao/bore"
preferred-citation:
  type: conference-paper
  authors:
    - family-names: Tiao
      given-names: Louis
    - family-names: Klein
      given-names: Aaron
  title: "BORE: Bayesian Optimization by Density-Ratio Estimation"
  year: 2021
  collection-title: "Proceedings of the 38th International Conference on Machine Learning"
  editors:
    - family-names: Meila
      given-names: Marina
    - family-names: Zhang
      given-names: Tong
  publisher:
    - name: PMLR
  conference:
    - name: International Conference on Machine Learning (ICML)
      date-start: 2021-07-18
      date-end: 2021-07-24
  month: 7
  start: 10289  # First page number
  end: 10300  # Last page number
  volume: 139
Committers
Last synced: almost 3 years ago
All Time
- Total Commits: 175
- Total Committers: 2
- Avg Commits per committer: 87.5
- Development Distribution Score (DDS): 0.143
Top Committers
| Name | Email | Commits |
|---|---|---|
| Louis Tiao | l****o@g****m | 150 |
| Louis Tiao | 1****o@u****m | 25 |
Issues and Pull Requests
Last synced: 7 months ago
All Time
- Total issues: 2
- Total pull requests: 9
- Average time to close issues: about 2 months
- Average time to close pull requests: less than a minute
- Total issue authors: 2
- Total pull request authors: 2
- Average comments per issue: 4.0
- Average comments per pull request: 0.11
- Merged pull requests: 7
- Bot issues: 0
- Bot pull requests: 2
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- rgap (1)
- ltiao (1)
Pull Request Authors
- ltiao (7)
- dependabot[bot] (2)
Top Labels
Issue Labels
enhancement (1)
Pull Request Labels
dependencies (2)
Packages
- Total packages: 1
- Total downloads: 85 last month (PyPI)
- Total dependent packages: 0
- Total dependent repositories: 1
- Total versions: 11
- Total maintainers: 1
pypi.org: bore
Bayesian Optimization by Density-Ratio Estimation
- Homepage: https://github.com/ltiao/bore
- Documentation: https://bore.readthedocs.io/
- License: MIT license
- Latest release: 1.5.0 (published over 4 years ago)
Rankings
- Dependent packages count: 10.0%
- Stargazers count: 12.9%
- Forks count: 15.3%
- Average: 17.6%
- Dependent repos count: 21.7%
- Downloads: 28.1%
Maintainers (1)
Last synced: 5 months ago
Dependencies
requirements.txt
pypi
- ConfigSpace ==0.4.18
- Cython ==0.29.23
- numpy ==1.19.5
- scipy ==1.7.1
requirements_dev.txt
pypi
- Sphinx ==2.4.2 development
- bump2version ==1.0.0 development
- coverage ==5.0.3 development
- flake8 ==3.7.9 development
- nbsphinx * development
- numpydoc * development
- pillow * development
- pytest ==5.3.5 development
- pytest-runner ==5.2 development
- sphinx-gallery >=0.7.0 development
- sphinx_bootstrap_theme * development
- tox ==3.14.5 development
- twine ==3.1.1 development
- watchdog ==0.10.2 development
- wheel ==0.34.2 development