https://github.com/aspuru-guzik-group/phoenics

Phoenics: Bayesian optimization for efficient experiment planning


Science Score: 33.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: rsc.org, acs.org
  • Committers with academic emails
    1 of 1 committers (100.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.4%) to scientific vocabulary

Keywords

bayesian-optimization experiment-planning phoenics self-driving-laboratories
Last synced: 5 months ago

Repository

Phoenics: Bayesian optimization for efficient experiment planning

Basic Info
  • Host: GitHub
  • Owner: aspuru-guzik-group
  • License: apache-2.0
  • Language: Python
  • Default Branch: master
  • Homepage:
  • Size: 17.4 MB
Statistics
  • Stars: 91
  • Watchers: 14
  • Forks: 20
  • Open Issues: 6
  • Releases: 0
Topics
bayesian-optimization experiment-planning phoenics self-driving-laboratories
Created about 8 years ago · Last pushed over 6 years ago
Metadata Files
Readme License

README.md

Phoenics

Phoenics is an open source optimization algorithm combining ideas from Bayesian optimization with Bayesian kernel density estimation [1]. It performs global optimization of expensive-to-evaluate objectives, such as physical experiments or demanding computations. Phoenics supports sequential and batch optimizations and allows for the simultaneous optimization of multiple objectives via the Chimera scalarizing function [2].

Check out the examples folder for detailed descriptions and code examples for:

| Example | Link |
|:--------|:-----|
| Sequential optimization | examples/optimization_sequential |
| Parallelizable batch optimization | examples/optimization_parallel |
| Periodic parameter support | examples/optimization_periodic_parameters |
| Multi-objective optimization | examples/optimization_multiple_objectives |

More elaborate applications of Phoenics and Chimera are listed below:

| Application | Link |
|:------------|:-----|
| Auto-calibration of a virtual robot | examples/application_robot_calibration |

Chimera

Chimera is a general purpose achievement scalarizing function for multi-objective optimization. User preferences regarding the objectives are expected in terms of an importance hierarchy, as well as relative tolerances on each objective indicating what level of degradation is acceptable. Chimera is integrated into Phoenics, but also available for download as a wrapper for other optimization methods (see chimera).
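The hierarchy idea can be sketched in a few lines of plain Python. The toy scalarizer below is a deliberately simplified illustration, not the published Chimera algorithm: it normalizes each objective over the observed values and defers to the next objective in the hierarchy whenever the current one is within its tolerance.

```python
def scalarize(objectives, tolerances):
    """Toy hierarchy-based scalarization (illustration only, NOT Chimera).

    objectives: list of per-observation objective lists (minimized),
                ordered by descending importance.
    tolerances: acceptable relative degradation per objective, in [0, 1].
    Returns one scalar per observation; lower is better.
    """
    n_obj = len(tolerances)
    cols = list(zip(*objectives))
    lows = [min(c) for c in cols]
    spans = [max(c) - min(c) or 1.0 for c in cols]  # avoid division by zero
    scalars = []
    for obs in objectives:
        score = 0.0
        for k in range(n_obj):
            norm = (obs[k] - lows[k]) / spans[k]  # normalized to [0, 1]
            if norm <= tolerances[k] and k < n_obj - 1:
                continue  # within tolerance: defer to the next objective
            # penalize by how high in the hierarchy the tolerance was violated
            score = (n_obj - 1 - k) + norm
            break
        scalars.append(score)
    return scalars

print(scalarize([[1.0, 5.0], [0.0, 10.0], [10.0, 0.0]], [0.25, 1.0]))
```

Observations that satisfy the tolerances of the more important objectives end up with lower scalar values, so a single-objective minimizer will prefer them.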

Installation

You can install Phoenics via pip

apt-get install python-pip
pip install phoenics

or by creating a conda environment from the provided environment file

conda env create -f environment.yml
source activate phoenics

Alternatively, you can also choose to build Phoenics from source by cloning this repository

git clone https://github.com/aspuru-guzik-group/phoenics.git

Requirements

This code has been tested with Python 3.6 and uses

* cython 0.27.3
* json 2.0.9
* numpy 1.13.1
* scipy 0.19.1

Phoenics can construct its probabilistic model with two different probabilistic modeling libraries: PyMC3 and Edward. Depending on your preferences, you will need either

* pymc3 3.2
* theano 1.0.1

or

* edward 1.3.5
* tensorflow 1.4.1

Check out the environment.yml file for more details.

Using Phoenics

Phoenics is designed to suggest new parameter points based on prior observations. The suggested parameters can then be passed on to objective evaluations (experiments or involved computation). As soon as the objective values have been determined for a set of parameters, these new observations can again be passed on to Phoenics to request new, more informative parameters.
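The resulting workflow is a simple suggest/evaluate/observe loop. The sketch below illustrates that cycle with a toy random planner and a synthetic objective standing in for Phoenics and a real experiment; the `evaluate` function and the dictionary-based observation format are illustrative assumptions, not the library's API.

```python
import random

def evaluate(params):
    # stand-in for an expensive experiment: a 1-D quadratic with minimum at x = 0.3
    return (params["x"] - 0.3) ** 2

# toy random planner mimicking the suggest -> evaluate -> observe cycle;
# a real run would call phoenics.choose(observations=observations) instead
random.seed(42)
observations = []
best = None
for _ in range(50):
    params = {"x": random.uniform(0.0, 1.0)}  # "choose" step
    params["obj"] = evaluate(params)          # objective evaluation
    observations.append(params)               # feed the result back as an observation
    if best is None or params["obj"] < best["obj"]:
        best = params

print(best)
```

A Bayesian planner such as Phoenics replaces the random "choose" step with suggestions informed by all previous observations, which is what makes the loop sample-efficient.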

```python
from phoenics import Phoenics

# create an instance from a configuration file
config_file = 'config.json'
phoenics = Phoenics(config_file)

# request new parameters from a set of observations
params = phoenics.choose(observations=observations)
```

Detailed examples for specific applications are presented in the `examples` folder.

Using Chimera

Chimera is integrated into Phoenics, but also available as a stand-alone wrapper for other single-objective optimization algorithms. The Chimera wrapper casts a set of objectives for a number of observations into a single objective value per observation, enabling single-objective optimization algorithms to solve the multi-objective optimization problem. The usage of Chimera is outlined below in an example with four objective functions:

```python
from chimera import Chimera

# define tolerances in descending order of importance
tolerances = [0.25, 0.1, 0.25, 0.05]

# create Chimera instance
chimera = Chimera(tolerances)

# cast objectives of shape [num_observations, num_objectives]
# into a single objective vector of shape [num_observations, 1]
single_objectives = chimera.scalarize_objectives(objectives)
```

Note: Phoenics automatically employs Chimera when the configuration contains more than one objective.

Disclaimer

Note: This repository is under construction! We hope to add further details on the method, instructions and more examples in the near future.

Experiencing problems?

Please create a new issue and describe your problem in detail so we can fix it.

References

[1] Häse, F., Roch, L. M., Kreisbeck, C., & Aspuru-Guzik, A. Phoenics: A Bayesian Optimizer for Chemistry. ACS Central Science 4.6 (2018): 1134-1145.

[2] Häse, F., Roch, L. M., & Aspuru-Guzik, A. Chimera: enabling hierarchy-based multi-objective optimization for self-driving laboratories. Chemical Science (2018).

Owner

  • Name: Aspuru-Guzik group repo
  • Login: aspuru-guzik-group
  • Kind: organization

GitHub Events

Total
  • Watch event: 4
  • Fork event: 1
Last Year
  • Watch event: 4
  • Fork event: 1

Committers

Last synced: over 2 years ago

All Time
  • Total Commits: 82
  • Total Committers: 1
  • Avg Commits per committer: 82.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
| Name | Email | Commits |
|:-----|:------|:--------|
| flo | f****e@g****u | 82 |
Committer Domains (Top 20 + Academic)

Dependencies

environment.yml pypi
  • cma ==2.5.3
  • cython ==0.27.3
  • edward ==1.3.5
  • tensorflow ==1.4.1