https://github.com/jbrea/bayesianoptimization.jl

Bayesian optimization for Julia

Science Score: 23.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org, ieee.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (7.8%) to scientific vocabulary

Keywords

bayesian-methods bayesian-optimization gaussian-processes julia machine-learning optimization

Keywords from Contributors

surrogate differential-equations julialang trust-region-methods nlpmodels nonlinear-programming nonlinear-equations raytracer pde surrogate-models
Last synced: 4 months ago

Repository

Bayesian optimization for Julia

Basic Info
  • Host: GitHub
  • Owner: jbrea
  • License: other
  • Language: Julia
  • Default Branch: master
  • Homepage:
  • Size: 175 KB
Statistics
  • Stars: 97
  • Watchers: 5
  • Forks: 15
  • Open Issues: 7
  • Releases: 8
Topics
bayesian-methods bayesian-optimization gaussian-processes julia machine-learning optimization
Created over 7 years ago · Last pushed over 1 year ago
Metadata Files
Readme License

README.md

BayesianOptimization

Usage

```julia
using BayesianOptimization, GaussianProcesses, Distributions

f(x) = sum((x .- 1).^2) + randn()                # noisy function to minimize

# Choose as a model an elastic GP with input dimensions 2.
# The GP is called elastic, because data can be appended efficiently.
model = ElasticGPE(2,                            # 2 input dimensions
                   mean = MeanConst(0.),
                   kernel = SEArd([0., 0.], 5.),
                   logNoise = 0.,
                   capacity = 3000)              # the initial capacity of the GP is 3000 samples
set_priors!(model.mean, [Normal(1, 2)])

# Optimize the hyperparameters of the GP using maximum a posteriori (MAP) estimates every 50 steps
modeloptimizer = MAPGPOptimizer(every = 50,
                                noisebounds = [-4, 3],                   # bounds of the logNoise
                                kernbounds = [[-1, -1, 0], [4, 4, 10]],  # bounds of the 3 parameters GaussianProcesses.get_param_names(model.kernel)
                                maxeval = 40)

opt = BOpt(f,
           model,
           UpperConfidenceBound(),                   # type of acquisition
           modeloptimizer,
           [-5., -5.], [5., 5.],                     # lowerbounds, upperbounds
           repetitions = 5,                          # evaluate the function for each input 5 times
           maxiterations = 100,                      # evaluate at 100 input positions
           sense = Min,                              # minimize the function
           acquisitionoptions = (method = :LD_LBFGS, # run optimization of acquisition function with NLopt's :LD_LBFGS method
                                 restarts = 5,       # run the NLopt method from 5 random initial conditions each time
                                 maxtime = 0.1,      # run the NLopt method for at most 0.1 second each time
                                 maxeval = 1000),    # run the NLopt method for at most 1000 iterations (for other options see https://github.com/JuliaOpt/NLopt.jl)
           verbosity = Progress)

result = boptimize!(opt)
```

Resume optimization

To continue the optimization, one can call boptimize!(opt) multiple times.

```julia
result = boptimize!(opt)  # first time (includes initialization)
result = boptimize!(opt)  # restart
maxiterations!(opt, 50)   # set maxiterations for the next call
result = boptimize!(opt)  # restart again
```

(Warm-)start with some known function values

By default, the first 5*length(lowerbounds) input points are sampled from a Sobol sequence. If one already has some function values available and wants to skip the initialization with the Sobol sequence, one can update the model with the available data and set initializer_iterations = 0. For example (continuing the above example after setting the modeloptimizer):

```julia
x = [rand(2) for _ in 1:20]
y = -f.(x)
append!(model, hcat(x...), y)

opt = BOpt(f,
           model,
           UpperConfidenceBound(),
           modeloptimizer,
           [-5., -5.], [5., 5.],
           maxiterations = 100,
           sense = Min,
           initializer_iterations = 0)

result = boptimize!(opt)
```
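The Sobol-sequence initialization spreads the initial evaluation points evenly over the search box. To illustrate the idea of space-filling initialization (this is a sketch of the general technique, not the package's ScaledSobolIterator or ScaledLHSIterator implementation), here is a minimal pure-Julia Latin hypercube sampler over a box:

```julia
using Random

# Minimal Latin hypercube sampler (illustrative sketch): n points in
# length(lower) dimensions, scaled to the box [lower, upper].
# Each dimension is split into n equal strata; every stratum receives
# exactly one point, placed at a uniformly random offset within it.
function lhs(n, lower, upper; rng = Random.default_rng())
    d = length(lower)
    # One independent random permutation of the strata per dimension.
    strata = [randperm(rng, n) for _ in 1:d]
    points = Vector{Vector{Float64}}(undef, n)
    for i in 1:n
        points[i] = [lower[j] + (upper[j] - lower[j]) *
                     (strata[j][i] - rand(rng)) / n for j in 1:d]
    end
    return points
end

# 20 well-spread points in the box [-5, 5] x [-5, 5].
points = lhs(20, [-5.0, -5.0], [5.0, 5.0])
```

Unlike plain uniform sampling, this guarantees that each coordinate axis is covered by exactly one point per stratum, which is why such initializers need fewer points to cover the box.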

This package exports
  • BOpt, boptimize!, optimize
  • acquisition types: ExpectedImprovement, ProbabilityOfImprovement, UpperConfidenceBound, ThompsonSamplingSimple, MutualInformation
  • scaling of standard deviation in UpperConfidenceBound: BrochuBetaScaling, NoBetaScaling
  • GP hyperparameter optimizers: MAPGPOptimizer, NoModelOptimizer
  • initializers: ScaledSobolIterator, ScaledLHSIterator
  • optimization senses: Min, Max
  • verbosity levels: Silent, Timings, Progress
  • helpers: maxduration!, maxiterations!
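The UpperConfidenceBound acquisition used in the examples above trades off the surrogate's predicted mean against its predictive standard deviation. As a minimal sketch of that scoring rule (with hypothetical mu/sigma inputs; this is not the package's implementation, which additionally supports beta scaling such as BrochuBetaScaling):

```julia
# Illustrative UCB score for a maximization problem:
# a candidate x is scored by mu(x) + beta * sigma(x), where mu and sigma
# are the surrogate model's predictive mean and standard deviation.
# Larger beta favors exploration of uncertain regions.
ucb(mu, sigma; beta = 2.0) = mu + beta * sigma

# A point with mean 0.5 and std 0.3 scores 0.5 + 2 * 0.3 = 1.1.
score = ucb(0.5, 0.3)
```

The optimizer then evaluates the objective at the candidate maximizing this score, which is the inner optimization configured via acquisitionoptions in the usage example.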

Use the REPL help, e.g. ?BOpt, to get more information.

Review papers on Bayesian optimization

Similar Projects

BayesOpt is a wrapper of the established BayesOpt toolbox written in C++.

Dragonfly is a feature-rich package for scalable Bayesian optimization written in Python. Use it in Julia with PyCall.

Owner

  • Login: jbrea
  • Kind: user

GitHub Events

Total
  • Watch event: 6
Last Year
  • Watch event: 6

Committers

Last synced: 5 months ago

All Time
  • Total Commits: 67
  • Total Committers: 9
  • Avg Commits per committer: 7.444
  • Development Distribution Score (DDS): 0.343
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Johanni Brea j****a@u****m 44
Samuel Belko s****o@p****m 10
Pawel Latawiec p****c@h****m 4
Pawel p****l@m****m 3
github-actions[bot] 4****]@u****m 2
CompatHelper Julia c****y@j****g 1
Julia TagBot 5****t@u****m 1
RohitRathore1 r****5@g****m 1
Tamas K. Papp t****p@g****m 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 23
  • Total pull requests: 15
  • Average time to close issues: 24 days
  • Average time to close pull requests: 14 days
  • Total issue authors: 15
  • Total pull request authors: 9
  • Average comments per issue: 2.3
  • Average comments per pull request: 1.87
  • Merged pull requests: 10
  • Bot issues: 0
  • Bot pull requests: 5
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • platawiec (3)
  • samuelbelko (3)
  • ngiann (2)
  • ngphuoc (2)
  • fredcallaway (2)
  • patwa67 (2)
  • vmoens (1)
  • Alexander-Barth (1)
  • jbrea (1)
  • rfourquet (1)
  • robsmith11 (1)
  • JuliaTagBot (1)
  • smickusGT (1)
  • mohamed82008 (1)
  • tpapp (1)
Pull Request Authors
  • github-actions[bot] (5)
  • platawiec (2)
  • samuelbelko (2)
  • DavidAfonsoValente (1)
  • RohitRathore1 (1)
  • tpapp (1)
  • JuliaTagBot (1)
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • julia 151 total
  • Total dependent packages: 2
  • Total dependent repositories: 0
  • Total versions: 8
juliahub.com: BayesianOptimization

Bayesian optimization for Julia

  • Versions: 8
  • Dependent Packages: 2
  • Dependent Repositories: 0
  • Downloads: 151 Total
Rankings
Stargazers count: 9.2%
Dependent repos count: 9.9%
Average: 13.0%
Forks count: 16.2%
Dependent packages count: 16.6%
Last synced: 5 months ago

Dependencies

.github/workflows/CompatHelper.yml actions
  • julia-actions/setup-julia v1 composite
.github/workflows/TagBot.yml actions
  • JuliaRegistries/TagBot v1 composite
.github/workflows/ci.yml actions
  • actions/checkout v2 composite
  • codecov/codecov-action v1 composite
  • julia-actions/cache v1 composite
  • julia-actions/julia-buildpkg v1 composite
  • julia-actions/julia-buildpkg latest composite
  • julia-actions/julia-docdeploy latest composite
  • julia-actions/julia-processcoverage v1 composite
  • julia-actions/julia-runtest v1 composite
  • julia-actions/setup-julia v1 composite