```julia
using BayesianOptimization, GaussianProcesses, Distributions

f(x) = sum((x .- 1).^2) + randn()                # noisy function to minimize

# Choose as a model an elastic GP with 2 input dimensions.
# The GP is called elastic, because data can be appended efficiently.
model = ElasticGPE(2,                            # 2 input dimensions
                   mean = MeanConst(0.),
                   kernel = SEArd([0., 0.], 5.),
                   logNoise = 0.,
                   capacity = 3000)              # the initial capacity of the GP is 3000 samples.
set_priors!(model.mean, [Normal(1, 2)])

# Optimize the hyperparameters of the GP using maximum a posteriori (MAP)
# estimates every 50 steps.
modeloptimizer = MAPGPOptimizer(every = 50,
                                noisebounds = [-4, 3],                   # bounds of the logNoise
                                kernbounds = [[-1, -1, 0], [4, 4, 10]],  # bounds of the 3 kernel parameters (cf. GaussianProcesses.get_param_names(model.kernel))
                                maxeval = 40)

opt = BOpt(f,
           model,
           UpperConfidenceBound(),                   # type of acquisition
           modeloptimizer,
           [-5., -5.], [5., 5.],                     # lowerbounds, upperbounds
           repetitions = 5,                          # evaluate the function for each input 5 times
           maxiterations = 100,                      # evaluate at 100 input positions
           sense = Min,                              # minimize the function
           acquisitionoptions = (method = :LD_LBFGS, # optimize the acquisition function with NLopt's :LD_LBFGS method
                                 restarts = 5,       # run the NLopt method from 5 random initial conditions each time
                                 maxtime = 0.1,      # run the NLopt method for at most 0.1 second each time
                                 maxeval = 1000),    # run the NLopt method for at most 1000 iterations (for other options see https://github.com/JuliaOpt/NLopt.jl)
           verbosity = Progress)

result = boptimize!(opt)
```
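`boptimize!` returns a summary of the run. As a minimal sketch of how one might inspect it: the named-tuple fields below (`observed_optimum`, `observed_optimizer`, `model_optimum`, `model_optimizer`) are assumptions based on recent versions of the package and may differ in yours; check `?boptimize!` if unsure.

```julia
# Sketch for inspecting the result; the field names are assumptions
# (recent package versions) — consult `?boptimize!` for your version.
result.observed_optimum    # best (noisy) function value observed so far
result.observed_optimizer  # input at which that value was observed
result.model_optimum       # optimum of the GP posterior mean
result.model_optimizer     # input at which the posterior mean is optimal
```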
To continue the optimization, one can call `boptimize!(opt)` multiple times:
```julia
result = boptimize!(opt)  # first time (includes initialization)
result = boptimize!(opt)  # continue the optimization
maxiterations!(opt, 50)   # set maxiterations for the next call
result = boptimize!(opt)  # continue again
```
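The package also exports a `maxduration!` helper for bounding wall-clock time instead of iteration count. The following sketch assumes it takes the budget in seconds, analogous to `maxiterations!`:

```julia
# Assumption: maxduration! sets a wall-clock budget in seconds for the next call.
maxduration!(opt, 60)
result = boptimize!(opt)
```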
## (Warm-)start with some known function values
By default, the first `5 * length(lowerbounds)` input points are sampled from a Sobol sequence. If one already has some function values available and wants to skip the Sobol initialization, one can instead update the model with the available data and set `initializer_iterations = 0`. For example, reusing `f`, `model`, and `modeloptimizer` from the example above:
```julia
x = [rand(2) for _ in 1:20]    # 20 random inputs in [0, 1]^2
y = -f.(x)                     # negate: with `sense = Min` the GP internally models -f
append!(model, hcat(x...), y)  # append the inputs as columns of a 2×20 matrix
opt = BOpt(f,
           model,
           UpperConfidenceBound(),
           modeloptimizer,
           [-5., -5.], [5., 5.],
           maxiterations = 100,
           sense = Min,
           initializer_iterations = 0)
result = boptimize!(opt)
```
This package exports
- acquisition types: `ExpectedImprovement`, `ProbabilityOfImprovement`, `UpperConfidenceBound`, `ThompsonSamplingSimple`, and `MutualInformation` (see the sketch after this list),
- scaling of the standard deviation in `UpperConfidenceBound`: `BrochuBetaScaling` and `NoBetaScaling`,
- GP hyperparameter optimizers: `MAPGPOptimizer` and `NoModelOptimizer`,
- optimization senses: `Min` and `Max`,
- verbosity levels: `Silent`, `Timings`, and `Progress`.
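As a sketch of how these exports combine, the first example could be rerun with a different acquisition type and verbosity level. This reuses `f`, `model`, and `modeloptimizer` from above; calling `ExpectedImprovement()` with default parameters is an assumption, so check `?ExpectedImprovement` for tunable options.

```julia
# Sketch: same setup as the first example, but with expected improvement as
# the acquisition and no progress output. Reuses `f`, `model`, `modeloptimizer`.
opt = BOpt(f,
           model,
           ExpectedImprovement(),   # assumed default constructor
           modeloptimizer,
           [-5., -5.], [5., 5.],
           maxiterations = 50,
           sense = Min,
           verbosity = Silent)
result = boptimize!(opt)
```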
Use the REPL help, e.g. `?BOpt`, to get more information.
## Review papers on Bayesian optimization
- [A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning](https://arxiv.org/abs/1012.2599) (Brochu, Cora, and de Freitas, 2010)
- Taking the Human Out of the Loop: A Review of Bayesian Optimization (Shahriari, Swersky, Wang, Adams, and de Freitas, Proceedings of the IEEE, 2016)