Bayesian Optimization w/ Interpolated Samples¶
In some situations, evaluating the objectives and constraints consumes fewer resources than the Bayesian Optimization (BO) decision-making itself. In particular, when only subtle changes are made to the parameters during optimization, assessing the objectives is comparatively cheap.
Consider a practical example: optimizing magnet parameters in an accelerator to minimize the beam spot size on a screen or to maximize the Free Electron Laser (FEL) pulse energy. Because accelerator parameters are adjusted frequently, it is beneficial to augment the dataset by splitting each BO-guided parameter change into multiple smaller steps, with quick measurements of the objective in between.
Although the extra training points slow down each BO step slightly, this approach expedites convergence for most problems and is more efficient than measuring the same points multiple times in noisy environments. The rationale is that exploring a broader swath of parameter space through many smaller changes improves the overall understanding of the system's behavior, leading to a more efficient and effective optimization process.
NOTE: This only works for serialized problems.
WARNING: The interpolated points may violate constraints! Do not use this feature in problems where safety is critical.
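Conceptually, each BO-proposed move is split into several evenly spaced sub-steps, and the objective is measured at each one. The following sketch illustrates the idea with simple linear interpolation; the function name and implementation are illustrative, not Xopt internals:

import numpy as np

def interpolate_points(last_point, next_point, n_interpolate_points):
    # evenly spaced fractions in (0, 1], so the final sub-step lands exactly
    # on the BO-proposed point
    fractions = np.linspace(0.0, 1.0, n_interpolate_points + 1)[1:]
    return [last_point + f * (next_point - last_point) for f in fractions]

# example: 5 sub-steps between two 2D parameter settings
steps = interpolate_points(np.array([-0.5, 0.3]), np.array([-0.4, -0.1]), 5)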
Define the test problem
Here we define a simple optimization problem, where we attempt to minimize the sphere function.
from xopt.vocs import VOCS
from xopt.evaluator import Evaluator
from xopt.generators.bayesian import ExpectedImprovementGenerator
from xopt import Xopt
# define variables and function objectives
vocs = VOCS(
    variables={"x1": [-1, 1], "x2": [-1, 1]},
    objectives={"f": "MINIMIZE"},
)
# define a test function to optimize
def sphere_function(input_dict):
    return {"f": input_dict["x1"] ** 2 + input_dict["x2"] ** 2}
Create Xopt objects
Create the evaluator to evaluate our test function, and create a generator that uses the Expected Improvement acquisition function to perform Bayesian Optimization. We additionally set n_interpolate_points to a non-zero value so that the generator proposes interpolated points during generation.
# define a generator that adds 5 interpolated points between successive samples
generator = ExpectedImprovementGenerator(vocs=vocs, n_interpolate_points=5)
# use a low-noise prior since the test function is deterministic
generator.gp_constructor.use_low_noise_prior = True
evaluator = Evaluator(function=sphere_function)
X = Xopt(evaluator=evaluator, generator=generator, vocs=vocs)
Generate and evaluate initial points
To begin optimization, we must generate some random initial data points. The first call to X.step() will generate and evaluate a number of random points specified by the generator. Note that if we add data to Xopt before calling X.step() by assigning it to X.data, calls to X.step() will skip the random generation and proceed directly to generating points via Bayesian optimization, as sketched below.
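As an illustration of that alternative (not executed in this example, where we use random sampling instead), pre-existing measurements can be assigned directly; the column names must match the VOCS variable and objective names:

import pandas as pd

# hypothetical pre-existing measurements of the sphere function
seed_data = pd.DataFrame(
    {"x1": [0.5, -0.2], "x2": [0.1, 0.4], "f": [0.26, 0.20]}
)
X.data = seed_data  # subsequent X.step() calls skip random initialization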
# call X.random_evaluate() to generate + evaluate initial points
X.random_evaluate(2)
# inspect the gathered data
X.data
| | x1 | x2 | f | xopt_runtime | xopt_error |
| --- | --- | --- | --- | --- | --- |
0 | -0.444992 | 0.052595 | 0.200784 | 0.000003 | False |
1 | -0.501742 | 0.270378 | 0.324849 | 0.000001 | False |
Do Bayesian optimization steps
To perform optimization we simply call X.step() in a loop. This allows us to perform intermediate tasks between optimization steps, such as examining the model and acquisition function at each step (as we demonstrate here).
X.generator.train_model()
X.generator.visualize_model(n_grid=50)
n_steps = 5
for i in range(n_steps):
    print(i)
    # do the optimization step
    X.step()

    # train the model and visualize
    X.generator.train_model()
    fig, ax = X.generator.visualize_model(n_grid=50)

    # add the ground truth minimum location
    for a in ax.flatten()[:-1]:
        a.plot(0, 0, "x", c="red", ms=10)
# access the collected data; note the evenly spaced interpolated samples
# between BO-proposed points
X.data
| | x1 | x2 | f | xopt_runtime | xopt_error |
| --- | --- | --- | --- | --- | --- |
0 | -0.444992 | 0.052595 | 0.200784 | 2.905000e-06 | False |
1 | -0.501742 | 0.270378 | 0.324849 | 1.443000e-06 | False |
2 | -0.483812 | 0.201613 | 0.274722 | 3.136000e-06 | False |
3 | -0.465882 | 0.132849 | 0.234695 | 9.910000e-07 | False |
4 | -0.447952 | 0.064084 | 0.204768 | 7.810000e-07 | False |
5 | -0.430022 | -0.004680 | 0.184941 | 8.220001e-07 | False |
6 | -0.412092 | -0.073445 | 0.175214 | 7.010000e-07 | False |
7 | -0.473526 | -0.079325 | 0.230519 | 3.236000e-06 | False |
8 | -0.534960 | -0.085204 | 0.293442 | 1.122000e-06 | False |
9 | -0.596394 | -0.091084 | 0.363982 | 8.610001e-07 | False |
10 | -0.657828 | -0.096964 | 0.442139 | 8.520000e-07 | False |
11 | -0.719261 | -0.102843 | 0.527914 | 8.119999e-07 | False |
12 | -0.638458 | -0.083818 | 0.414654 | 3.427000e-06 | False |
13 | -0.557654 | -0.064792 | 0.315176 | 1.072000e-06 | False |
14 | -0.476850 | -0.045766 | 0.229480 | 7.120000e-07 | False |
15 | -0.396046 | -0.026740 | 0.157568 | 7.510000e-07 | False |
16 | -0.315242 | -0.007715 | 0.099437 | 8.109999e-07 | False |
17 | -0.293416 | -0.006257 | 0.086132 | 3.757000e-06 | False |
18 | -0.271590 | -0.004799 | 0.073784 | 9.710000e-07 | False |
19 | -0.249764 | -0.003341 | 0.062393 | 8.219999e-07 | False |
20 | -0.227938 | -0.001884 | 0.051959 | 8.210000e-07 | False |
21 | -0.206112 | -0.000426 | 0.042482 | 6.920000e-07 | False |
22 | -0.183647 | -0.003101 | 0.033736 | 3.256000e-06 | False |
23 | -0.161182 | -0.005775 | 0.026013 | 8.120001e-07 | False |
24 | -0.138718 | -0.008450 | 0.019314 | 7.620000e-07 | False |
25 | -0.116253 | -0.011124 | 0.013638 | 1.072000e-06 | False |
26 | -0.093788 | -0.013799 | 0.008987 | 8.109999e-07 | False |
Getting the optimization result
To get the best point (without evaluating it) we ask the generator to predict the optimum based on the posterior mean.
X.generator.get_optimum()
| | x1 | x2 |
| --- | --- | --- |
0 | -0.033571 | -0.020375 |
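If desired, this predicted optimum can then be measured and appended to the dataset; a short sketch, assuming Xopt's evaluate_data method accepts the DataFrame returned by get_optimum:

# measure the model-predicted optimum and append the result to X.data
X.evaluate_data(X.generator.get_optimum())
X.data.tail(1)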