Constrained Bayesian Optimization¶
In this tutorial we demonstrate the use of Xopt to perform Bayesian Optimization on a simple test problem subject to a single constraint.
Define the test problem¶
Here we define a simple optimization problem, where we attempt to minimize the sine function on the domain [0, 2π], subject to a constraint based on the cosine function.
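Written out explicitly (a restatement of the setup above, not taken from the original notebook), the problem is

minimize    f(x) = sin(x)
subject to  c(x) = cos(x) < 0,    x in [0, 2π]

The feasible region is the open interval (π/2, 3π/2), and the constrained optimum is approached at x → 3π/2 ≈ 4.712, where f → -1 and c → 0.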
from xopt.evaluator import Evaluator
from xopt.generators.bayesian import ExpectedImprovementGenerator
from xopt import Xopt
from xopt.vocs import VOCS
import time
import math
import numpy as np
# Ignore all warnings
import warnings
warnings.filterwarnings("ignore")
# define variables, function objective and constraining function
vocs = VOCS(
    variables={"x": [0, 2 * math.pi]},
    objectives={"f": "MINIMIZE"},
    constraints={"c": ["LESS_THAN", 0]},
)
# define a test function to optimize
def test_function(input_dict):
    return {"f": np.sin(input_dict["x"]), "c": np.cos(input_dict["x"])}
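As a quick sanity check (not part of the original notebook), the function works element-wise on NumPy arrays as well as scalars, which is convenient later when plotting the ground truth:

# evaluate at a few known points: sin(0) = 0, sin(pi/2) = 1, cos(pi) = -1, etc.
print(test_function({"x": np.array([0.0, np.pi / 2, np.pi])}))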
Create Xopt objects¶
Create an evaluator to run our test function, and a generator that uses the Expected Improvement acquisition function to perform Bayesian optimization. Note that because we are optimizing a noise-free problem we set use_low_noise_prior=True
in the GP model constructor.
evaluator = Evaluator(function=test_function)
generator = ExpectedImprovementGenerator(vocs=vocs)
generator.gp_constructor.use_low_noise_prior = True
X = Xopt(evaluator=evaluator, generator=generator, vocs=vocs)
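Optionally (not part of the original notebook), the assembled optimizer can be inspected before running; printing the Xopt object should display a summary of its configuration.

# inspect the assembled Xopt configuration (optional)
print(X)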
Generate and evaluate initial points¶
To begin optimization, we must generate some random initial data points. The first call to X.step()
will generate and evaluate a number of random points specified by the generator. Note that if we add data to Xopt before calling X.step()
, by assigning the data to X.data
, calls to X.step()
will skip the random generation and proceed directly to generating points via Bayesian optimization (see the sketch below).
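The following is a minimal, hypothetical sketch of seeding Xopt with previously collected measurements instead of random samples; the column names must match the VOCS definition, and the values here are illustrative only.

import pandas as pd

# hypothetical prior measurements of the test function at x = 1.0 and x = 4.0
prior_data = pd.DataFrame(
    {
        "x": [1.0, 4.0],
        "f": [np.sin(1.0), np.sin(4.0)],
        "c": [np.cos(1.0), np.cos(4.0)],
    }
)

# attach the data before stepping, as described above; some Xopt versions also
# provide X.add_data(prior_data) as an alternative to direct assignment
X.data = prior_data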
# call X.random_evaluate(n_samples) to generate + evaluate initial points
X.random_evaluate(n_samples=2)
# inspect the gathered data
X.data
|   | x | f | c | xopt_runtime | xopt_error |
|---|---|---|---|---|---|
| 0 | 4.660657 | -0.998662 | -0.051709 | 0.000013 | False |
| 1 | 5.802419 | -0.462458 | 0.886641 | 0.000004 | False |
Do Bayesian optimization steps¶
To perform optimization we simply call X.step()
in a loop. This allows us to perform intermediate tasks between optimization steps, such as examining the model and acquisition function at each step (as we demonstrate here).
n_steps = 5

# test points for plotting
test_x = np.linspace(*X.vocs.bounds.flatten(), 50)

for i in range(n_steps):
    start = time.perf_counter()

    model = X.generator.train_model()
    fig, ax = X.generator.visualize_model(n_grid=100)
    print(time.perf_counter() - start)

    # add ground truth functions to plots
    out = test_function({"x": test_x})
    ax[0, 0].plot(test_x, out["f"], "C0-.")
    ax[1, 0].plot(test_x, out["c"], "C2-.")

    # do the optimization step
    X.step()
0.6926092300000164
0.2649869189999663
0.2840630099999544
0.26894787499986705
0.28564496500030145
# access the collected data
X.data
|   | x | f | c | xopt_runtime | xopt_error |
|---|---|---|---|---|---|
| 0 | 4.660657 | -0.998662 | -0.051709 | 0.000013 | False |
| 1 | 5.802419 | -0.462458 | 0.886641 | 0.000004 | False |
| 2 | 4.102532 | -0.819730 | -0.572750 | 0.000010 | False |
| 3 | 0.000000 | 0.000000 | 1.000000 | 0.000010 | False |
| 4 | 2.140939 | 0.841824 | -0.539752 | 0.000011 | False |
| 5 | 4.657766 | -0.998509 | -0.054596 | 0.000010 | False |
| 6 | 4.650598 | -0.998092 | -0.061752 | 0.000028 | False |
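As a final check (not part of the original notebook), a small pandas sketch for pulling the best feasible sample out of the collected data:

# keep only rows satisfying the constraint c < 0, then take the smallest objective value
feasible = X.data[X.data["c"] < 0]
print(feasible.loc[feasible["f"].idxmin()])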