Bayesian generators

xopt.generators.bayesian.bayesian_generator.BayesianGenerator

BayesianGenerator(**kwargs)

Bases: Generator, ABC

Bayesian Generator for Bayesian Optimization.

Attributes:

name : str The name of the Bayesian Generator.

model : Optional[Model] The BoTorch model used by the generator to perform optimization.

n_monte_carlo_samples : int The number of Monte Carlo samples to use in the optimization process.

turbo_controller : SerializeAsAny[Optional[TurboController]] The Turbo Controller for trust-region Bayesian Optimization.

use_cuda : bool A flag to enable or disable CUDA usage if available.

gp_constructor : SerializeAsAny[ModelConstructor] The constructor used to generate the model for Bayesian Optimization.

numerical_optimizer : SerializeAsAny[NumericalOptimizer] The optimizer used to optimize the acquisition function in Bayesian Optimization.

max_travel_distances : Optional[List[float]] The limits for travel distances between points in normalized space.

fixed_features : Optional[Dict[str, float]] The fixed features used in Bayesian Optimization.

computation_time : Optional[pd.DataFrame] A data frame tracking computation time in seconds.

n_interpolate_points : Optional[PositiveInt] Number of interpolation points to generate between the last observation and the next observation; requires n_candidates to be 1.

n_candidates : int The number of candidates to generate in each optimization step.

Methods:

generate(self, n_candidates: int) -> List[Dict]: Generate candidates for Bayesian Optimization.

add_data(self, new_data: pd.DataFrame): Add new data to the generator for Bayesian Optimization.

train_model(self, data: pd.DataFrame = None, update_internal=True) -> Module: Train a Bayesian model for Bayesian Optimization.

propose_candidates(self, model: Module, n_candidates: int = 1) -> Tensor: Propose candidates for Bayesian Optimization.

get_input_data(self, data: pd.DataFrame) -> torch.Tensor: Get input data in torch.Tensor format.

get_acquisition(self, model: Module) -> AcquisitionFunction: Get the acquisition function for Bayesian Optimization.

Source code in xopt/generator.py
def __init__(self, **kwargs):
    """
    Initialize the generator.

    """
    super().__init__(**kwargs)
    logger.info(f"Initialized generator {self.name}")
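
As a quick orientation, the snippet below sketches the standard Xopt loop with one concrete subclass. This is a sketch, not part of the API reference: the objective function and variable names are hypothetical, while Xopt, Evaluator, VOCS, random_evaluate, and step are the standard Xopt entry points.

from xopt import Evaluator, VOCS, Xopt
from xopt.generators.bayesian.expected_improvement import ExpectedImprovementGenerator

# hypothetical single-variable problem
vocs = VOCS(variables={"x": [0.0, 1.0]}, objectives={"f": "MINIMIZE"})

def evaluate(inputs: dict) -> dict:
    return {"f": (inputs["x"] - 0.5) ** 2}

X = Xopt(
    generator=ExpectedImprovementGenerator(vocs=vocs),
    evaluator=Evaluator(function=evaluate),
    vocs=vocs,
)
X.random_evaluate(3)  # seed the generator with data before generating candidates
for _ in range(10):
    X.step()  # generate a candidate, evaluate it, add the result to the data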

Attributes

xopt.generators.bayesian.bayesian_generator.BayesianGenerator.model_input_names property
model_input_names

Variable names corresponding to the trained model.

Functions

xopt.generators.bayesian.bayesian_generator.BayesianGenerator.add_data
add_data(new_data)

Add new data to the generator for Bayesian Optimization.

Parameters:

new_data : pd.DataFrame The new data to be added to the generator.

Notes:

This method appends the new data to the existing data in the generator.

Source code in xopt/generators/bayesian/bayesian_generator.py
def add_data(self, new_data: pd.DataFrame):
    """
    Add new data to the generator for Bayesian Optimization.

    Parameters:
    -----------
    new_data : pd.DataFrame
        The new data to be added to the generator.

    Notes:
    ------
    This method appends the new data to the existing data in the generator.
    """
    self.data = pd.concat([self.data, new_data], axis=0)
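
For example, a generator can be seeded directly from a DataFrame whose columns match the VOCS variable and output names. A sketch, reusing the hypothetical vocs from the loop example above:

import pandas as pd

from xopt.generators.bayesian.expected_improvement import ExpectedImprovementGenerator

gen = ExpectedImprovementGenerator(vocs=vocs)  # vocs as in the earlier sketch
gen.add_data(pd.DataFrame({"x": [0.1, 0.5, 0.9], "f": [0.16, 0.0, 0.16]}))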
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.generate
generate(n_candidates)

Generate candidates using Bayesian Optimization.

Parameters:

n_candidates : int The number of candidates to generate in each optimization step.

Returns:

List[Dict] A list of dictionaries containing the generated candidates.

Raises:

NotImplementedError If the number of candidates is greater than 1, and the generator does not support batch candidate generation.

RuntimeError If no data is contained in the generator, the 'add_data' method should be called to add data before generating candidates.

Notes:

This method generates candidates for Bayesian Optimization based on the provided number of candidates. It updates the internal model with the current data and calculates the candidates by optimizing the acquisition function. The method returns the generated candidates in the form of a list of dictionaries.

Source code in xopt/generators/bayesian/bayesian_generator.py
def generate(self, n_candidates: int) -> list[dict]:
    """
    Generate candidates using Bayesian Optimization.

    Parameters:
    -----------
    n_candidates : int
        The number of candidates to generate in each optimization step.

    Returns:
    --------
    List[Dict]
        A list of dictionaries containing the generated candidates.

    Raises:
    -------
    NotImplementedError
        If the number of candidates is greater than 1, and the generator does not
        support batch candidate generation.

    RuntimeError
        If no data is contained in the generator, the 'add_data' method should be
        called to add data before generating candidates.

    Notes:
    ------
    This method generates candidates for Bayesian Optimization based on the
    provided number of candidates. It updates the internal model with the current
    data and calculates the candidates by optimizing the acquisition function.
    The method returns the generated candidates in the form of a list of dictionaries.
    """

    self.n_candidates = n_candidates
    if n_candidates > 1 and not self.supports_batch_generation:
        raise NotImplementedError(
            "This Bayesian algorithm does not currently support parallel candidate "
            "generation"
        )

    # if no data exists raise error
    if self.data is None:
        raise RuntimeError(
            "no data contained in generator, call `add_data` "
            "method to add data, see also `Xopt.random_evaluate()`"
        )

    else:
        # dict to track runtimes
        timing_results = {}

        # update internal model with internal data
        start_time = time.perf_counter()
        model = self.train_model(self.get_training_data(self.data))
        timing_results["training"] = time.perf_counter() - start_time

        # propose candidates given model
        start_time = time.perf_counter()
        candidates = self.propose_candidates(model, n_candidates=n_candidates)
        timing_results["acquisition_optimization"] = (
            time.perf_counter() - start_time
        )

        # post process candidates
        result = self._process_candidates(candidates)

        # append timing results to dataframe (if it exists)
        if self.computation_time is not None:
            self.computation_time = pd.concat(
                (
                    self.computation_time,
                    pd.DataFrame(timing_results, index=[0]),
                ),
                ignore_index=True,
            )
        else:
            self.computation_time = pd.DataFrame(timing_results, index=[0])

        if self.n_interpolate_points is not None:
            if self.n_candidates > 1:
                raise RuntimeError(
                    "cannot generate interpolated points for "
                    "multiple candidate generation"
                )
            else:
                assert len(result) == 1
                result = interpolate_points(
                    pd.concat(
                        (self.data.iloc[-1:][self.vocs.variable_names], result),
                        axis=0,
                        ignore_index=True,
                    ),
                    num_points=self.n_interpolate_points,
                )

        return result.to_dict("records")
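
Continuing the add_data sketch above, one generate/evaluate/add_data cycle looks like the following (the objective values are computed with the same hypothetical function):

candidates = gen.generate(1)  # a list with one dict, e.g. [{"x": 0.48}]

# evaluate externally, then feed the results back to the generator
results = [{"x": c["x"], "f": (c["x"] - 0.5) ** 2} for c in candidates]
gen.add_data(pd.DataFrame(results))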
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.get_acquisition
get_acquisition(model)

Define the acquisition function based on the given GP model.

Parameters:

model : Module The BoTorch model to be used for generating the acquisition function.

Returns:

acquisition_function : AcquisitionFunction

Raises:

ValueError If the provided 'model' is None. A valid model is required to create the acquisition function.

Source code in xopt/generators/bayesian/bayesian_generator.py
def get_acquisition(self, model: Module) -> AcquisitionFunction:
    """
    Define the acquisition function based on the given GP model.

    Parameters:
    -----------
    model : Module
        The BoTorch model to be used for generating the acquisition function.

    Returns:
    --------
    acquisition_function : AcquisitionFunction

    Raises:
    -------
    ValueError
        If the provided 'model' is None. A valid model is required to create the
        acquisition function.
    """
    if model is None:
        raise ValueError("model cannot be None")

    # get base acquisition function
    acq = self._get_acquisition(model)

    # apply constraints if specified in vocs
    # TODO: replace with direct constrained acquisition function calls
    # see SampleReducingMCAcquisitionFunction in botorch for rationale
    if len(self.vocs.constraints):
        try:
            sampler = acq.sampler
        except AttributeError:
            sampler = self._get_sampler(model)

        acq = ConstrainedMCAcquisitionFunction(
            model, acq, self._get_constraint_callables(), sampler=sampler
        )

        # log transform the result to handle the constraints
        acq = LogAcquisitionFunction(acq)

    # apply fixed features if specified in the generator
    if self.fixed_features is not None:
        # get input dim
        dim = len(self.model_input_names)
        columns = []
        values = []
        for name, value in self.fixed_features.items():
            columns += [self.model_input_names.index(name)]
            values += [value]

        acq = FixedFeatureAcquisitionFunction(
            acq_function=acq, d=dim, columns=columns, values=values
        )

    return acq
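
The returned object is a standard BoTorch AcquisitionFunction, so it can be evaluated on a batch of points of shape (b, q, d) for inspection. A sketch for the single-variable case used in the earlier examples (tkwargs carries the generator's dtype and device settings):

import torch

model = gen.train_model()
acq = gen.get_acquisition(model)

# evaluate the acquisition function on a grid of 100 q=1 candidate points
x = torch.linspace(0.0, 1.0, 100, **gen.tkwargs).reshape(-1, 1, 1)
with torch.no_grad():
    acq_values = acq(x)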
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.get_input_data
get_input_data(data)

Convert input data to a torch tensor.

Parameters:

data : pd.DataFrame The input data in the form of a pandas DataFrame.

Returns:

torch.Tensor A torch tensor containing the input data.

Notes:

This method takes a pandas DataFrame as input data and converts it into a torch tensor. It specifically selects columns corresponding to the model's input names (variables), and the resulting tensor is configured with the data type and device settings from the generator.

Source code in xopt/generators/bayesian/bayesian_generator.py
def get_input_data(self, data: pd.DataFrame) -> torch.Tensor:
    """
    Convert input data to a torch tensor.

    Parameters:
    -----------
    data : pd.DataFrame
        The input data in the form of a pandas DataFrame.

    Returns:
    --------
    torch.Tensor
        A torch tensor containing the input data.

    Notes:
    ------
    This method takes a pandas DataFrame as input data and converts it into a
    torch tensor. It specifically selects columns corresponding to the model's
    input names (variables), and the resulting tensor is configured with the data
    type and device settings from the generator.
    """
    return torch.tensor(data[self.model_input_names].to_numpy(), **self.tkwargs)
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.get_optimum
get_optimum()

Select the best point(s) given by the model using the posterior mean.

Source code in xopt/generators/bayesian/bayesian_generator.py
def get_optimum(self):
    """select the best point(s) given by the
    model using the Posterior mean"""
    c_posterior_mean = ConstrainedMCAcquisitionFunction(
        self.model,
        qUpperConfidenceBound(
            model=self.model, beta=0.0, objective=self._get_objective()
        ),
        self._get_constraint_callables(),
    )

    result = self.numerical_optimizer.optimize(
        c_posterior_mean, self._get_bounds(), 1
    )

    return self._process_candidates(result)
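
For example, after a run has finished, the model-predicted optimum (as opposed to the best observed sample) can be queried. A sketch, continuing the loop example above:

best = X.generator.get_optimum()  # variable values at the predicted optimum
print(best)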
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.get_training_data
get_training_data(data)

Get training data used to train the GP model.

If a turbo controller is specified with the flag restrict_model_data, this will return the subset of data that lies inside the trust region.

Parameters:

data : pd.DataFrame The data in the form of a pandas DataFrame.

Returns:

data : pd.DataFrame A subset of the data used to train the model, in the form of a pandas DataFrame.

Source code in xopt/generators/bayesian/bayesian_generator.py
def get_training_data(self, data: pd.DataFrame) -> pd.DataFrame:
    """
    Get training data used to train the GP model.

    If a turbo controller is specified with the flag `restrict_model_data` this
    will return a subset of data that is inside the trust region.

    Parameters:
    -----------
    data : pd.DataFrame
        The data in the form of a pandas DataFrame.

    Returns:
    --------
    data : pd.DataFrame
        A subset of the data used to train the model, in the form of a pandas DataFrame.

    """
    if self.turbo_controller is not None:
        if self.turbo_controller.restrict_model_data:
            data = self.turbo_controller.get_data_in_trust_region(data, self)

    return data
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.propose_candidates
propose_candidates(model, n_candidates=1)

Propose candidates using Bayesian Optimization.

Parameters:

model : Module The trained Bayesian model.

n_candidates : int, optional The number of candidates to propose (default is 1).

Returns:

Tensor A tensor containing the proposed candidates.

Notes:

This method proposes candidates for Bayesian Optimization by numerically optimizing the acquisition function using the trained model. It updates the state of the Turbo controller if used and calculates the optimization bounds.

Source code in xopt/generators/bayesian/bayesian_generator.py
def propose_candidates(self, model: Module, n_candidates: int = 1) -> Tensor:
    """
    Propose candidates using Bayesian Optimization.

    Parameters:
    -----------
    model : Module
        The trained Bayesian model.
    n_candidates : int, optional
        The number of candidates to propose (default is 1).

    Returns:
    --------
    Tensor
        A tensor containing the proposed candidates.

    Notes:
    ------
    This method proposes candidates for Bayesian Optimization by numerically
    optimizing the acquisition function using the trained model. It updates the
    state of the Turbo controller if used and calculates the optimization bounds.
    """
    # update TurBO state if used with the last `n_candidates` points
    if self.turbo_controller is not None:
        self.turbo_controller.update_state(self, n_candidates)

    # calculate optimization bounds
    bounds = self._get_optimization_bounds()

    # get acquisition function
    acq_funct = self.get_acquisition(model)

    # get initial candidates to start acquisition function optimization
    initial_points = self._get_initial_conditions(n_candidates)

    # get candidates -- grid optimizer does not support batch_initial_conditions
    if isinstance(self.numerical_optimizer, GridOptimizer):
        candidates = self.numerical_optimizer.optimize(
            acq_funct, bounds, n_candidates
        )
    else:
        candidates = self.numerical_optimizer.optimize(
            acq_funct, bounds, n_candidates, batch_initial_conditions=initial_points
        )
    return candidates
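
generate drives these two steps internally; they can also be called by hand, for example to inspect the raw candidate tensor before post-processing. A sketch, continuing from the earlier examples:

model = gen.train_model()  # fit a GP to the generator's current data
raw = gen.propose_candidates(model, n_candidates=1)  # torch.Tensor in input space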
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.train_model
train_model(data=None, update_internal=True)

Train a Bayesian model for Bayesian Optimization.

Parameters:

data : pd.DataFrame, optional The data to be used for training the model. If not provided, the internal data of the generator is used.

update_internal : bool, optional Flag to indicate whether to update the internal model of the generator with the trained model (default is True).

Returns:

Module The trained Bayesian model.

Raises:

ValueError If no data is available to build the model.

Notes:

This method trains a Bayesian model using the provided data or the internal data of the generator. It updates the internal model with the trained model if the 'update_internal' flag is set to True.

Source code in xopt/generators/bayesian/bayesian_generator.py
def train_model(self, data: pd.DataFrame = None, update_internal=True) -> Module:
    """
    Train a Bayesian model for Bayesian Optimization.

    Parameters:
    -----------
    data : pd.DataFrame, optional
        The data to be used for training the model. If not provided, the internal
        data of the generator is used.
    update_internal : bool, optional
        Flag to indicate whether to update the internal model of the generator
        with the trained model (default is True).

    Returns:
    --------
    Module
        The trained Bayesian model.

    Raises:
    -------
    ValueError
        If no data is available to build the model.

    Notes:
    ------
    This method trains a Bayesian model using the provided data or the internal
    data of the generator. It updates the internal model with the trained model
    if the 'update_internal' flag is set to True.
    """
    if data is None:
        data = self.get_training_data(self.data)
    if data.empty:
        raise ValueError("no data available to build model")

    # get input bounds
    variable_bounds = deepcopy(self.vocs.variables)

    # if turbo restrict points is true then set the bounds to the trust region
    # bounds
    if self.turbo_controller is not None:
        if self.turbo_controller.restrict_model_data:
            variable_bounds = dict(
                zip(
                    self.vocs.variable_names,
                    self.turbo_controller.get_trust_region(self).numpy().T,
                )
            )

    # add fixed feature bounds if requested
    if self.fixed_features is not None:
        # get bounds for each fixed_feature (vocs bounds take precedent)
        for key in self.fixed_features:
            if key not in variable_bounds:
                if key not in data:
                    raise KeyError(
                        "generator data needs to contain fixed feature "
                        f"column name `{key}`"
                    )
                f_data = data[key]
                bounds = [f_data.min(), f_data.max()]
                if bounds[1] - bounds[0] < 1e-8:
                    bounds[1] = bounds[0] + 1e-8
                variable_bounds[key] = bounds

    _model = self.gp_constructor.build_model(
        self.model_input_names,
        self.vocs.output_names,
        data,
        {name: variable_bounds[name] for name in self.model_input_names},
        **self.tkwargs,
    )

    if update_internal:
        self.model = _model
    return _model
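
A model can also be fit to a filtered subset of the data without overwriting the generator's internal model, which is convenient for diagnostics. A sketch (the 20-sample window is an arbitrary choice):

recent = gen.data.tail(20)  # hypothetical: use only the 20 most recent samples
scratch_model = gen.train_model(data=recent, update_internal=False)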
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.validate_turbo_controller
validate_turbo_controller(value, info)

Note that the default behavior is to not use TuRBO.

Source code in xopt/generators/bayesian/bayesian_generator.py
@field_validator("turbo_controller", mode="before")
def validate_turbo_controller(cls, value, info: ValidationInfo):
    """note default behavior is no use of turbo"""
    if value is None:
        return value

    if cls._compatible_turbo_controllers.default is None:
        raise ValueError("cannot use any turbo controller with this generator")
    else:
        return validate_turbo_controller_base(
            value, cls._compatible_turbo_controllers.default, info
        )
xopt.generators.bayesian.bayesian_generator.BayesianGenerator.visualize_model
visualize_model(**kwargs)

Display GP model predictions for the selected output(s).

The GP models are displayed with respect to the named variables. If None are given, the list of variables in vocs is used. Feasible samples are indicated with a filled orange "o", infeasible samples with a hollow red "o". Feasibility is calculated with respect to all constraints unless the selected output is a constraint itself, in which case only that one is considered.

Parameters:

**kwargs : dict, optional Supported keyword arguments:

- output_names : List[str] Outputs for which the GP models are displayed. Defaults to all outputs in vocs.
- variable_names : List[str] The variables with respect to which the GP models are displayed (maximum of 2). Defaults to vocs.variable_names.
- idx : int Index of the last sample to use. This also selects the point of reference in higher dimensions unless an explicit reference_point is given.
- reference_point : dict Reference point determining the value of variables in vocs.variable_names, but not in variable_names (slice plots in higher dimensions). Defaults to the last used sample.
- show_samples : bool, optional Whether samples are shown.
- show_prior_mean : bool, optional Whether the prior mean is shown.
- show_feasibility : bool, optional Whether the feasibility region is shown.
- show_acquisition : bool, optional Whether the acquisition function is computed and shown (only if the acquisition function is not None).
- n_grid : int, optional Number of grid points per dimension used to display the model predictions.
- axes : Axes, optional Axes object used for plotting.
- exponentiate : bool, optional Flag to exponentiate the acquisition function before plotting.

Returns:

result : tuple The matplotlib figure and axes objects.

Source code in xopt/generators/bayesian/bayesian_generator.py
def visualize_model(self, **kwargs):
    """Display GP model predictions for the selected output(s).

    The GP models are displayed with respect to the named variables. If None are given, the list of variables in
    vocs is used. Feasible samples are indicated with a filled orange "o", infeasible samples with a hollow
    red "o". Feasibility is calculated with respect to all constraints unless the selected output is a
    constraint itself, in which case only that one is considered.

    Parameters
    ----------
    **kwargs: dict, optional
        Supported keyword arguments:
        - output_names : List[str]
            Outputs for which the GP models are displayed. Defaults to all outputs in vocs.
        - variable_names : List[str]
            The variables with respect to which the GP models are displayed (maximum of 2).
            Defaults to vocs.variable_names.
        - idx : int
            Index of the last sample to use. This also selects the point of reference in
            higher dimensions unless an explicit reference_point is given.
        - reference_point : dict
            Reference point determining the value of variables in vocs.variable_names, but not in variable_names
            (slice plots in higher dimensions). Defaults to last used sample.
        - show_samples : bool, optional
            Whether samples are shown.
        - show_prior_mean : bool, optional
            Whether the prior mean is shown.
        - show_feasibility : bool, optional
            Whether the feasibility region is shown.
        - show_acquisition : bool, optional
            Whether the acquisition function is computed and shown (only if acquisition function is not None).
        - n_grid : int, optional
            Number of grid points per dimension used to display the model predictions.
        - axes : Axes, optional
            Axes object used for plotting.
        - exponentiate : bool, optional
            Flag to exponentiate acquisition function before plotting.

    Returns
    -------
    result : tuple
        The matplotlib figure and axes objects.
    """
    return visualize_generator_model(self, **kwargs)
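
For example (a sketch; the keyword names are those listed above, and saving the figure is optional):

fig, ax = X.generator.visualize_model(show_acquisition=True, n_grid=50)
fig.savefig("gp_model.png")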

xopt.generators.bayesian.bayesian_exploration.BayesianExplorationGenerator

BayesianExplorationGenerator(**kwargs)

Bases: BayesianGenerator

Bayesian exploration generator for autonomous characterization.

Source code in xopt/generators/bayesian/bayesian_exploration.py
def __init__(self, **kwargs) -> None:
    super().__init__(**kwargs)
    if self.vocs.n_observables == 0:
        raise ValueError(
            "BayesianExplorationGenerator requires at least one observable in the vocs (instead of specifying an objective)."
        )
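
Since exploration characterizes observables rather than optimizing objectives, the VOCS must declare at least one observable. A sketch with hypothetical variable and observable names:

from xopt import VOCS
from xopt.generators.bayesian.bayesian_exploration import BayesianExplorationGenerator

explore_vocs = VOCS(variables={"x": [0.0, 1.0]}, observables=["y"])
explorer = BayesianExplorationGenerator(vocs=explore_vocs)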

xopt.generators.bayesian.expected_improvement.ExpectedImprovementGenerator

ExpectedImprovementGenerator(**kwargs)

Bases: BayesianGenerator

Bayesian optimization generator using Log Expected Improvement.

Source code in xopt/generator.py
def __init__(self, **kwargs):
    """
    Initialize the generator.

    """
    super().__init__(**kwargs)
    logger.info(f"Initialized generator {self.name}")

Functions

xopt.generators.bayesian.expected_improvement.ExpectedImprovementGenerator.get_acquisition
get_acquisition(model)

Returns a function that can be used to evaluate the acquisition function. Overrides the base get_acquisition method.

Parameters:

model : Model The model used for Bayesian Optimization.

Returns:

acq : AcquisitionFunction The acquisition function.

Source code in xopt/generators/bayesian/expected_improvement.py
def get_acquisition(self, model):
    """
    Returns a function that can be used to evaluate the acquisition function.
    Overwrites base `get_acquisition` method.

    Parameters:
    -----------
    model : Model
        The model used for Bayesian Optimization.

    Returns:
    --------
    acq : AcquisitionFunction
        The acquisition function.
    """
    if model is None:
        raise ValueError("model cannot be None")

    # get base acquisition function
    acq = self._get_acquisition(model)

    # apply fixed features if specified in the generator
    if self.fixed_features is not None:
        # get input dim
        dim = len(self.model_input_names)
        columns = []
        values = []
        for name, value in self.fixed_features.items():
            columns += [self.model_input_names.index(name)]
            values += [value]

        acq = FixedFeatureAcquisitionFunction(
            acq_function=acq, d=dim, columns=columns, values=values
        )

    return acq

xopt.generators.bayesian.expected_improvement.TDExpectedImprovementGenerator

TDExpectedImprovementGenerator(**kwargs)

Bases: TimeDependentBayesianGenerator, ExpectedImprovementGenerator

Source code in xopt/generator.py
def __init__(self, **kwargs):
    """
    Initialize the generator.

    """
    super().__init__(**kwargs)
    logger.info(f"Initialized generator {self.name}")

xopt.generators.bayesian.mobo.MOBOGenerator

MOBOGenerator(**kwargs)

Bases: MultiObjectiveBayesianGenerator

Implements Multi-Objective Bayesian Optimization using the Log Expected Hypervolume Improvement acquisition function.

Attributes:

name : str The name of the generator.

supports_batch_generation : bool Indicates if the generator supports batch candidate generation.

use_pf_as_initial_points : bool Flag to specify if Pareto front points are to be used during optimization of the acquisition function.

Methods:

_get_objective(self) -> Callable Create the multi-objective Bayesian optimization objective.

get_acquisition(self, model: torch.nn.Module) -> FixedFeatureAcquisitionFunction Get the acquisition function for Bayesian Optimization.

_get_acquisition(self, model: torch.nn.Module) -> qLogNoisyExpectedHypervolumeImprovement Create the Log Expected Hypervolume Improvement acquisition function.

_get_initial_conditions(self, n_candidates: int = 1) -> Optional[Tensor] Generate initial candidates for optimizing the acquisition function based on the Pareto front.

Source code in xopt/generator.py
def __init__(self, **kwargs):
    """
    Initialize the generator.

    """
    super().__init__(**kwargs)
    logger.info(f"Initialized generator {self.name}")

Functions

xopt.generators.bayesian.mobo.MOBOGenerator.get_acquisition
get_acquisition(model)

Get the acquisition function for Bayesian Optimization.

Parameters:

model : torch.nn.Module The model used for Bayesian Optimization.

Returns:

FixedFeatureAcquisitionFunction The acquisition function.

Source code in xopt/generators/bayesian/mobo.py
def get_acquisition(
    self, model: torch.nn.Module
) -> FixedFeatureAcquisitionFunction:
    """
    Get the acquisition function for Bayesian Optimization.

    Parameters:
    -----------
    model : torch.nn.Module
        The model used for Bayesian Optimization.

    Returns:
    --------
    FixedFeatureAcquisitionFunction
        The acquisition function.
    """
    if model is None:
        raise ValueError("model cannot be None")

    # get base acquisition function
    acq = self._get_acquisition(model)

    # apply fixed features if specified in the generator
    if self.fixed_features is not None:
        # get input dim
        dim = len(self.model_input_names)
        columns = []
        values = []
        for name, value in self.fixed_features.items():
            columns += [self.model_input_names.index(name)]
            values += [value]

        acq = FixedFeatureAcquisitionFunction(
            acq_function=acq, d=dim, columns=columns, values=values
        )

    return acq
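
Multi-objective optimization additionally requires a reference point that bounds the hypervolume computation (reference_point is a field of the multi-objective generators; see the MultiFidelityGenerator attributes below). A sketch with hypothetical variable and objective names:

from xopt import VOCS
from xopt.generators.bayesian.mobo import MOBOGenerator

mo_vocs = VOCS(
    variables={"x1": [0.0, 1.0], "x2": [0.0, 1.0]},
    objectives={"f1": "MINIMIZE", "f2": "MINIMIZE"},
)
mobo = MOBOGenerator(vocs=mo_vocs, reference_point={"f1": 10.0, "f2": 10.0})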

xopt.generators.bayesian.upper_confidence_bound.UpperConfidenceBoundGenerator

UpperConfidenceBoundGenerator(**kwargs)

Bases: BayesianGenerator

Source code in xopt/generators/bayesian/upper_confidence_bound.py
def __init__(self, **kwargs):
    super().__init__(**kwargs)
    if self.vocs.n_constraints > 0:
        warnings.warn(
            "Using upper confidence bound with constraints may lead to invalid values "
            "if the base acquisition function has negative values. Use with "
            "caution."
        )
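
A sketch; beta is the generator's exploration-exploitation trade-off parameter (larger values weight posterior uncertainty more heavily):

from xopt.generators.bayesian.upper_confidence_bound import UpperConfidenceBoundGenerator

ucb = UpperConfidenceBoundGenerator(vocs=vocs, beta=2.0)  # vocs as in the earlier sketch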

xopt.generators.bayesian.upper_confidence_bound.TDUpperConfidenceBoundGenerator

TDUpperConfidenceBoundGenerator(**kwargs)

Bases: TimeDependentBayesianGenerator, UpperConfidenceBoundGenerator

Source code in xopt/generators/bayesian/upper_confidence_bound.py
def __init__(self, **kwargs):
    super().__init__(**kwargs)
    if self.vocs.n_constraints > 0:
        warnings.warn(
            "Using upper confidence bound with constraints may lead to invalid values "
            "if the base acquisition function has negative values. Use with "
            "caution."
        )

xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator

MultiFidelityGenerator(**kwargs)

Bases: MOBOGenerator

Implements Multi-fidelity Bayesian optimization.

Attributes:

name : str The name of the generator.

fidelity_parameter : Literal["s"] The fidelity parameter name.

cost_function : Callable Callable function that describes the cost of evaluating the objective function.

reference_point : Optional[Dict[str, float]] The reference point for multi-objective optimization.

supports_multi_objective : bool Indicates if the generator supports multi-objective optimization.

supports_batch_generation : bool Indicates if the generator supports batch candidate generation.

Methods:

validate_vocs(cls, v: VOCS) -> VOCS Validate the VOCS for the generator.

calculate_total_cost(self, data: pd.DataFrame = None) -> float Calculate the total cost of data samples using the fidelity parameter.

get_acquisition(self, model: torch.nn.Module) -> NMOMF Get the acquisition function for Bayesian Optimization.

_get_acquisition(self, model: torch.nn.Module) -> NMOMF Create the Multi-Fidelity Knowledge Gradient acquisition function.

add_data(self, new_data: pd.DataFrame) Add new data to the generator.

fidelity_variable_index(self) -> int Get the index of the fidelity variable.

fidelity_objective_index(self) -> int Get the index of the fidelity objective.

get_optimum(self) -> pd.DataFrame Select the best point at the maximum fidelity.

Source code in xopt/generators/bayesian/multi_fidelity.py
def __init__(self, **kwargs):
    reference_point = kwargs.pop("reference_point", None)
    vocs = kwargs.get("vocs")
    # set reference point
    if reference_point is None:
        reference_point = {}
        for name, val in vocs.objectives.items():
            if name != "s":
                if val == "MAXIMIZE":
                    reference_point.update({name: -100.0})
                else:
                    reference_point.update({name: 100.0})

    reference_point.update({"s": 0.0})

    super(MultiFidelityGenerator, self).__init__(
        **kwargs, reference_point=reference_point
    )

Attributes

xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.fidelity_objective_index property
fidelity_objective_index

Get the index of the fidelity objective.

Returns:

int The index of the fidelity objective.

xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.fidelity_variable_index property
fidelity_variable_index

Get the index of the fidelity variable.

Returns:

int The index of the fidelity variable.

Functions

xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.add_data
add_data(new_data)

Add new data to the generator.

Parameters:

new_data : pd.DataFrame The new data to be added.

Raises:

ValueError If the fidelity parameter is not in the new data or if the fidelity values are outside the range [0,1].

Source code in xopt/generators/bayesian/multi_fidelity.py
def add_data(self, new_data: pd.DataFrame):
    """
    Add new data to the generator.

    Parameters:
    -----------
    new_data : pd.DataFrame
        The new data to be added.

    Raises:
    -------
    ValueError
        If the fidelity parameter is not in the new data or if the fidelity
        values are outside the range [0,1].
    """
    if self.fidelity_parameter not in new_data:
        raise ValueError(
            f"fidelity parameter {self.fidelity_parameter} must be in added data"
        )

    # overwrite add data to check for valid fidelity values
    if (new_data[self.fidelity_parameter] > 1.0).any() or (
        new_data[self.fidelity_parameter] < 0.0
    ).any():
        raise ValueError("cannot add fidelity data that is outside the range [0,1]")
    super().add_data(new_data)
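
Because validate_vocs (below) appends the fidelity variable "s" to the VOCS automatically, added data must carry an "s" column with values in [0, 1]. A sketch with a hypothetical single-objective problem:

import pandas as pd

from xopt import VOCS
from xopt.generators.bayesian.multi_fidelity import MultiFidelityGenerator

mf_vocs = VOCS(variables={"x": [0.0, 1.0]}, objectives={"f": "MINIMIZE"})
mf_gen = MultiFidelityGenerator(vocs=mf_vocs)  # "s" is added during validation
mf_gen.add_data(pd.DataFrame({"x": [0.2, 0.8], "s": [0.1, 0.5], "f": [1.2, 0.7]}))
print(mf_gen.calculate_total_cost())  # total cost under the generator's cost_function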
xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.calculate_total_cost
calculate_total_cost(data=None)

Calculate the total cost of data samples using the fidelity parameter.

Parameters:

data : pd.DataFrame, optional The data samples, by default None.

Returns:

float The total cost of the data samples.

Source code in xopt/generators/bayesian/multi_fidelity.py
def calculate_total_cost(self, data: pd.DataFrame = None) -> float:
    """
    Calculate the total cost of data samples using the fidelity parameter.

    Parameters:
    -----------
    data : pd.DataFrame, optional
        The data samples, by default None.

    Returns:
    --------
    float
        The total cost of the data samples.
    """
    if data is None:
        data = self.data

    f_data = self.get_input_data(data)

    # apply callable function to get costs
    return self.cost_function(f_data[..., self.fidelity_variable_index]).sum()
xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.get_acquisition
get_acquisition(model)

Get the acquisition function for Bayesian Optimization.

Parameters:

model : torch.nn.Module The model used for Bayesian Optimization.

Returns:

NMOMF The acquisition function.

Source code in xopt/generators/bayesian/multi_fidelity.py
def get_acquisition(self, model: torch.nn.Module) -> NMOMF:
    """
    Get the acquisition function for Bayesian Optimization.

    Parameters:
    -----------
    model : torch.nn.Module
        The model used for Bayesian Optimization.

    Returns:
    --------
    NMOMF
        The acquisition function.
    """
    if model is None:
        raise ValueError("model cannot be None")

    # get base acquisition function
    acq = self._get_acquisition(model)
    return acq
xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.get_optimum
get_optimum()

Select the best point at the maximum fidelity.

Returns:

pd.DataFrame The best point at the maximum fidelity.

Source code in xopt/generators/bayesian/multi_fidelity.py
def get_optimum(self) -> pd.DataFrame:
    """
    Select the best point at the maximum fidelity.

    Returns:
    --------
    pd.DataFrame
        The best point at the maximum fidelity.
    """
    # define single objective based on vocs
    weights = torch.zeros(self.vocs.n_outputs, **self.tkwargs)
    for idx, ele in enumerate(self.vocs.objective_names):
        if self.vocs.objectives[ele] == "MINIMIZE":
            weights[idx] = -1.0
        elif self.vocs.objectives[ele] == "MAXIMIZE":
            weights[idx] = 1.0

    def obj_callable(
        Z: torch.Tensor, X: Optional[torch.Tensor] = None
    ) -> torch.Tensor:
        return torch.matmul(Z, weights.reshape(-1, 1)).squeeze(-1)

    c_posterior_mean = ConstrainedMCAcquisitionFunction(
        self.model,
        qUpperConfidenceBound(
            model=self.model, beta=0.0, objective=GenericMCObjective(obj_callable)
        ),
        self._get_constraint_callables(),
    )

    max_fidelity_c_posterior_mean = FixedFeatureAcquisitionFunction(
        c_posterior_mean,
        self.vocs.n_variables,
        [self.fidelity_variable_index],
        [1.0],
    )

    boundst = self._get_bounds().T
    fixed_bounds = torch.cat(
        (
            boundst[: self.fidelity_variable_index],
            boundst[self.fidelity_variable_index + 1 :],
        )
    ).T

    result = self.numerical_optimizer.optimize(
        max_fidelity_c_posterior_mean, fixed_bounds, 1
    )

    vnames = deepcopy(self.vocs.variable_names)
    del vnames[self.fidelity_variable_index]
    df = pd.DataFrame(result.detach().cpu().numpy(), columns=vnames)
    df[self.fidelity_parameter] = 1.0

    return self.vocs.convert_dataframe_to_inputs(df)
xopt.generators.bayesian.multi_fidelity.MultiFidelityGenerator.validate_vocs
validate_vocs(v)

Validate the VOCS for the generator.

Parameters:

v : VOCS The VOCS to be validated.

Returns:

VOCS The validated VOCS.

Raises:

ValueError If constraints are present in the VOCS.

Source code in xopt/generators/bayesian/multi_fidelity.py
@field_validator("vocs", mode="before")
def validate_vocs(cls, v: VOCS) -> VOCS:
    """
    Validate the VOCS for the generator.

    Parameters:
    -----------
    v : VOCS
        The VOCS to be validated.

    Returns:
    --------
    VOCS
        The validated VOCS.

    Raises:
    -------
    ValueError
        If constraints are present in the VOCS.
    """
    v.variables["s"] = [0, 1]
    v.objectives["s"] = ObjectiveEnum("MAXIMIZE")
    if len(v.constraints):
        raise ValueError(
            "constraints are not currently supported in multi-fidelity BO"
        )

    return v

xopt.generators.bayesian.turbo.TurboController

TurboController(vocs, **kwargs)

Bases: XoptBaseModel, ABC

Base class for TuRBO (Trust Region Bayesian Optimization) controllers.

Attributes:

vocs : VOCS The VOCS (Variables, Objectives, Constraints, Statics) object.

dim : PositiveInt The dimensionality of the optimization problem.

batch_size : PositiveInt Number of trust regions to use.

length : float Base length of the trust region.

length_min : PositiveFloat Minimum base length of the trust region.

length_max : PositiveFloat Maximum base length of the trust region.

failure_counter : int Number of failures since reset.

failure_tolerance : PositiveInt Number of failures needed to trigger a trust region contraction.

success_counter : int Number of successes since reset.

success_tolerance : PositiveInt Number of successes needed to trigger a trust region expansion.

center_x : Optional[Dict[str, float]] Center point of the trust region.

scale_factor : float Multiplier to increase or decrease the trust region.

restrict_model_data : Optional[bool] Flag to restrict model data to within the trust region.

model_config : ConfigDict Configuration dictionary for the model.

Methods:

get_trust_region(self, generator) -> Tensor Return the trust region based on the generator.

update_trust_region(self) Update the trust region based on success and failure counters.

get_data_in_trust_region(self, data: pd.DataFrame, generator) Get subset of data in the trust region.

update_state(self, generator, previous_batch_size: int = 1) -> None Abstract method to update the state of the controller.

reset(self) Reset the controller to the initial state.

Source code in xopt/generators/bayesian/turbo.py
def __init__(self, vocs: VOCS, **kwargs):
    dim = vocs.n_variables

    super(TurboController, self).__init__(vocs=vocs, dim=dim, **kwargs)

    # initialize tolerances if not specified
    if self.failure_tolerance is None:
        self.failure_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    if self.success_tolerance is None:
        self.success_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    # get the initial state for the turbo controller for resetting
    self._initial_state = self.model_dump()
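
A controller is built from the same VOCS as the generator and attached through the generator's turbo_controller field. A sketch using the OptimizeTurboController described below:

from xopt import VOCS
from xopt.generators.bayesian.turbo import OptimizeTurboController
from xopt.generators.bayesian.upper_confidence_bound import UpperConfidenceBoundGenerator

tr_vocs = VOCS(variables={"x": [0.0, 1.0]}, objectives={"f": "MINIMIZE"})
turbo_gen = UpperConfidenceBoundGenerator(
    vocs=tr_vocs, turbo_controller=OptimizeTurboController(tr_vocs)
)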

Functions

xopt.generators.bayesian.turbo.TurboController.get_data_in_trust_region
get_data_in_trust_region(data, generator)

Get subset of data in the trust region.

Parameters:

data : pd.DataFrame The data to filter.

generator : BayesianGenerator The generator used to determine the trust region.

Returns:

pd.DataFrame The subset of data within the trust region.

Source code in xopt/generators/bayesian/turbo.py
def get_data_in_trust_region(self, data: pd.DataFrame, generator):
    """
    Get subset of data in the trust region.

    Parameters
    ----------
    data : pd.DataFrame
        The data to filter.
    generator : BayesianGenerator
        The generator used to determine the trust region.

    Returns
    -------
    pd.DataFrame
        The subset of data within the trust region.
    """
    variable_data = torch.tensor(self.vocs.variable_data(data).to_numpy())

    bounds = self.get_trust_region(generator)

    mask = torch.ones(len(variable_data), dtype=torch.bool)
    for dim in range(variable_data.shape[1]):
        mask &= (variable_data[:, dim] >= bounds[0][dim]) & (
            variable_data[:, dim] <= bounds[1][dim]
        )

    return data.iloc[mask.numpy()]
xopt.generators.bayesian.turbo.TurboController.get_trust_region
get_trust_region(generator)

Return the trust region based on the generator. The trust region is a rectangular region around a center point. The sides of the trust region are given by the length parameter and are scaled according to the generator model lengthscales (if available).

Parameters:

generator : BayesianGenerator Generator object used to supply the model and datatypes for the returned trust region.

Returns:

Tensor The trust region bounds.

Source code in xopt/generators/bayesian/turbo.py
def get_trust_region(self, generator) -> Tensor:
    """
    Return the trust region based on the generator. The trust region is a
    rectangular region around a center point. The sides of the trust region are
    given by the `length` parameter and are scaled according to the generator
    model lengthscales (if available).

    Parameters
    ----------
    generator : BayesianGenerator
        Generator object used to supply the model and datatypes for the returned
        trust region.

    Returns
    -------
    Tensor
        The trust region bounds.
    """
    model = generator.model
    bounds = torch.tensor(self.vocs.bounds, **generator.tkwargs)

    if self.center_x is not None:
        # get bounds width
        bound_widths = bounds[1] - bounds[0]

        # Scale the TR to be proportional to the lengthscales of the objective model
        x_center = torch.tensor(
            [self.center_x[ele] for ele in self.vocs.variable_names],
            **generator.tkwargs,
        ).unsqueeze(dim=0)

        # default weights are 1 (if there is no model or a model without
        # lengthscales)
        weights = 1.0

        if model is not None:
            if model.models[0].covar_module.lengthscale is not None:
                lengthscales = model.models[0].covar_module.lengthscale.detach()

                # calculate the ratios of lengthscales for each axis
                weights = lengthscales / torch.prod(lengthscales) ** (1 / self.dim)

        # calculate the tr bounding box
        tr_lb = torch.clamp(
            x_center - weights * self.length * bound_widths / 2.0, *bounds
        )
        tr_ub = torch.clamp(
            x_center + weights * self.length * bound_widths / 2.0, *bounds
        )
        return torch.cat((tr_lb, tr_ub), dim=0)
    else:
        return bounds
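
Once data has been added and a model trained, the current trust region can be inspected directly. A sketch, continuing the controller example above:

bounds = turbo_gen.turbo_controller.get_trust_region(turbo_gen)
# a 2 x dim tensor: row 0 holds the lower bounds, row 1 the upper bounds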
xopt.generators.bayesian.turbo.TurboController.reset
reset()

Reset the controller to the initial state.

Source code in xopt/generators/bayesian/turbo.py
def reset(self):
    """
    Reset the controller to the initial state.
    """
    for name, val in self._initial_state.items():
        if not name == "name":
            self.__setattr__(name, val)
xopt.generators.bayesian.turbo.TurboController.update_state abstractmethod
update_state(generator, previous_batch_size=1)

Abstract method to update the state of the controller.

Parameters:

generator : BayesianGenerator The generator used to update the state.

previous_batch_size : int, optional The number of candidates in the previous batch evaluation, by default 1.
Source code in xopt/generators/bayesian/turbo.py
@abstractmethod
def update_state(self, generator, previous_batch_size: int = 1) -> None:
    """
    Abstract method to update the state of the controller.

    Parameters
    ----------
    generator : BayesianGenerator
        The generator used to update the state.
    previous_batch_size : int, optional
        The number of candidates in the previous batch evaluation, by default 1.
    """
    pass
xopt.generators.bayesian.turbo.TurboController.update_trust_region
update_trust_region()

Update the trust region based on success and failure counters.

Source code in xopt/generators/bayesian/turbo.py
def update_trust_region(self):
    """
    Update the trust region based on success and failure counters.
    """
    if self.success_counter == self.success_tolerance:  # Expand trust region
        self.length = min(self.scale_factor * self.length, self.length_max)
        self.success_counter = 0
    elif self.failure_counter == self.failure_tolerance:  # Shrink trust region
        self.length = max(self.length / self.scale_factor, self.length_min)
        self.failure_counter = 0

xopt.generators.bayesian.turbo.OptimizeTurboController

OptimizeTurboController(vocs, **kwargs)

Bases: TurboController

Turbo controller for optimization tasks.

Attributes:

name : str The name of the controller.

best_value : Optional[float] The best value found so far.

Methods:

vocs_validation(cls, info) Validate the VOCS for the controller.

minimize(self) -> bool Check if the objective is to minimize.

_set_best_point_value(self, data) Set the best point value based on the data.

update_state(self, generator, previous_batch_size: int = 1) -> None Update the state of the controller.

Source code in xopt/generators/bayesian/turbo.py
def __init__(self, vocs: VOCS, **kwargs):
    dim = vocs.n_variables

    super(TurboController, self).__init__(vocs=vocs, dim=dim, **kwargs)

    # initialize tolerances if not specified
    if self.failure_tolerance is None:
        self.failure_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    if self.success_tolerance is None:
        self.success_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    # get the initial state for the turbo controller for resetting
    self._initial_state = self.model_dump()

Functions

xopt.generators.bayesian.turbo.OptimizeTurboController.update_state
update_state(generator, previous_batch_size=1)

Update the TuRBO state using the minimum of the feasible data points. If no points in the data set are feasible, raise an error.

NOTE: this is the opposite of BoTorch, which assumes maximization; Xopt assumes minimization.

Parameters:

generator : BayesianGenerator The generator whose data set contains previous measurements. Requires at least one valid point.

previous_batch_size : int, optional Number of candidates in the previous batch evaluation, by default 1.

Returns:

None
Source code in xopt/generators/bayesian/turbo.py
def update_state(self, generator, previous_batch_size: int = 1) -> None:
    """
    Update turbo state class using min of data points that are feasible.
    If no points in the data set are feasible raise an error.

    NOTE: this is the opposite of botorch which assumes maximization, xopt assumes
    minimization

    Parameters
    ----------
    generator : BayesianGenerator
        Entire data set containing previous measurements. Requires at least one
        valid point.

    previous_batch_size : int, default = 1
        Number of candidates in previous batch evaluation

    Returns
    -------
    None
    """
    data = generator.data

    # get locations of valid data samples
    feas_data = self.vocs.feasibility_data(data)

    if len(data[feas_data["feasible"]]) == 0:
        raise RuntimeError(
            "turbo requires at least one valid point in the training dataset"
        )
    else:
        self._set_best_point_value(data[feas_data["feasible"]])

    # get feasibility of last `n_candidates`
    recent_data = data.iloc[-previous_batch_size:]
    f_data = self.vocs.feasibility_data(recent_data)
    recent_f_data = recent_data[f_data["feasible"]]
    recent_f_data_minform = self.vocs.objective_data(recent_f_data, "")

    # if none of the candidates are valid count this as a failure
    if len(recent_f_data) == 0:
        self.success_counter = 0
        self.failure_counter += 1

    else:
        # if we had previous feasible points we need to compare with previous
        # best values, NOTE: this is the opposite of botorch which assumes
        # maximization, xopt assumes minimization
        Y_last = recent_f_data_minform[self.vocs.objective_names[0]].min()
        best_value = self.best_value if self.minimize else -self.best_value

        # note: add in small tolerance to account for numerical issues
        if Y_last <= best_value + 1e-40:
            self.success_counter += 1
            self.failure_counter = 0
        else:
            self.success_counter = 0
            self.failure_counter += 1

    self.update_trust_region()

xopt.generators.bayesian.turbo.SafetyTurboController

SafetyTurboController(vocs, **kwargs)

Bases: TurboController

Turbo controller for safety-constrained optimization tasks.

Attributes:

name : str The name of the controller.

scale_factor : PositiveFloat Multiplier to increase or decrease the trust region.

min_feasible_fraction : PositiveFloat Minimum feasible fraction to trigger trust region expansion.

Methods:

vocs_validation(cls, info) Validate the VOCS for the controller.

update_state(self, generator, previous_batch_size: int = 1) Update the state of the controller.

Notes:

The trust region of the safety turbo controller is expanded or contracted based on the feasibility of the observed points. In cases where multiple samples are taken at once, the feasibility fraction is calculated based on the last previous_batch_size samples. If the feasibility fraction is above min_feasible_fraction, the observation is considered a success, otherwise it is a failure.

Source code in xopt/generators/bayesian/turbo.py
def __init__(self, vocs: VOCS, **kwargs):
    dim = vocs.n_variables

    super(TurboController, self).__init__(vocs=vocs, dim=dim, **kwargs)

    # initialize tolerances if not specified
    if self.failure_tolerance is None:
        self.failure_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    if self.success_tolerance is None:
        self.success_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    # get the initial state for the turbo controller for resetting
    self._initial_state = self.model_dump()

Functions

xopt.generators.bayesian.turbo.SafetyTurboController.update_state
update_state(generator, previous_batch_size=1)

Update the state of the controller.

Parameters:

generator : BayesianGenerator The generator used to update the state.

previous_batch_size : int, optional The number of candidates in the previous batch evaluation, by default 1.
Source code in xopt/generators/bayesian/turbo.py
def update_state(self, generator, previous_batch_size: int = 1):
    """
    Update the state of the controller.

    Parameters
    ----------
    generator : BayesianGenerator
        The generator used to update the state.
    previous_batch_size : int, optional
        The number of candidates in the previous batch evaluation, by default 1.
    """
    data = generator.data

    # set center point to be mean of valid data points
    feas = data[self.vocs.feasibility_data(data)["feasible"]]
    self.center_x = feas[self.vocs.variable_names].mean().to_dict()

    # get the feasibility fractions of the last batch
    last_batch = self.vocs.feasibility_data(data).iloc[-previous_batch_size:]
    feas_fraction = last_batch["feasible"].sum() / len(last_batch)

    if feas_fraction > self.min_feasible_fraction:
        self.success_counter += 1
        self.failure_counter = 0
    else:
        self.success_counter = 0
        self.failure_counter += 1

    self.update_trust_region()

xopt.generators.bayesian.turbo.EntropyTurboController

EntropyTurboController(vocs, **kwargs)

Bases: TurboController

Turbo controller for entropy-based optimization tasks.

Attributes:

name : str The name of the controller.

_best_entropy : float The best entropy value found so far.

Methods:

update_state(self, generator, previous_batch_size: int = 1) -> None Update the state of the controller.

Source code in xopt/generators/bayesian/turbo.py
def __init__(self, vocs: VOCS, **kwargs):
    dim = vocs.n_variables

    super(TurboController, self).__init__(vocs=vocs, dim=dim, **kwargs)

    # initialize tolerances if not specified
    if self.failure_tolerance is None:
        self.failure_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    if self.success_tolerance is None:
        self.success_tolerance = int(
            math.ceil(
                max(
                    [2.0 / self.batch_size, float(self.dim) / 2.0 * self.batch_size]
                )
            )
        )

    # get the initial state for the turbo controller for resetting
    self._initial_state = self.model_dump()

Functions

xopt.generators.bayesian.turbo.EntropyTurboController.update_state
update_state(generator, previous_batch_size=1)

Update the state of the controller.

Parameters:

generator : BayesianGenerator The generator used to update the state.

previous_batch_size : int, optional The number of candidates in the previous batch evaluation, by default 1.
Source code in xopt/generators/bayesian/turbo.py
def update_state(self, generator, previous_batch_size: int = 1) -> None:
    """
    Update the state of the controller.

    Parameters
    ----------
    generator : BayesianGenerator
        The generator used to update the state.
    previous_batch_size : int, optional
        The number of candidates in the previous batch evaluation, by default 1.
    """
    if generator.algorithm_results is not None:
        # check to make sure required keys are in algorithm results
        for ele in ["solution_center", "solution_entropy"]:
            if ele not in generator.algorithm_results:
                raise RuntimeError(
                    f"algorithm must include `{ele}` in "
                    f"`algorithm_results` property to use "
                    f"EntropyTurboController"
                )

        self.center_x = dict(
            zip(
                self.vocs.variable_names,
                generator.algorithm_results["solution_center"],
            )
        )
        entropy = generator.algorithm_results["solution_entropy"]

        if self._best_entropy is not None:
            if entropy < self._best_entropy:
                self.success_counter += 1
                self.failure_counter = 0
                self._best_entropy = entropy
            else:
                self.success_counter = 0
                self.failure_counter += 1

            self.update_trust_region()
        else:
            self._best_entropy = entropy

xopt.generators.bayesian.bax_generator.BaxGenerator

BaxGenerator(**kwargs)

Bases: BayesianGenerator

BAX Generator for Bayesian optimization.

Attributes:

name : str The name of the generator.

algorithm : Algorithm Algorithm evaluated in the BAX process.

algorithm_results : Dict Dictionary of results from the algorithm.

algorithm_results_file : str File name used to save algorithm results at every step.

_n_calls : int Internal counter for the number of calls to the generate method.

Methods:

validate_turbo_controller(cls, value, info: ValidationInfo) -> Any Validate the turbo controller.

validate_vocs(cls, v, info: ValidationInfo) -> VOCS Validate the VOCS object.

generate(self, n_candidates: int) -> List[Dict] Generate a specified number of candidate samples.

_get_acquisition(self, model) -> ModelListExpectedInformationGain Get the acquisition function.

Source code in xopt/generator.py
102
103
104
105
106
107
108
def __init__(self, **kwargs):
    """
    Initialize the generator.

    """
    super().__init__(**kwargs)
    logger.info(f"Initialized generator {self.name}")

Functions

xopt.generators.bayesian.bax_generator.BaxGenerator.generate
generate(n_candidates)

Generate a specified number of candidate samples.

Parameters:

n_candidates : int The number of candidate samples to generate.

Returns:

List[Dict] A list of dictionaries containing the generated samples.

Source code in xopt/generators/bayesian/bax_generator.py
def generate(self, n_candidates: int) -> List[Dict]:
    """
    Generate a specified number of candidate samples.

    Parameters:
    -----------
    n_candidates : int
        The number of candidate samples to generate.

    Returns:
    --------
    List[Dict]
        A list of dictionaries containing the generated samples.
    """
    self._n_calls += 1
    return super().generate(n_candidates)