The `OptimizerDesignPoints` class implements optimization over a set of fixed design points fully specified by the user. The points in the design are evaluated in the order given.

To support general termination criteria and parallelization, points are evaluated in batches of size `batch_size`. Larger batches allow more parallelization; smaller batches allow more fine-grained checking of the termination criteria.
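As a minimal sketch (assuming the bbotk package is attached, and using an illustrative batch size of 2), the batch size can be set when constructing the optimizer:

```r
library(bbotk)
library(data.table)

design = data.table(x = c(-1, 0, 1))

# Evaluate the design in batches of 2 points; the terminator is
# checked after each batch rather than after every single point.
optimizer = opt("design_points", design = design, batch_size = 2)
```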

## Dictionary

This Optimizer can be instantiated via the dictionary `mlr_optimizers` or with the associated sugar function `opt()`:
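For instance (a minimal sketch, assuming the bbotk package is attached), both forms construct the same optimizer:

```r
library(bbotk)

# Retrieve from the dictionary ...
optimizer = mlr_optimizers$get("design_points")

# ... or use the sugar function
optimizer = opt("design_points")
```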

### Method clone()

The objects of this class are cloneable with this method.

```r
OptimizerDesignPoints$clone(deep = FALSE)
```

#### Arguments

- `deep`: Whether to make a deep clone.

## Examples

```r
library(bbotk)
library(paradox)
library(data.table)

domain = ParamSet$new(list(ParamDbl$new("x", lower = -1, upper = 1)))
search_space = ParamSet$new(list(ParamDbl$new("x", lower = -1, upper = 1)))
codomain = ParamSet$new(list(ParamDbl$new("y", tags = "minimize")))
objective_function = function(xs) {
  list(y = as.numeric(xs)^2)
}
objective = ObjectiveRFun$new(fun = objective_function,
  domain = domain,
  codomain = codomain)
terminator = trm("evals", n_evals = 10)
instance = OptimInstanceSingleCrit$new(objective = objective,
  search_space = search_space,
  terminator = terminator)
design = data.table(x = c(0, 1))
optimizer = opt("design_points", design = design)

# Modifies the instance by reference
optimizer$optimize(instance)
#>    x  x_domain y
#> 1: 0 <list[1]> 0

# Returns best scoring evaluation
instance$result
#>    x  x_domain y
#> 1: 0 <list[1]> 0

# Allows access to a data.table of the full path of all evaluations
instance$archive$data()
#>    x y  x_domain           timestamp batch_nr
#> 1: 0 0 <list[1]> 2020-10-25 04:09:51        1
#> 2: 1 1 <list[1]> 2020-10-25 04:09:51        2
```