OptimizerRandomSearch class that implements a simple Random Search.

In order to support general termination criteria and parallelization, points are evaluated in batches of size batch_size. Larger batches allow for more parallelization, while smaller batches allow a more fine-grained checking of termination criteria.
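As a rough, self-contained sketch (not the bbotk implementation itself), batch-wise random search with an evaluation budget can be written as:

# Toy sketch of batch-wise random search: points are drawn and evaluated
# batch_size at a time, and the budget (termination criterion) is only
# checked between batches.
random_search_sketch = function(fun, lower, upper, n_evals = 10, batch_size = 5) {
  archive = data.frame(x = numeric(0), y = numeric(0))
  while (nrow(archive) < n_evals) {            # simple "evals" terminator
    x = runif(batch_size, lower, upper)        # draw a whole batch of random points
    y = vapply(x, fun, numeric(1))             # batch could be evaluated in parallel
    archive = rbind(archive, data.frame(x = x, y = y))
  }
  archive[which.min(archive$y), ]              # return the best point found
}

random_search_sketch(function(x) x^2, lower = -1, upper = 1)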

Source

Bergstra J, Bengio Y (2012). “Random Search for Hyper-Parameter Optimization.” Journal of Machine Learning Research, 13(10), 281--305. https://jmlr.csail.mit.edu/papers/v13/bergstra12a.html.

Dictionary

This Optimizer can be instantiated via the dictionary mlr_optimizers or with the associated sugar function opt():

mlr_optimizers$get("random_search")
opt("random_search")

Parameters

batch_size

integer(1)
Maximum number of points to try in a batch.
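For example, the batch size can be set through the optimizer's parameter set (a minimal sketch using the standard paradox ParamSet interface):

optimizer = opt("random_search")
optimizer$param_set$values$batch_size = 20  # evaluate 20 random points per batch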

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
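For example, assuming an optimizer and a fully configured instance as in the Examples section below:

library(progressr)
handlers("progress")                         # use the progress package as backend
with_progress(optimizer$optimize(instance))  # progress bar driven by the Terminator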

Super class

bbotk::Optimizer -> OptimizerRandomSearch
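The inheritance chain can be checked on an instantiated optimizer, e.g.:

optimizer = opt("random_search")
class(optimizer)  # includes "OptimizerRandomSearch" and "Optimizer"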

Methods

Public methods

Inherited methods

Method new()

Creates a new instance of this R6 class.

Usage

OptimizerRandomSearch$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

OptimizerRandomSearch$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

library(bbotk)
library(paradox)

domain = ParamSet$new(list(ParamDbl$new("x", lower = -1, upper = 1)))
search_space = ParamSet$new(list(ParamDbl$new("x", lower = -1, upper = 1)))
codomain = ParamSet$new(list(ParamDbl$new("y", tags = "minimize")))

objective_function = function(xs) {
  list(y = as.numeric(xs)^2)
}

objective = ObjectiveRFun$new(
  fun = objective_function,
  domain = domain,
  codomain = codomain)

terminator = trm("evals", n_evals = 10)

instance = OptimInstanceSingleCrit$new(
  objective = objective,
  search_space = search_space,
  terminator = terminator)

optimizer = opt("random_search")

# Modifies the instance by reference
optimizer$optimize(instance)
#>             x  x_domain          y
#> 1: 0.08496082 <list[1]> 0.00721834
# Returns best scoring evaluation
instance$result
#>             x  x_domain          y
#> 1: 0.08496082 <list[1]> 0.00721834
# Allows access to a data.table of the full path of all evaluations
as.data.table(instance$archive$data)
#>               x           y  x_domain           timestamp batch_nr
#>  1: -0.45236351 0.204632745 <list[1]> 2021-01-24 15:47:17        1
#>  2:  0.14008990 0.019625181 <list[1]> 2021-01-24 15:47:17        2
#>  3: -0.32856184 0.107952882 <list[1]> 2021-01-24 15:47:17        3
#>  4:  0.19252558 0.037066098 <list[1]> 2021-01-24 15:47:17        4
#>  5: -0.61696394 0.380644499 <list[1]> 2021-01-24 15:47:17        5
#>  6:  0.89552788 0.801970176 <list[1]> 2021-01-24 15:47:17        6
#>  7:  0.08496082 0.007218340 <list[1]> 2021-01-24 15:47:17        7
#>  8:  0.08920679 0.007957851 <list[1]> 2021-01-24 15:47:17        8
#>  9: -0.44280569 0.196076881 <list[1]> 2021-01-24 15:47:17        9
#> 10: -0.10659506 0.011362507 <list[1]> 2021-01-24 15:47:17       10