Optimization via Random Search
Source: R/OptimizerBatchRandomSearch.R (mlr_optimizers_random_search.Rd)
The OptimizerBatchRandomSearch class implements a simple Random Search.
In order to support general termination criteria and parallelization, points are evaluated in batches of size batch_size. Larger batches allow more parallelization, while smaller batches enable a more fine-grained checking of the termination criteria.
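A minimal sketch (assuming bbotk is attached) of setting the batch size, which is exposed as the batch_size parameter of the optimizer:

optimizer = opt("random_search", batch_size = 10)
# equivalently, via the parameter set
optimizer$param_set$values$batch_size = 10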
Source
Bergstra J, Bengio Y (2012). “Random Search for Hyper-Parameter Optimization.” Journal of Machine Learning Research, 13(10), 281–305. https://jmlr.csail.mit.edu/papers/v13/bergstra12a.html.
Dictionary
This Optimizer can be instantiated via the dictionary mlr_optimizers or with the associated sugar function opt():
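For example, the dictionary lookup and the sugar call construct the same optimizer:

library(bbotk)
mlr_optimizers$get("random_search")
opt("random_search")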
Progress Bars
$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the package progress as backend; enable it with progressr::handlers("progress").
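A minimal sketch of enabling them (assuming the progressr and progress packages are installed, and that optimizer and instance are set up as in the Examples below):

library(progressr)
handlers("progress")
with_progress(optimizer$optimize(instance))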
Super classes
bbotk::Optimizer
-> bbotk::OptimizerBatch
-> OptimizerBatchRandomSearch
Examples
library(bbotk)

search_space = domain = ps(x = p_dbl(lower = -1, upper = 1))
codomain = ps(y = p_dbl(tags = "minimize"))
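# objective function: y = x^2, to be minimized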
objective_function = function(xs) {
list(y = as.numeric(xs)^2)
}
objective = ObjectiveRFun$new(
fun = objective_function,
domain = domain,
codomain = codomain)
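# optimization instance with a budget of 10 evaluations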
instance = OptimInstanceBatchSingleCrit$new(
objective = objective,
search_space = search_space,
terminator = trm("evals", n_evals = 10))
optimizer = opt("random_search")
# modifies the instance by reference
optimizer$optimize(instance)
#> x x_domain y
#> <num> <list> <num>
#> 1: -0.0003790422 <list[1]> 1.43673e-07
# returns best scoring evaluation
instance$result
#> x x_domain y
#> <num> <list> <num>
#> 1: -0.0003790422 <list[1]> 1.43673e-07
# access the data.table with the full history of all evaluations
as.data.table(instance$archive$data)
#> x y x_domain timestamp batch_nr
#> <num> <num> <list> <POSc> <int>
#> 1: -0.0003790422 1.436730e-07 <list[1]> 2024-11-08 08:22:48 1
#> 2: 0.0521540358 2.720043e-03 <list[1]> 2024-11-08 08:22:48 2
#> 3: 0.1753952475 3.076349e-02 <list[1]> 2024-11-08 08:22:48 3
#> 4: -0.7242249334 5.245018e-01 <list[1]> 2024-11-08 08:22:48 4
#> 5: -0.3280646680 1.076264e-01 <list[1]> 2024-11-08 08:22:48 5
#> 6: 0.7644371558 5.843642e-01 <list[1]> 2024-11-08 08:22:49 6
#> 7: 0.1161836376 1.349864e-02 <list[1]> 2024-11-08 08:22:49 7
#> 8: 0.2979299254 8.876224e-02 <list[1]> 2024-11-08 08:22:49 8
#> 9: -0.1741356030 3.032321e-02 <list[1]> 2024-11-08 08:22:49 9
#> 10: -0.2879966758 8.294209e-02 <list[1]> 2024-11-08 08:22:49 10