| mlr_optimizers_random_search {bbotk} | R Documentation |
Description

OptimizerRandomSearch class that implements a simple Random Search.

In order to support general termination criteria and parallelization,
points are evaluated in batches of size batch_size. Larger batches allow
more parallelization; smaller batches imply more fine-grained checking
of the termination criteria.
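The batch-wise evaluation loop can be sketched as follows. This is an illustrative simplification in Python, not bbotk's implementation; the function name random_search and its parameters are hypothetical, and only a fixed evaluation budget stands in for bbotk's general Terminator objects.

```python
import random

def random_search(objective, lower, upper, n_evals, batch_size):
    """Minimal batched random search (illustrative sketch only).

    Proposes batch_size uniform points per iteration, evaluates them all,
    and checks the termination criterion (here: a budget of n_evals) only
    between batches -- mirroring the trade-off described above: larger
    batches allow more parallel work, smaller batches check termination
    more often.
    """
    archive = []  # (x, y) pairs for every evaluation performed
    while len(archive) < n_evals:
        # Propose one batch of uniformly random candidate points.
        batch = [random.uniform(lower, upper) for _ in range(batch_size)]
        # Evaluate the whole batch (this is where parallelization would go).
        archive.extend((x, objective(x)) for x in batch)
    # Return the best (x, y) pair found, minimizing the objective.
    return min(archive, key=lambda p: p[1])

best_x, best_y = random_search(lambda x: x ** 2, -1.0, 1.0,
                               n_evals=10, batch_size=5)
```

Note that termination is only checked between batches, so the optimizer may slightly overshoot the budget when batch_size does not divide it evenly, just as bbotk's batch-wise evaluation does.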
Dictionary

This Optimizer can be instantiated via the dictionary mlr_optimizers or
with the associated sugar function opt():

mlr_optimizers$get("random_search")
opt("random_search")
Parameters

batch_size
    integer(1)
    Maximum number of points to try in a batch.
Super class

bbotk::Optimizer -> OptimizerRandomSearch
Methods

Method new(): Creates a new instance of this R6 class.

Usage:
OptimizerRandomSearch$new()

Method clone(): The objects of this class are cloneable with this method.

Usage:
OptimizerRandomSearch$clone(deep = FALSE)

Arguments:
deep
    Whether to make a deep clone.
References

Bergstra J, Bengio Y (2012). "Random Search for Hyper-Parameter Optimization." Journal of Machine Learning Research, 13(10), 281-305. https://jmlr.csail.mit.edu/papers/v13/bergstra12a.html.
Examples

library(paradox)

domain = ParamSet$new(list(ParamDbl$new("x", lower = -1, upper = 1)))
search_space = ParamSet$new(list(ParamDbl$new("x", lower = -1, upper = 1)))
codomain = ParamSet$new(list(ParamDbl$new("y", tags = "minimize")))

objective_function = function(xs) {
  list(y = as.numeric(xs)^2)
}

objective = ObjectiveRFun$new(fun = objective_function,
  domain = domain,
  codomain = codomain)

terminator = trm("evals", n_evals = 10)

instance = OptimInstanceSingleCrit$new(objective = objective,
  search_space = search_space,
  terminator = terminator)

optimizer = opt("random_search")

# Modifies the instance by reference
optimizer$optimize(instance)

# Returns the best scoring evaluation
instance$result

# Allows access to the data.table of all evaluations
instance$archive$data()