Tuning Spaces

The package mlr3tuningspaces ships with predefined tuning spaces for hyperparameter optimization. Each tuning space's manual page references the article from which it was extracted.
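The predefined spaces are registered in the `mlr_tuning_spaces` dictionary and can be browsed before choosing one. A minimal sketch, assuming mlr3tuningspaces is installed (the key `"classif.rpart.default"` is one of the shipped entries):

```r
library(mlr3tuningspaces)

# overview of all predefined tuning spaces with their keys and labels
as.data.table(mlr_tuning_spaces)

# retrieve a single tuning space by key and print its parameter ranges
lts("classif.rpart.default")
```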

Example Usage

Load a tuning space for the classification tree learner from the Bischl et al. (2021) article.

library(mlr3verse)

# load learner and set search space
learner = lts(lrn("classif.rpart"))

# retrieve task
task = tsk("pima")

# load tuner, set batch size and evaluation budget
tuner = tnr("random_search", batch_size = 10)

# hyperparameter tuning on the pima data set
instance = tune(
  tuner = tuner,
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 50
)

# best performing hyperparameter configuration
instance$result
         cp minbucket  minsplit learner_param_vals  x_domain classif.ce
      <num>     <num>     <num>             <list>    <list>      <num>
1: -9.21034   3.13079 0.6931472          <list[4]> <list[3]>  0.2460938
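Note that the values in `instance$result` are on the transformed scale of the tuning space (e.g. `cp` on the log scale); the `x_domain` column and `result_learner_param_vals` hold the values on the learner's original scale. A short sketch for inspecting the full optimization history stored in the instance's archive:

```r
# all evaluated configurations with their resampled performance
as.data.table(instance$archive)

# best configuration on the original (untransformed) parameter scale
instance$result_learner_param_vals
```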
# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)

print(learner)

── <LearnerClassifRpart> (classif.rpart): Classification Tree ──────────────────
• Model: rpart
• Parameters: cp=0.0001, minbucket=22, minsplit=2, xval=0
• Packages: mlr3 and rpart
• Predict Types: [response] and prob
• Feature Types: logical, integer, numeric, factor, and ordered
• Encapsulation: none (fallback: -)
• Properties: importance, missings, multiclass, selected_features, twoclass,
and weights
• Other settings: use_weights = 'use'
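The trained learner can now be used for prediction like any other mlr3 learner. A minimal sketch; predicting on the training task is for illustration only, since an unbiased performance estimate requires held-out data (e.g. nested resampling):

```r
# predict with the final model and score the predictions
prediction = learner$predict(task)
prediction$score(msr("classif.acc"))
```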

References

Bischl, Bernd, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, et al. 2021. “Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges.” arXiv:2107.05847 [Cs, Stat], July. http://arxiv.org/abs/2107.05847.