# mlr3tuning
mlr3tuning is the hyperparameter optimization package of the mlr3 ecosystem. It features highly configurable search spaces via the paradox package and finds optimal hyperparameter configurations for any mlr3 learner. mlr3tuning works with several optimization algorithms, e.g. Random Search, Iterated Racing, Bayesian Optimization (in mlr3mbo) and Hyperband (in mlr3hyperband). Moreover, it can automatically optimize learners and estimate the performance of optimized models with nested resampling. The package is built on the optimization framework bbotk.
mlr3tuning is extended by the following packages.
- mlr3tuningspaces is a collection of search spaces from scientific articles for commonly used learners.
- mlr3hyperband adds the Hyperband and Successive Halving algorithms.
- mlr3mbo adds Bayesian Optimization methods.
## Resources

There are several sections about hyperparameter optimization in the mlr3book.
- Getting started with hyperparameter optimization.
- An overview of all tuners can be found on our website.
- Tune a support vector machine on the Sonar data set.
- Learn about tuning spaces.
- Estimate the model performance with nested resampling.
- Learn about multi-objective optimization.
- Simultaneously optimize hyperparameters and use early stopping with XGBoost.
The gallery features a collection of case studies and demos about optimization.
- Learn more advanced methods with the Practical Tuning Series.
- Learn about hotstarting models.
- Run the default hyperparameter configuration of learners as a baseline.
- Use the Hyperband optimizer with different budget parameters.
The cheatsheet summarizes the most important functions of mlr3tuning.
## Installation

Install the latest release from CRAN:
    install.packages("mlr3tuning")
Install the development version from GitHub:
    remotes::install_github("mlr-org/mlr3tuning")
## Examples

We optimize the `cost` and `gamma` hyperparameters of a support vector machine on the Sonar data set.
library("mlr3learners")library("mlr3tuning")learner= lrn("classif.svm",cost= to_tune(1e-5,1e5,logscale=TRUE),gamma= to_tune(1e-5,1e5,logscale=TRUE),kernel="radial",type="C-classification")
We construct a tuning instance with the `ti()` function. The tuning instance describes the tuning problem.
    instance = ti(
      task = tsk("sonar"),
      learner = learner,
      resampling = rsmp("cv", folds = 3),
      measures = msr("classif.ce"),
      terminator = trm("none")
    )
    instance
    ## <TuningInstanceBatchSingleCrit>
    ## * State: Not optimized
    ## * Objective: <ObjectiveTuningBatch:classif.svm_on_sonar>
    ## * Search Space:
    ##        id    class     lower    upper nlevels
    ## 1:  cost ParamDbl -11.51293 11.51293     Inf
    ## 2: gamma ParamDbl -11.51293 11.51293     Inf
    ## * Terminator: <TerminatorNone>
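The printed bounds are on the log scale: `to_tune(1e-5, 1e5, logscale = TRUE)` tunes over `[log(1e-5), log(1e5)]` and applies `exp()` to proposed values before training. A quick check in base R:

    log(1e-5)  # -11.51293, the lower bound printed above
    log(1e5)   #  11.51293, the upper bound printed above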
We select a simple grid search as the optimization algorithm.
    tuner = tnr("grid_search", resolution = 5)
    tuner
    ## <TunerBatchGridSearch>: Grid Search
    ## * Parameters: batch_size=1, resolution=5
    ## * Parameter classes: ParamLgl, ParamInt, ParamDbl, ParamFct
    ## * Properties: dependencies, single-crit, multi-crit
    ## * Packages: mlr3tuning, bbotk
To start the tuning, we simply pass the tuning instance to the tuner.
    tuner$optimize(instance)
    ##        cost     gamma learner_param_vals  x_domain classif.ce
    ## 1: 5.756463 -5.756463          <list[4]> <list[2]>  0.1828847
The tuner returns the best hyperparameter configuration and the corresponding measured performance.
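The result is also accessible on the instance itself. A short sketch, assuming a current mlr3tuning version:

    instance$result_learner_param_vals  # optimized hyperparameters on the original scale
    instance$result_y                   # best observed classif.ce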
The archive contains all evaluated hyperparameter configurations.
    as.data.table(instance$archive)[, .(cost, gamma, classif.ce, batch_nr, resample_result)]
    ##           cost      gamma classif.ce batch_nr  resample_result
    ##  1:  -5.756463   5.756463  0.4663216        1 <ResampleResult>
    ##  2:   5.756463  -5.756463  0.1828847        2 <ResampleResult>
    ##  3:  11.512925   5.756463  0.4663216        3 <ResampleResult>
    ##  4:   5.756463  11.512925  0.4663216        4 <ResampleResult>
    ##  5: -11.512925 -11.512925  0.4663216        5 <ResampleResult>
    ## ---
    ## 21:  -5.756463  -5.756463  0.4663216       21 <ResampleResult>
    ## 22:  11.512925  11.512925  0.4663216       22 <ResampleResult>
    ## 23: -11.512925  11.512925  0.4663216       23 <ResampleResult>
    ## 24:  11.512925  -5.756463  0.1828847       24 <ResampleResult>
    ## 25:   0.000000  -5.756463  0.2402346       25 <ResampleResult>
The mlr3viz package visualizes tuning results.
    library(mlr3viz)

    autoplot(instance, type = "surface")
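Other plot types are available as well; for example, assuming a recent mlr3viz version, a parallel coordinates plot or the performance over batches:

    autoplot(instance, type = "parallel")     # parallel coordinates of the configurations
    autoplot(instance, type = "performance")  # classif.ce across evaluated batches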
We fit a final model with optimized hyperparameters to make predictions on new data.
    learner$param_set$values = instance$result_learner_param_vals
    learner$train(tsk("sonar"))
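The manual steps above can be bundled with `auto_tuner()`, which wraps a learner and a tuner into a single learner that tunes itself during `$train()`. A minimal sketch reusing the settings from the example (assumption: current mlr3tuning and mlr3 versions); passing it to `resample()` yields the nested resampling estimate mentioned at the beginning:

    at = auto_tuner(
      tuner = tnr("grid_search", resolution = 5),
      learner = lrn("classif.svm",
        cost = to_tune(1e-5, 1e5, logscale = TRUE),
        gamma = to_tune(1e-5, 1e5, logscale = TRUE),
        kernel = "radial",
        type = "C-classification"
      ),
      resampling = rsmp("cv", folds = 3),
      measure = msr("classif.ce"),
      terminator = trm("none")
    )

    # Outer resampling gives an unbiased performance estimate of the tuned learner
    rr = resample(tsk("sonar"), at, rsmp("cv", folds = 3))
    rr$aggregate(msr("classif.ce"))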