---
output: github_document
---

```{r, include = FALSE}
lgr::get_logger("mlr3")$set_threshold("warn")
lgr::get_logger("bbotk")$set_threshold("warn")
set.seed(1)
options(
  datatable.print.nrows = 10,
  datatable.print.class = FALSE,
  datatable.print.keys = FALSE,
  width = 100)
# mute load messages
library("mlr3tuning")
```

# mlr3tuning
Package website: [release](https://mlr3tuning.mlr-org.com/) | [dev](https://mlr3tuning.mlr-org.com/dev/)
[![r-cmd-check](https://github.com/mlr-org/mlr3tuning/actions/workflows/r-cmd-check.yml/badge.svg)](https://github.com/mlr-org/mlr3tuning/actions/workflows/r-cmd-check.yml)
[![CRAN Status](https://www.r-pkg.org/badges/version-ago/mlr3tuning)](https://cran.r-project.org/package=mlr3tuning)
[![StackOverflow](https://img.shields.io/badge/stackoverflow-mlr3-orange.svg)](https://stackoverflow.com/questions/tagged/mlr3)
[![Mattermost](https://img.shields.io/badge/chat-mattermost-orange.svg)](https://lmmisld-lmu-stats-slds.srv.mwn.de/mlr_invite/)

*mlr3tuning* is the hyperparameter optimization package of the [mlr3](https://mlr-org.com/) ecosystem.
It features highly configurable search spaces via the [paradox](https://github.com/mlr-org/paradox) package and finds optimal hyperparameter configurations for any mlr3 [learner](https://github.com/mlr-org/mlr3learners).
mlr3tuning works with several optimization algorithms, e.g. Random Search, Iterated Racing, Bayesian Optimization (in [mlr3mbo](https://github.com/mlr-org/mlr3mbo)), and Hyperband (in [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband)).
Moreover, it can [automatically](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-autotuner) optimize learners and estimate the performance of optimized models with [nested resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
The package is built on the optimization framework [bbotk](https://github.com/mlr-org/bbotk).

## Extension packages
mlr3tuning is extended by the following packages; a short sketch of how an extension plugs in follows the list.
* [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a collection of search spaces from scientific articles for commonly used learners.
* [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the Hyperband and Successive Halving algorithms.
* [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian Optimization methods.
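
Once an extension package is loaded, its tuners are available through the same `tnr()` interface. A minimal sketch, assuming mlr3hyperband is installed (`eta` is the halving rate):

```{r, eval = FALSE}
library(mlr3hyperband)

# mlr3hyperband registers its tuners in the tuner dictionary on load
tnr("hyperband", eta = 2)
```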
## Resources

There are several sections about hyperparameter optimization in the [mlr3book](https://mlr3book.mlr-org.com).
* Getting started with [hyperparameter optimization](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html).
* An overview of all tuners can be found on our [website](https://mlr-org.com/tuners.html).
* [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning) a support vector machine on the Sonar data set.
* Learn about [tuning spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
* Estimate the model performance with [nested resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
* Learn about [multi-objective optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).

The [gallery](https://mlr-org.com/gallery-all-optimization.html) features a collection of case studies and demos about optimization.
* Learn more advanced methods with the [Practical Tuning Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
* Optimize an rpart classification tree with only a [few lines of code](https://mlr-org.com/gallery/optimization/2022-11-10-hyperparameter-optimization-on-the-palmer-penguins/).
* Simultaneously optimize hyperparameters and use [early stopping](https://mlr-org.com/gallery/optimization/2022-11-04-early-stopping-with-xgboost/) with XGBoost.
* Make use of proven [search spaces](https://mlr-org.com/gallery/optimization/2021-07-06-introduction-to-mlr3tuningspaces/).
* Learn about [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/) models.
* Run the [default hyperparameter configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/) of learners as a baseline.
* Use the [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/) optimizer with different budget parameters.

The [cheatsheet](https://cheatsheets.mlr-org.com/mlr3tuning.pdf) summarizes the most important functions of mlr3tuning.
## Installation
Install the latest release from CRAN:
```{r eval = FALSE}
install.packages("mlr3tuning")
```

Install the development version from GitHub:
```{r eval = FALSE}
remotes::install_github("mlr-org/mlr3tuning")
```

## Examples
We optimize the `cost` and `gamma` hyperparameters of a support vector machine on the [Sonar](https://mlr3.mlr-org.com/reference/mlr_tasks_sonar.html) data set.
```{r}
library("mlr3learners")
library("mlr3tuning")learner = lrn("classif.svm",
cost = to_tune(1e-5, 1e5, logscale = TRUE),
gamma = to_tune(1e-5, 1e5, logscale = TRUE),
kernel = "radial",
type = "C-classification"
)
```
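
The `to_tune()` tokens define the search space directly on the learner; with `logscale = TRUE`, the tuner searches on a log-transformed scale and the values are transformed back before training. A quick check of the resulting search space, using paradox's `search_space()` method:

```{r, eval = FALSE}
# search space generated from the to_tune() tokens
learner$param_set$search_space()
```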
We construct a tuning instance with the `ti()` function. The tuning instance describes the tuning problem.

```{r}
instance = ti(
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("none")
)
instance
```
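
We pass `trm("none")` because grid search evaluates a finite grid anyway. For tuners without a natural end point, a budget terminator would typically be set instead; a sketch with arbitrarily chosen values:

```{r, eval = FALSE}
# stop after 20 evaluations ...
trm("evals", n_evals = 20)

# ... or after 60 seconds of run time
trm("run_time", secs = 60)
```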
We select a simple grid search as the optimization algorithm.

```{r}
tuner = tnr("grid_search", resolution = 5)
tuner
```
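
Any other tuner from the `tnr()` dictionary can be used in the same way. A sketch with random search (`batch_size` controls how many configurations are evaluated per batch; the value here is arbitrary):

```{r, eval = FALSE}
tnr("random_search", batch_size = 10)
```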
To start the tuning, we simply pass the tuning instance to the tuner.

```{r}
tuner$optimize(instance)
```

The tuner returns the best hyperparameter configuration and the corresponding measured performance.
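
Both are also accessible directly on the instance; a quick sketch:

```{r, eval = FALSE}
# hyperparameter values of the best configuration
instance$result_learner_param_vals

# best observed performance
instance$result_y
```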
The archive contains all evaluated hyperparameter configurations.
```{r}
as.data.table(instance$archive)[, .(cost, gamma, classif.ce, batch_nr, resample_result)]
```

The [mlr3viz](https://mlr3viz.mlr-org.com/) package visualizes tuning results.
```{r, eval=FALSE}
library(mlr3viz)

autoplot(instance, type = "surface")
```

```{r, eval=FALSE, echo=FALSE}
p = autoplot(instance, type = "surface")
ggplot2::ggsave(filename = "plot.png", plot = p, units = "px", width = 600, height = 500, scale = 5)
```

We fit a final model with optimized hyperparameters to make predictions on new data.
```{r}
learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("sonar"))
```