Successive Halving and Hyperband in the mlr3 ecosystem
- Host: GitHub
- URL: https://github.com/mlr-org/mlr3hyperband
- Owner: mlr-org
- License: lgpl-3.0
- Created: 2019-09-11T12:36:02.000Z (over 5 years ago)
- Default Branch: main
- Last Pushed: 2024-04-26T10:40:38.000Z (8 months ago)
- Last Synced: 2024-05-01T09:37:22.074Z (8 months ago)
- Topics: automl, bbotk, hyperband, hyperparameter-tuning, machine-learning, mlr3, optimization, r, tune, tuning
- Language: R
- Homepage: https://mlr3hyperband.mlr-org.com
- Size: 10 MB
- Stars: 18
- Watchers: 12
- Forks: 5
- Open Issues: 7
Metadata Files:
- Readme: README.Rmd
- License: LICENSE
README
---
output: github_document
bibliography: references.bib
---

```{r, include = FALSE}
library(mlr3misc)
library(utils)
library(mlr3tuningspaces)
library(data.table)
source("R/bibentries.R")
writeLines(toBibtex(bibentries), "references.bib")

lgr::get_logger("mlr3")$set_threshold("warn")
lgr::get_logger("bbotk")$set_threshold("warn")
set.seed(0)
options(
  datatable.print.nrows = 10,
  datatable.print.class = FALSE,
  datatable.print.keys = FALSE,
  datatable.print.trunc.cols = TRUE,
  width = 100)

# mute load messages
library(bbotk)
library(mlr3verse)
library(mlr3hyperband)
library(mlr3learners)
```

# mlr3hyperband

Package website: [release](https://mlr3hyperband.mlr-org.com/) | [dev](https://mlr3hyperband.mlr-org.com/dev/)
[![r-cmd-check](https://github.com/mlr-org/mlr3hyperband/actions/workflows/r-cmd-check.yml/badge.svg)](https://github.com/mlr-org/mlr3hyperband/actions/workflows/r-cmd-check.yml)
[![CRAN Status](https://www.r-pkg.org/badges/version-ago/mlr3hyperband)](https://cran.r-project.org/package=mlr3hyperband)
[![StackOverflow](https://img.shields.io/badge/stackoverflow-mlr3-orange.svg)](https://stackoverflow.com/questions/tagged/mlr3)
[![Mattermost](https://img.shields.io/badge/chat-mattermost-orange.svg)](https://lmmisld-lmu-stats-slds.srv.mwn.de/mlr_invite/)

*mlr3hyperband* adds the optimization algorithms Successive Halving [@jamieson_2016] and Hyperband [@li_2018] to the [mlr3](https://mlr-org.com/) ecosystem.
The implementation in mlr3hyperband features improved scheduling and parallelizes the evaluation of configurations.
The package includes tuners for hyperparameter optimization in [mlr3tuning](https://github.com/mlr-org/mlr3tuning) and optimizers for black-box optimization in [bbotk](https://github.com/mlr-org/bbotk).
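The same algorithms can also be applied to arbitrary black-box functions via bbotk. Below is a minimal sketch of that interface on a made-up toy objective (the function, its bounds, and all settings are illustrative only, and the instance class name may differ slightly across bbotk versions); as in the tuning example further down, the fidelity parameter is tagged with `"budget"`.

```{r, eval = FALSE}
library(bbotk)
library(paradox)
library(mlr3hyperband)

# toy objective: the result gets more reliable as the budget grows
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = (xs$x - 1)^2 + 1 / xs$budget),
  domain = ps(
    x = p_dbl(-5, 5),
    budget = p_int(1, 81, tags = "budget")
  ),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

# Hyperband stops on its own after one full run, so no extra terminator is needed
instance = OptimInstanceSingleCrit$new(
  objective = objective,
  terminator = trm("none")
)

opt("hyperband", eta = 3)$optimize(instance)
instance$result
```
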
## Resources

There are several sections about hyperparameter optimization in the [mlr3book](https://mlr3book.mlr-org.com).
The [gallery](https://mlr-org.com/gallery.html) features a series of case studies on Hyperband.
* [Tune](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/) the hyperparameters of XGBoost with Hyperband.
* Use data [subsampling](https://mlr-org.com/gallery/series/2023-01-16-hyperband-subsampling/) and Hyperband to optimize a support vector machine.

## Installation

Install the latest release from CRAN:
```{r, eval = FALSE}
install.packages("mlr3hyperband")
```

Install the development version from GitHub:
```{r, eval = FALSE}
remotes::install_github("mlr-org/mlr3hyperband")
```

## Examples

We optimize the hyperparameters of an XGBoost model on the [Sonar](https://mlr3.mlr-org.com/reference/mlr_tasks_sonar.html) data set.
The number of boosting rounds `nrounds` is the fidelity parameter.
We tag this parameter with `"budget"` in the search space.

```{r}
library(mlr3hyperband)
library(mlr3learners)

learner = lrn("classif.xgboost",
  nrounds = to_tune(p_int(27, 243, tags = "budget")),
  eta = to_tune(1e-4, 1, logscale = TRUE),
  max_depth = to_tune(1, 20),
  colsample_bytree = to_tune(1e-1, 1),
  colsample_bylevel = to_tune(1e-1, 1),
  lambda = to_tune(1e-3, 1e3, logscale = TRUE),
  alpha = to_tune(1e-3, 1e3, logscale = TRUE),
  subsample = to_tune(1e-1, 1)
)
```

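Hyperband splits the budget range into brackets, each of which starts a successive halving run from a different initial budget. The base-R sketch below is our own illustration of the standard schedule (not code from the package, and the exact counts in mlr3hyperband may differ slightly due to rounding) for the bounds 27 to 243 with `eta = 3`.

```{r, eval = FALSE}
r_min = 27; r_max = 243; eta = 3

s_max = floor(log(r_max / r_min, base = eta))  # index of the largest bracket
schedule = data.frame()
for (s in s_max:0) {
  # configurations sampled at the start of bracket s
  n = ceiling((s_max + 1) * eta^s / (s + 1))
  for (i in 0:s) {
    schedule = rbind(schedule, data.frame(
      bracket = s,
      stage = i,
      n_configs = floor(n / eta^i),         # survivors after i promotions
      budget = r_min * eta^(s_max - s + i)  # budget per configuration
    ))
  }
}
schedule
```
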
We use the `tune()` function to run the optimization.

```{r}
instance = tune(
  tnr("hyperband", eta = 3),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce")
)
```

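The configurations of a stage are independent of each other, so their evaluation can be parallelized by registering a [future](https://future.futureverse.org/) backend before calling `tune()`; the backend and the number of workers below are just an example.

```{r, eval = FALSE}
# evaluate configurations on 4 local worker processes
future::plan("multisession", workers = 4)
```
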
The instance contains the best-performing hyperparameter configuration.

```{r}
instance$result
```

The archive contains all evaluated hyperparameter configurations.
Hyperband adds the `"stage"` and `"bracket"` columns.

```{r}
as.data.table(instance$archive)[, .(stage, bracket, classif.ce, nrounds)]
```

We fit a final model with optimized hyperparameters to make predictions on new data.
```{r}
learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("sonar"))
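# e.g. predict (here on the training task for illustration; use learner$predict_newdata() for new data)
predictions = learner$predict(tsk("sonar"))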
```

## References