Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mlr-org/mlrmbo
Toolbox for Bayesian Optimization and Model-Based Optimization in R
- Host: GitHub
- URL: https://github.com/mlr-org/mlrmbo
- Owner: mlr-org
- License: other
- Created: 2013-10-23T01:59:42.000Z (about 11 years ago)
- Default Branch: main
- Last Pushed: 2023-09-14T18:33:20.000Z (over 1 year ago)
- Last Synced: 2024-12-11T08:05:04.203Z (11 days ago)
- Topics: bayesian-optimization, black-box-optimization, hyperparameter-optimization, mlr, mlrmbo, model-based-optimization, optimization, r, r-package
- Language: R
- Homepage: https://mlrmbo.mlr-org.com
- Size: 77.5 MB
- Stars: 187
- Watchers: 31
- Forks: 47
- Open Issues: 95
Metadata Files:
- Readme: README.Rmd
- License: LICENSE
README
---
output: github_document
---

```{r setup, include=FALSE}
library(knitr)
library(gifski)
opts_knit$set(upload.fun = imgur_upload, base.url = NULL) # upload all images to imgur.com
opts_chunk$set(fig.width = 5, fig.height = 5, cache = TRUE)
```

# mlrMBO
Package website: [mlrmbo.mlr-org.com](https://mlrmbo.mlr-org.com/)
Model-based optimization with [mlr](https://github.com/mlr-org/mlr/).
[![tic](https://github.com/mlr-org/mlrMBO/workflows/tic/badge.svg?branch=master)](https://github.com/mlr-org/mlrMBO/actions)
[![CRAN_Status_Badge](https://www.r-pkg.org/badges/version/mlrMBO)](https://cran.r-project.org/package=mlrMBO)
[![Coverage Status](https://img.shields.io/codecov/c/github/mlr-org/mlrMBO/master.svg)](https://codecov.io/github/mlr-org/mlrMBO?branch=master)
[![Monthly RStudio CRAN Downloads](https://cranlogs.r-pkg.org/badges/mlrMBO)](https://CRAN.R-project.org/package=mlrMBO)

* [Documentation](https://mlrmbo.mlr-org.com/)
* [Issues, Requests and Bug Tracker](https://github.com/mlr-org/mlrMBO/issues)

# Installation
We recommend installing the official release version:
```{r, eval = FALSE}
install.packages("mlrMBO")
```

For experimental use, you can install the latest development version:
```{r, eval = FALSE}
remotes::install_github("mlr-org/mlrMBO")
```

# Introduction
```{r animation, message = FALSE, warning = FALSE, echo=FALSE, eval=TRUE, fig.width=7, fig.height=4, animation.hook='gifski'}
set.seed(2)
library(ggplot2)
library(mlrMBO)
library(animation)
configureMlr(show.learner.output = FALSE)
pause = interactive()
set.seed(1)
fn = makeCosineMixtureFunction(1)
obj.fun = convertToMinimization(fn)
# mbo control with defaults
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10L)
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritEI(), opt = "focussearch", opt.focussearch.points = 500L, opt.restarts = 1L)
design = generateDesign(5L, getParamSet(obj.fun), fun = lhs::maximinLHS)
run = exampleRun(obj.fun, design = design,
  control = ctrl, points.per.dim = 1000, show.info = TRUE)

for (i in 1:10) {
plotExampleRun(run, iters = i, pause = pause, densregion = TRUE, gg.objects = list(theme_bw()))
}
```

`mlrMBO` is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions.
Features:
* EGO-type algorithms (Kriging with expected improvement) on purely numerical search spaces, see [Jones et al. (1998)](https://link.springer.com/article/10.1023/A:1008306431147)
* Mixed search spaces with numerical, integer, categorical and subordinate parameters
* Arbitrary parameter transformations, allowing optimization on, e.g., a log scale
* Optimization of noisy objective functions
* Multi-criteria optimization with approximated Pareto fronts
* Parallelization through multi-point batch proposals
* Parallelization on many parallel back-ends and clusters through [batchtools](https://github.com/mllg/batchtools) and [parallelMap](https://github.com/mlr-org/parallelMap)

For the *surrogate*, `mlrMBO` allows any regression learner from [`mlr`](https://github.com/mlr-org/mlr), including:
* Kriging, a.k.a. Gaussian processes (e.g. via `DiceKriging`)
* Random forests (e.g. via `randomForest`)
* and many more...

Various *infill criteria* (a.k.a. _acquisition functions_) are available:
* Expected improvement (EI)
* Upper/Lower confidence bound (LCB, aka. statistical lower or upper bound)
* Augmented expected improvement (AEI)
* Expected quantile improvement (EQI)
* API for custom infill criteria

Objective functions are created with package [smoof](https://github.com/jakobbossek/smoof), which also offers many test functions for example runs or benchmarks.
Parameter spaces and initial designs are created with package [ParamHelpers](https://github.com/mlr-org/ParamHelpers).
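As a minimal sketch of how these pieces fit together — a smoof objective, a ParamHelpers design, an mlr surrogate learner, and the control object — consider the following (the toy sine objective, its bounds, and the Kriging learner choice are illustrative assumptions, not taken from the package docs):

```{r, eval = FALSE}
library(mlrMBO)  # attaches mlr, smoof and ParamHelpers

# Objective: a simple 1-d test function defined with smoof
obj.fun = makeSingleObjectiveFunction(
  name = "Sine",
  fn = function(x) sin(x),
  par.set = makeNumericParamSet("x", len = 1L, lower = 0, upper = 2 * pi)
)

# Initial design via ParamHelpers (Latin hypercube sampling)
des = generateDesign(n = 5L, par.set = getParamSet(obj.fun), fun = lhs::maximinLHS)

# Surrogate: any mlr regression learner that predicts uncertainty,
# here Kriging via DiceKriging
surrogate = makeLearner("regr.km", predict.type = "se")

# Control: 10 iterations, expected improvement as infill criterion
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10L)
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritEI())

res = mbo(obj.fun, design = des, learner = surrogate, control = ctrl)
res$x  # best point found
res$y  # best objective value
```

Swapping the surrogate for, e.g., a random forest or changing the infill criterion only requires replacing the `makeLearner()` or `setMBOControlInfill()` call.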
# How to Cite
Please cite our [arXiv paper](https://arxiv.org/abs/1703.03373) (preprint).
You can get citation info via `citation("mlrMBO")` or copy the following BibTeX entry:

```bibtex
@article{mlrMBO,
title = {{{mlrMBO}}: {{A Modular Framework}} for {{Model}}-{{Based Optimization}} of {{Expensive Black}}-{{Box Functions}}},
url = {https://arxiv.org/abs/1703.03373},
shorttitle = {{{mlrMBO}}},
archivePrefix = {arXiv},
eprinttype = {arxiv},
eprint = {1703.03373},
primaryClass = {stat},
author = {Bischl, Bernd and Richter, Jakob and Bossek, Jakob and Horn, Daniel and Thomas, Janek and Lang, Michel},
date = {2017-03-09},
}
```

Some parts of the package were created as part of other publications.
If you use these parts, please cite the relevant work appropriately:

* Multi-point proposals, including the new multi-objective infill criteria: [MOI-MBO: Multiobjective Infill for Parallel Model-Based Optimization](https://doi.org/10.1007/978-3-319-09584-4_17)
* Multi-objective optimization: [Model-Based Multi-objective Optimization: Taxonomy, Multi-Point Proposal, Toolbox and Benchmark](https://doi.org/10.1007/978-3-319-15934-8_5)
* Multi-objective optimization with categorical variables using the random forest as a surrogate: [Multi-objective parameter configuration of machine learning algorithms using model-based optimization](https://doi.org/10.1109/SSCI.2016.7850221)