Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ecpolley/SuperLearner
Current version of the SuperLearner R package
- Host: GitHub
- URL: https://github.com/ecpolley/SuperLearner
- Owner: ecpolley
- Created: 2011-04-16T05:18:51.000Z (over 13 years ago)
- Default Branch: master
- Last Pushed: 2024-02-19T19:35:42.000Z (9 months ago)
- Last Synced: 2024-11-08T06:47:51.233Z (6 days ago)
- Language: R
- Size: 860 KB
- Stars: 271
- Watchers: 18
- Forks: 72
- Open Issues: 16
Metadata Files:
- Readme: README.md
# SuperLearner: Prediction model ensembling method
[![CRAN_Status_Badge](http://www.r-pkg.org/badges/version/SuperLearner)](http://cran.r-project.org/web/packages/SuperLearner)
[![Downloads](http://cranlogs.r-pkg.org/badges/SuperLearner)](http://cran.rstudio.com/package=SuperLearner)
[![codecov](https://codecov.io/gh/ecpolley/SuperLearner/branch/master/graph/badge.svg)](https://codecov.io/gh/ecpolley/SuperLearner)

This is the current version of the SuperLearner R package (version 2.*).
**Features**
* Automatic optimal predictor ensembling via cross-validation with one line of code.
* Dozens of algorithms: XGBoost, Random Forest, GBM, Lasso, SVM, BART, KNN, Decision Trees, Neural Networks, and more.
* Integrates with [caret](http://github.com/topepo/caret) to support even more algorithms.
* Includes framework to quickly add custom algorithms to the ensemble.
* Visualize the performance of each algorithm using built-in plotting.
* Easily check multiple hyperparameter configurations for each algorithm in the ensemble.
* Add new algorithms or change the default parameters for existing ones.
* Screen variables (feature selection) based on univariate association, Random Forest, Elastic Net, and more, or use custom screening algorithms.
* Multicore and multinode parallelization for scalability.
* External cross-validation to estimate the performance of the ensemble predictor.
* Ensemble can optimize for any target metric: mean-squared error, AUC, log likelihood, etc.
* Includes framework to provide custom loss functions and stacking algorithms.

### Install the development version from GitHub:
```r
# install.packages("remotes")
remotes::install_github("ecpolley/SuperLearner")
```

### Install the current release from CRAN:
```r
install.packages("SuperLearner")
```

[devtools]: https://github.com/hadley/devtools
[remotes]: https://cran.r-project.org/web/packages/remotes/index.html
[CRAN]: https://cran.r-project.org/web/packages/SuperLearner/index.html

## Examples
SuperLearner makes it trivial to run many algorithms and use the best one or an ensemble.
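As a minimal sketch of that one-line interface (illustrative only, not from the original README; it assumes the SuperLearner and MASS packages are installed):

```r
library(SuperLearner)

# Boston housing data; medv (column 14) is the outcome.
data(Boston, package = "MASS")
set.seed(1)

# A tiny two-learner ensemble: OLS plus the mean-only benchmark.
fit = SuperLearner(Y = Boston$medv, X = Boston[, -14],
                   SL.library = c("SL.lm", "SL.mean"))

# Ensemble weights and cross-validated risk for each learner.
fit$coef
fit$cvRisk
```

The cross-validated risks show SL.lm dominating the mean-only learner, and the ensemble weights reflect that.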
```r
data(Boston, package = "MASS")

set.seed(1)
sl_lib = c("SL.xgboost", "SL.randomForest", "SL.glmnet", "SL.nnet", "SL.ksvm",
           "SL.bartMachine", "SL.kernelKnn", "SL.rpartPrune", "SL.lm", "SL.mean")

# Fit XGBoost, RF, Lasso, Neural Net, SVM, BART, K-nearest neighbors, Decision Tree,
# OLS, and simple mean; create automatic ensemble.
result = SuperLearner(Y = Boston$medv, X = Boston[, -14], SL.library = sl_lib)

# Review performance of each algorithm and ensemble weights.
result

# Use external (aka nested) cross-validation to estimate ensemble accuracy.
# This will take a while to run.
result2 = CV.SuperLearner(Y = Boston$medv, X = Boston[, -14], SL.library = sl_lib)

# Plot performance of individual algorithms and compare to the ensemble.
library(ggplot2)  # plot() returns a ggplot object, so ggplot2 must be attached.
plot(result2) + theme_minimal()

# Hyperparameter optimization --
# Fit elastic net with 5 different alphas: 0, 0.25, 0.5, 0.75, 1.
# 0 corresponds to ridge and 1 to lasso.
enet = create.Learner("SL.glmnet", detailed_names = TRUE,
                      tune = list(alpha = seq(0, 1, length.out = 5)))

sl_lib2 = c("SL.mean", "SL.lm", enet$names)
enet_sl = SuperLearner(Y = Boston$medv, X = Boston[, -14], SL.library = sl_lib2)
# Identify the best-performing alpha value or use the automatic ensemble.
enet_sl
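
# --- Illustrative sketch, not in the original README ---
# Apply the fitted ensemble to new observations with predict.SuperLearner;
# $pred holds the ensemble prediction (the rows here are hypothetical "new" data).
pred = predict(enet_sl, newdata = Boston[1:5, -14], onlySL = TRUE)
pred$pred

# External CV (CV.SuperLearner above) also accepts parallel = "multicore"
# or parallel = "snow" to spread the folds across cores or cluster nodes.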
```

For more detailed examples, please review the vignette:
```r
vignette(package = "SuperLearner")
```

## References
Polley, E. C. and van der Laan, M. J. (2010) Super Learner in Prediction. U.C. Berkeley Division of Biostatistics Working Paper Series, Paper 226.

van der Laan, M. J., Polley, E. C. and Hubbard, A. E. (2007) Super Learner. Statistical Applications in Genetics and Molecular Biology, 6, Article 25.

van der Laan, M. J. and Rose, S. (2011) Targeted Learning: Causal Inference for Observational and Experimental Data. Springer.