---
output: github_document
---

# mlr3filters

Package website: [release](https://mlr3filters.mlr-org.com/) | [dev](https://mlr3filters.mlr-org.com/dev/)

{mlr3filters} adds feature selection filters to [mlr3](https://mlr3.mlr-org.com).
The implemented filters can be used stand-alone, or as part of a machine learning pipeline in combination with
[mlr3pipelines](https://mlr3pipelines.mlr-org.com) and the [filter operator](https://mlr3pipelines.mlr-org.com/reference/mlr_pipeops_filter.html).

Wrapper methods for feature selection are implemented in [mlr3fselect](https://mlr3fselect.mlr-org.com).
Learners which support the extraction of feature importance scores can be combined with a filter from this package for embedded feature selection.

[![r-cmd-check](https://github.com/mlr-org/mlr3filters/actions/workflows/r-cmd-check.yml/badge.svg)](https://github.com/mlr-org/mlr3filters/actions/workflows/r-cmd-check.yml)
[![CRAN Status](https://www.r-pkg.org/badges/version-ago/mlr3filters)](https://cran.r-project.org/package=mlr3filters)
[![StackOverflow](https://img.shields.io/badge/stackoverflow-mlr3-orange.svg)](https://stackoverflow.com/questions/tagged/mlr3)
[![Mattermost](https://img.shields.io/badge/chat-mattermost-orange.svg)](https://lmmisld-lmu-stats-slds.srv.mwn.de/mlr_invite/)

## Installation

CRAN version

```{r eval = FALSE}
install.packages("mlr3filters")
```

Development version

```{r, eval = FALSE}
remotes::install_github("mlr-org/mlr3filters")
```

## Filters

### Filter Example

```{r}
set.seed(1)
library("mlr3")
library("mlr3filters")

task = tsk("sonar")
filter = flt("auc")
head(as.data.table(filter$calculate(task)))
```

### Implemented Filters

```{r echo = FALSE, message=FALSE}
library("mlr3misc")
library("mlr3filters")
library("data.table")

link_cran = function(pkg) {
  mlr3misc::map(pkg, function(.x) {
    mlr3misc::map_chr(.x, function(.y) {
      if (unlist(.y) %in% getOption("defaultPackages")) {
        .y
      } else {
        sprintf("[%1$s](https://cran.r-project.org/package=%1$s)", .y)
      }
    })
  })
}

tab = as.data.table(mlr_filters)[, !c("params", "task_properties")]
tab[, task_types := sapply(task_types, function(x) if (is_scalar_na(x)) "Universal" else paste(capitalize(x), collapse = " & "))]
tab[, feature_types := sapply(feature_types, function(x) paste(capitalize(x), collapse = ", "))]
tab[, packages := sapply(packages, function(x) paste(link_cran(x), collapse = ", "))]

# manually change the task type for specific filters
learner_based = c("performance", "permutation", "importance", "selected_features")
tab[key %in% learner_based, task_types := "Universal"]
tab[key %in% learner_based, packages := ""]

setnames(tab,
  old = c("key", "task_types", "feature_types", "packages"),
  new = c("Name", "Task Types", "Feature Types", "Package")
)

knitr::kable(tab, format = "markdown")
```

### Variable Importance Filters

The following learners allow the extraction of variable importance and therefore are supported by `FilterImportance`:

```{r echo=FALSE, warning=FALSE}
library("mlr3learners")
tab = as.data.table(mlr_learners)
tab[sapply(properties, is.element, el = "importance"), key]
```

If your learner is not listed here but is capable of extracting variable importance from the fitted model, the reason is most likely that it is not yet integrated into the package [mlr3learners](https://github.com/mlr-org/mlr3learners) or the extension package [mlr3extralearners](https://github.com/mlr-org/mlr3extralearners).
Please open an issue so we can add your package.

Some learners need to have their variable importance measure "activated" during learner creation.
For example, to use the "impurity" measure of Random Forest via the {ranger} package:

```{r}
task = tsk("iris")
lrn = lrn("classif.ranger", seed = 42)
lrn$param_set$values = list(importance = "impurity")

filter = flt("importance", learner = lrn)
filter$calculate(task)
head(as.data.table(filter), 3)
```

### Performance Filter

`FilterPerformance` is a univariate filter method which calls `resample()` separately for every predictor variable in the dataset and ranks the features by the resulting predictive performance, as estimated by the supplied measure.
Any learner can be passed to this filter, with `classif.rpart` being the default.
Regression learners can also be passed if the task is of type "regr".
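
A minimal sketch of how this might look on the `sonar` task, passing the default `classif.rpart` learner explicitly and leaving the filter's resampling and measure at their defaults (not part of the original README):

```{r eval = FALSE}
library("mlr3")
library("mlr3filters")

task = tsk("sonar")

# score each feature by resampling a decision tree on that feature alone
filter = flt("performance", learner = lrn("classif.rpart"))
filter$calculate(task)
head(as.data.table(filter))
```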

### Filter-based Feature Selection

In many cases filtering is only one step in the modeling pipeline.
To select features based on filter values, one can use [`PipeOpFilter`](https://mlr3pipelines.mlr-org.com/reference/mlr_pipeops_filter.html) from [mlr3pipelines](https://github.com/mlr-org/mlr3pipelines).

```{r, results='hide'}
library(mlr3pipelines)
task = tsk("spam")

# the `filter.frac` should be tuned
graph = po("filter", filter = flt("auc"), filter.frac = 0.5) %>>%
po("learner", lrn("classif.rpart"))

learner = as_learner(graph)
rr = resample(task, learner, rsmp("holdout"))
```
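
As noted in the comment above, `filter.frac` is a hyperparameter that should be tuned rather than fixed. A minimal sketch of how this could look with {mlr3tuning} (not part of the original example; it assumes a recent {mlr3tuning} version, and the parameter prefix `auc`, the search range, and the tuner settings are illustrative choices):

```{r eval = FALSE}
library(mlr3tuning)

# mark the filter fraction for tuning; the parameter is prefixed with the
# id of the filter pipe operator ("auc" here, taken from the filter's id)
learner$param_set$values$auc.filter.frac = to_tune(0.1, 1)

# illustrative choice: grid search with 3-fold cross-validation
instance = tune(
  tuner = tnr("grid_search", resolution = 5),
  task = task,
  learner = learner,
  resampling = rsmp("cv", folds = 3)
)
instance$result
```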