Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/christophM/rulefit
Python implementation of the rulefit algorithm
- Host: GitHub
- URL: https://github.com/christophM/rulefit
- Owner: christophM
- License: mit
- Created: 2015-10-16T15:56:52.000Z (about 9 years ago)
- Default Branch: master
- Last Pushed: 2023-10-08T11:50:48.000Z (about 1 year ago)
- Last Synced: 2024-10-30T20:03:53.512Z (about 2 months ago)
- Language: Python
- Size: 113 KB
- Stars: 410
- Watchers: 17
- Forks: 112
- Open Issues: 28
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-explainable-AI - https://github.com/christophM/rulefit
- awesome-datascience - RuleFit
README
! This package is no longer actively maintained. If you are interested in maintaining this package, please feel free to reach out to me via a GitHub issue !
# RuleFit
Implementation of a rule-based prediction algorithm, based on [the RuleFit algorithm from Friedman and Popescu (PDF)](http://statweb.stanford.edu/~jhf/ftp/RuleFit.pdf).

The algorithm can be used for predicting an output vector y given an input matrix X. In the first step, a tree ensemble is generated with gradient boosting. The trees are then used to form rules, where the path to each node in each tree forms one rule. A rule is a binary decision on whether an observation falls into a given node, which depends on the input features used in the splits. The ensemble of rules, together with the original input features, is then fed into an L1-regularized linear model (Lasso), which estimates the effect of each rule on the output target while shrinking many of those effects to zero.
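To make the idea concrete, here is a rough sketch of the same pipeline assembled from plain scikit-learn pieces. This is only an illustration of the concept, not this package's internals, and it simplifies rules to leaf-membership indicators instead of full node paths:

```python
# Conceptual sketch of the RuleFit idea (not this package's implementation):
# boosted trees generate "rules" (approximated here by leaf membership),
# which are combined with the raw features and fed into a Lasso.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import OneHotEncoder

X, y = make_regression(n_samples=500, n_features=10, noise=1.0, random_state=0)

# Step 1: tree ensemble via gradient boosting
gb = GradientBoostingRegressor(n_estimators=100, max_depth=3, learning_rate=0.01)
gb.fit(X, y)

# Step 2: leaf membership of each sample acts as a binary rule feature
leaves = gb.apply(X).reshape(X.shape[0], -1)
rule_features = OneHotEncoder(handle_unknown="ignore").fit_transform(leaves)

# Step 3: L1-regularized linear model over rules + original features;
# the Lasso shrinks many of the rule coefficients to exactly zero
design = np.hstack([X, rule_features.toarray()])
lasso = LassoCV(cv=5).fit(design, y)
print("non-zero coefficients:", np.sum(lasso.coef_ != 0))
```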
You can use rulefit for predicting a numeric response (categorical not yet implemented). The input has to be a numpy matrix with only numeric values.

## Installation
The latest version can be installed from the master branch using pip:
```
pip install git+https://github.com/christophM/rulefit.git
```

Another option is to clone the repository and install using `python setup.py install` or `python setup.py develop`.
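For example, the clone-based install might look like this (a standard git checkout of the master branch, nothing specific to this package):

```
git clone https://github.com/christophM/rulefit.git
cd rulefit
python setup.py install   # or: python setup.py develop
```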
## Usage
### Train your model:
```python
import numpy as np
import pandas as pd

from rulefit import RuleFit
boston_data = pd.read_csv("boston.csv", index_col=0)
y = boston_data.medv.values
X = boston_data.drop("medv", axis=1)
features = X.columns
X = X.values  # .as_matrix() was removed from newer pandas versions; .values returns the same numpy array

rf = RuleFit()
rf.fit(X, y, feature_names=features)
```
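If you don't have `boston.csv` at hand, the same calls work on any numeric matrix; a minimal sketch on synthetic data (the data below is made up purely for illustration):

```python
import numpy as np
from rulefit import RuleFit

# synthetic regression data, only to demonstrate the API
rng = np.random.RandomState(0)
X = rng.normal(size=(200, 5))
y = 2 * X[:, 0] + 3 * (X[:, 1] > 0) + rng.normal(scale=0.5, size=200)

rf = RuleFit()
rf.fit(X, y, feature_names=["f0", "f1", "f2", "f3", "f4"])
```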
If you want to influence the tree generator, you can pass it as an argument:
```python
from sklearn.ensemble import GradientBoostingRegressor
gb = GradientBoostingRegressor(n_estimators=500, max_depth=10, learning_rate=0.01)

rf = RuleFit(gb)
rf.fit(X, y, feature_names=features)
```
### Predict
```python
rf.predict(X)
```
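For a quick sanity check, the in-sample predictions can be compared against `y`, e.g. with scikit-learn's `mean_squared_error` (not part of this package):

```python
from sklearn.metrics import mean_squared_error

y_pred = rf.predict(X)
print("in-sample MSE:", mean_squared_error(y, y_pred))
```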
### Inspect rules:
```python
rules = rf.get_rules()

rules = rules[rules.coef != 0].sort_values("support", ascending=False)
print(rules)
```
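Sorting by the absolute coefficient instead of the support is another way to surface the most influential rules (a small variation on the snippet above):

```python
rules = rf.get_rules()
rules = rules[rules.coef != 0]
rules["abs_coef"] = rules.coef.abs()
print(rules.sort_values("abs_coef", ascending=False).head(10))
```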
## Notes
- In contrast to the original paper, the generated trees are always fitted with the same maximum depth. In the original implementation, the maximum tree depth is drawn from a distribution for each tree.
- This implementation is in progress. If you find a bug, don't hesitate to contact me.

## Changelog
All notable changes to this project will be documented here.

### [v0.3] - IN PROGRESS
- Set default of `exclude_zero_coef` to False in `get_rules()`
- Syntax fix (Issue 21)

### [v0.2] - 2017-11-24
- Introduces classification for RuleFit
- Adds scaling of variables (Friedscale)
- Allows random size trees for creating rules

### [v0.1] - 2016-06-18
- Start changelog and versions