[![Python Versions](https://img.shields.io/pypi/pyversions/perpetual.svg?logo=python&logoColor=white)](https://pypi.org/project/perpetual)
[![PyPI Version](https://img.shields.io/pypi/v/perpetual.svg?logo=pypi&logoColor=white)](https://pypi.org/project/perpetual)
[![Crates.io Version](https://img.shields.io/crates/v/perpetual?logo=rust&logoColor=white)](https://crates.io/crates/perpetual)
[![Discord](https://img.shields.io/discord/1247650900214812692?logo=discord&cacheSeconds=10)](https://discord.gg/vADKk9Wr)

# Perpetual

## _A self-generalizing gradient boosting machine which doesn't need hyperparameter optimization_

PerpetualBooster is a gradient boosting machine (GBM) algorithm that, unlike other GBM algorithms, doesn't need hyperparameter optimization. Similar to AutoML libraries, it has a single `budget` parameter. Increasing the `budget` increases the predictive power of the algorithm and gives better results on unseen data. Start with a small budget (e.g. 1.0) and increase it (e.g. to 2.0) once you are confident in your features. If increasing the `budget` further brings no improvement, you are already extracting the most predictive power out of your data.
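A minimal sketch of that workflow, assuming a held-out validation set and scikit-learn's `mean_squared_error` for scoring (neither is prescribed by this README):

```python
from perpetual import PerpetualBooster
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# X, y: your feature matrix and target, as in the Usage example below.
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

scores = {}
for budget in (1.0, 1.5, 2.0):
    model = PerpetualBooster(objective="SquaredLoss")
    model.fit(X_train, y_train, budget=budget)
    scores[budget] = mean_squared_error(y_valid, model.predict(X_valid))

# Stop increasing the budget once the validation score stops improving.
print(scores)
```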

## Benchmark

Hyperparameter optimization usually takes around 100 iterations with plain GBM algorithms, whereas PerpetualBooster reaches the same accuracy in a single run. It therefore achieves roughly a 100x speed-up at the same accuracy, across different `budget` levels and datasets. Depending on the dataset, the speed-up may be slightly lower or significantly higher than 100x.

The following table summarizes the results for the [California Housing](https://scikit-learn.org/stable/modules/generated/sklearn.datasets.fetch_california_housing.html) dataset (regression):

| Perpetual budget | LightGBM n_estimators | Perpetual MSE | LightGBM MSE | Perpetual CPU time | LightGBM CPU time | Speed-up |
| ---------------- | --------------------- | ------------- | ------------ | ------------------ | ----------------- | -------- |
| 1.0 | 100 | 0.192 | 0.192 | 7.6 | 978 | 129x |
| 1.5 | 300 | 0.188 | 0.188 | 21.8 | 3066 | 141x |
| 2.1 | 1000 | 0.185 | 0.186 | 86.0 | 8720 | 101x |

The following table summarizes the results for the [Cover Types](https://scikit-learn.org/stable/modules/generated/sklearn.datasets.fetch_covtype.html) dataset (classification):

| Perpetual budget | LightGBM n_estimators | Perpetual log loss | LightGBM log loss | Perpetual CPU time | LightGBM CPU time | Speed-up |
| ---------------- | --------------------- | ------------------ | ----------------- | ------------------ | ----------------- | -------- |
| 1.0 | 100 | 0.089 | 0.084 | 1653 | 124958 | 76x |

You can reproduce the results using the scripts in the [examples](./python-package/examples) folder.
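For a rough sense of what the Perpetual side of such a run looks like, here is an illustrative sketch (not one of the bundled scripts; the LightGBM columns above come from a full hyperparameter search, which is omitted here):

```python
import time

from perpetual import PerpetualBooster
from sklearn.datasets import fetch_california_housing
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# California Housing regression, as in the first table above.
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

start = time.process_time()
model = PerpetualBooster(objective="SquaredLoss")
model.fit(X_train, y_train, budget=1.0)  # a single run, no hyperparameter search
cpu_seconds = time.process_time() - start

mse = mean_squared_error(y_test, model.predict(X_test))
print(f"mse={mse:.3f}  cpu={cpu_seconds:.1f}s")
```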

## Usage

You can use the algorithm as in the example below. Check the examples folders for both Rust and Python.

```python
from perpetual import PerpetualBooster

model = PerpetualBooster(objective="SquaredLoss")
model.fit(X, y, budget=1.0)
```
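
After fitting, predictions follow the familiar scikit-learn-style pattern, and classification works by switching the objective. A hedged sketch (the `LogLoss` objective name and the `predict_proba` call are assumptions, not taken from this README):

```python
# Score the fitted regression model from the example above.
predictions = model.predict(X)

# Classification: swap the objective (name assumed here) and fit the same way.
# X_cls, y_cls stand in for a classification dataset.
classifier = PerpetualBooster(objective="LogLoss")
classifier.fit(X_cls, y_cls, budget=1.0)
probabilities = classifier.predict_proba(X_cls)  # assumed, scikit-learn style
```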

## Documentation

Documentation for the Python API can be found [here](https://perpetual-ml.github.io/perpetual) and for the Rust API [here](https://docs.rs/perpetual/latest/perpetual/).

## Installation

The package can be installed directly from [PyPI](https://pypi.org/project/perpetual).

```shell
pip install perpetual
```

To use it in a Rust project, add the following to your Cargo.toml file to get the package from [crates.io](https://crates.io/crates/perpetual).

```toml
[dependencies]
perpetual = "0.3.8"
```

## Paper

PerpetualBooster prevents overfitting with a generalization algorithm. A paper explaining how the algorithm works is in progress. Check our [blog post](https://perpetual-ml.com/blog/how-perpetual-works) for a high-level introduction to the algorithm.