Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/34j/boost-loss
Utilities for easy use of custom losses in CatBoost, LightGBM, XGBoost.
- Host: GitHub
- URL: https://github.com/34j/boost-loss
- Owner: 34j
- License: mit
- Created: 2023-05-21T04:30:31.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-10-21T17:06:59.000Z (3 months ago)
- Last Synced: 2024-10-22T06:40:17.312Z (3 months ago)
- Topics: autograd, catboost, custom-loss, custom-loss-functions, evaluation-metrics, gbdt, gradient-boosting, hacktoberfest, lightgbm, pytorch, scikit-learn, sklearn, sklearn-compatible, xgboost
- Language: Python
- Homepage:
- Size: 413 KB
- Stars: 8
- Watchers: 2
- Forks: 0
- Open Issues: 13
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Code of conduct: .github/CODE_OF_CONDUCT.md
- Security: .github/SECURITY.md
Awesome Lists containing this project
README
# Boost Loss
Utilities for easy use of custom losses in CatBoost, LightGBM, XGBoost. This sounds very simple, but in reality it took a lot of work.
## Installation
Install this via pip (or your favourite package manager):
```shell
pip install boost-loss
```

## Usage
### Basic Usage
```python
import numpy as np
from numpy.typing import NDArray

from boost_loss import LossBase


class L2Loss(LossBase):
    def loss(self, y_true: NDArray, y_pred: NDArray) -> NDArray:
        return (y_true - y_pred) ** 2 / 2

    def grad(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # dL/dy_pred
        return -(y_true - y_pred)

    def hess(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # d^2L/dy_pred^2
        return np.ones_like(y_true)
```

```python
import lightgbm as lgb

# load_boston was removed in scikit-learn 1.2; load_diabetes is a drop-in regression dataset
from sklearn.datasets import load_diabetes

from boost_loss import apply_custom_loss

X, y = load_diabetes(return_X_y=True)
apply_custom_loss(lgb.LGBMRegressor(), L2Loss()).fit(X, y)
```

Built-in losses are available. [^bokbokbok]
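Before plugging a hand-written loss into a booster, it is worth sanity-checking that `grad` and `hess` agree with finite differences of `loss`. The standalone NumPy sketch below mirrors the `L2Loss` methods defined above (it is an illustration, not part of the boost-loss API):

```python
import numpy as np
from numpy.typing import NDArray


# Standalone copies of the L2Loss formulas, for checking only
def loss(y_true: NDArray, y_pred: NDArray) -> NDArray:
    return (y_true - y_pred) ** 2 / 2


def grad(y_true: NDArray, y_pred: NDArray) -> NDArray:
    return -(y_true - y_pred)


def hess(y_true: NDArray, y_pred: NDArray) -> NDArray:
    return np.ones_like(y_true)


rng = np.random.default_rng(0)
y_true = rng.normal(size=5)
y_pred = rng.normal(size=5)
eps = 1e-4

# Central finite differences of the elementwise loss
fd_grad = (loss(y_true, y_pred + eps) - loss(y_true, y_pred - eps)) / (2 * eps)
fd_hess = (
    loss(y_true, y_pred + eps) - 2 * loss(y_true, y_pred) + loss(y_true, y_pred - eps)
) / eps**2

assert np.allclose(fd_grad, grad(y_true, y_pred))
assert np.allclose(fd_hess, hess(y_true, y_pred))
```

The same check works for any custom loss; a mismatch here tends to surface later as a booster that silently fails to converge.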
```python
from boost_loss.regression import LogCoshLoss
```

### [`torch.autograd`](https://pytorch.org/docs/stable/autograd.html) Loss [^autograd]
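Writing `grad` and `hess` by hand gets tedious for complicated losses; the idea here is that `torch.autograd` can derive both from the loss alone. A standalone sketch of that mechanism in plain PyTorch (illustration only, independent of the boost-loss API):

```python
import torch


# Elementwise L2 loss, summed to a scalar so autograd can differentiate it
def loss_fn(y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
    return ((y_true - y_pred) ** 2 / 2).sum()


y_true = torch.tensor([1.0, 2.0, 3.0])
y_pred = torch.tensor([1.5, 1.0, 2.0], requires_grad=True)

# First derivative dL/dy_pred; keep the graph so it can be differentiated again
(grad,) = torch.autograd.grad(loss_fn(y_true, y_pred), y_pred, create_graph=True)

# Diagonal of the Hessian: grad[i] depends only on y_pred[i],
# so differentiating grad.sum() recovers d^2L/dy_pred[i]^2
(hess,) = torch.autograd.grad(grad.sum(), y_pred)

print(grad)  # -(y_true - y_pred), i.e. [0.5, -1.0, -1.0]
print(hess)  # all ones, matching the hand-written hess above
```

With `TorchLossBase`, as in the snippet below, only `loss_torch` needs to be defined and the derivatives are obtained automatically.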
```python
import torch

from boost_loss.torch import TorchLossBase


class L2LossTorch(TorchLossBase):
    def loss_torch(self, y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
        return (y_true - y_pred) ** 2 / 2
```

## Contributors ✨
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!
[^bokbokbok]: Inspired by [orchardbirds/bokbokbok](https://github.com/orchardbirds/bokbokbok)
[^autograd]: Inspired by [TomerRonen34/treeboost_autograd](https://github.com/TomerRonen34/treeboost_autograd)