Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mainro/xgbtune
a library to tune xgboost models
automl automl-algorithms gradient-boosting hyperparameter-optimization hyperparameter-tuning machine-learning parameter-tuning tuning-parameters xgboost
Last synced: 8 days ago
- Host: GitHub
- URL: https://github.com/mainro/xgbtune
- Owner: MainRo
- Created: 2020-01-30T22:41:19.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2020-02-14T14:47:28.000Z (over 4 years ago)
- Last Synced: 2024-10-31T06:51:38.882Z (15 days ago)
- Topics: automl, automl-algorithms, gradient-boosting, hyperparameter-optimization, hyperparameter-tuning, machine-learning, parameter-tuning, tuning-parameters, xgboost
- Language: Python
- Size: 74.2 KB
- Stars: 8
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.rst
README

==========
XGBTune
==========

.. image:: https://badge.fury.io/py/xgbtune.svg
    :target: https://badge.fury.io/py/xgbtune

.. image:: https://github.com/mainro/xgbtune/workflows/Python%20package/badge.svg
    :target: https://github.com/mainro/xgbtune/actions?query=workflow%3A%22Python+package%22
    :alt: Github WorkFlows

.. image:: https://readthedocs.org/projects/xgbtune/badge/?version=latest
    :target: https://xgbtune.readthedocs.io/en/latest/?badge=latest
    :alt: Documentation Status

XGBTune is a library for automated XGBoost model tuning. Tuning an XGBoost
model is as simple as a single function call.

Get Started
============

.. code:: python

    from xgbtune import tune_xgb_model

    params, round_count = tune_xgb_model(params, x_train, y_train)
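The call above assumes ``params``, ``x_train``, and ``y_train`` are already
defined. A minimal self-contained sketch, where the base parameters and toy
data are illustrative placeholders rather than part of XGBTune:

.. code:: python

    import numpy as np
    import xgboost as xgb

    from xgbtune import tune_xgb_model

    # Toy regression data; replace with your own training set.
    x_train = np.random.rand(200, 5)
    y_train = np.random.rand(200)

    # Base parameters used as the starting point for tuning.
    params = {"objective": "reg:squarederror", "eval_metric": "rmse"}

    # tune_xgb_model returns the tuned parameter set and the best round count.
    params, round_count = tune_xgb_model(params, x_train, y_train)

    # Typical follow-up (not part of XGBTune): train the final model
    # with the tuned settings.
    model = xgb.train(params, xgb.DMatrix(x_train, label=y_train),
                      num_boost_round=round_count)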
Install
========

XGBTune is available on PyPI and can be installed with pip:

.. code:: console

    pip install xgbtune
Tuning steps
=============

The tuning is done in the following steps (a generic sketch follows at the
end of this section):

* compute best round
* tune max_depth and min_child_weight
* tune gamma
* re-compute best round
* tune subsample and colsample_bytree
* fine tune subsample and colsample_bytree
* tune alpha and lambda
* tune seed

These steps can be repeated several times. By default, two passes are done.
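As an illustration of how one such pass can be implemented, here is a generic
sketch built on ``xgboost.cv``; the helper names (``best_round``,
``tune_pair``) and the parameter grids are hypothetical and do not reflect
XGBTune's internals:

.. code:: python

    import itertools

    import xgboost as xgb

    def best_round(params, dtrain, max_rounds=500):
        # Cross-validate with early stopping; the number of rows kept in
        # the result is the best boosting round count.
        cv = xgb.cv(params, dtrain, num_boost_round=max_rounds,
                    nfold=3, early_stopping_rounds=20)
        return len(cv)

    def tune_pair(params, dtrain, name_a, grid_a, name_b, grid_b, rounds):
        # Grid-search two coupled parameters, keeping the best CV score.
        metric = params.get("eval_metric", "rmse")
        best_score = float("inf")
        best = (params.get(name_a), params.get(name_b))
        for a, b in itertools.product(grid_a, grid_b):
            trial = {**params, name_a: a, name_b: b}
            cv = xgb.cv(trial, dtrain, num_boost_round=rounds, nfold=3)
            score = cv[f"test-{metric}-mean"].iloc[-1]
            if score < best_score:
                best_score, best = score, (a, b)
        params[name_a], params[name_b] = best
        return params

    # One pass: best round first, then each parameter couple in order, e.g.
    # dtrain = xgb.DMatrix(x_train, label=y_train)
    # rounds = best_round(params, dtrain)
    # params = tune_pair(params, dtrain, "max_depth", [3, 5, 7],
    #                    "min_child_weight", [1, 3, 5], rounds)

Repeating the whole pass (twice by default, per the steps above) lets later
stages revisit earlier choices with the updated parameters.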