Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
distributed-systems gbdt gbm gbrt machine-learning xgboost
Last synced: 3 days ago
JSON representation
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
- Host: GitHub
- URL: https://github.com/dmlc/xgboost
- Owner: dmlc
- License: apache-2.0
- Created: 2014-02-06T17:28:03.000Z (almost 11 years ago)
- Default Branch: master
- Last Pushed: 2024-12-04T07:52:09.000Z (8 days ago)
- Last Synced: 2024-12-04T08:25:07.614Z (8 days ago)
- Topics: distributed-systems, gbdt, gbm, gbrt, machine-learning, xgboost
- Language: C++
- Homepage: https://xgboost.readthedocs.io/en/stable/
- Size: 31.1 MB
- Stars: 26,366
- Watchers: 908
- Forks: 8,736
- Open Issues: 461
Metadata Files:
- Readme: README.md
- Changelog: NEWS.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Citation: CITATION
- Security: SECURITY.md
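A record like the one above can be retrieved programmatically from the ecosyste.ms API. A minimal sketch, assuming the public repos API path scheme (`repos.ecosyste.ms/api/v1/...`); the exact endpoint layout is an assumption and should be checked against the service's documentation:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

def repo_record_url(host: str, full_name: str) -> str:
    """Build the lookup URL for a repository record.

    The host/path scheme is an assumption based on the public
    ecosyste.ms repos API; adjust if the service layout differs.
    """
    return (
        "https://repos.ecosyste.ms/api/v1/hosts/"
        f"{quote(host)}/repositories/{quote(full_name, safe='')}"
    )

url = repo_record_url("GitHub", "dmlc/xgboost")
# Uncomment to fetch the live JSON record (network call); the response
# contains fields such as "full_name", "stargazers_count" and "language":
# record = json.load(urlopen(url))
```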
Awesome Lists containing this project
- awesome-data-science-viz - XGboost
- awesome-python-machine-learning-resources - XGBoost (GitHub · 5% open issues · ⏱️ 25.08.2022) (Machine Learning Frameworks)
- awesome-llmops - XGBoost (Training / Frameworks for Training)
- awesome-llm-eval - XGBoost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library. (Frameworks-for-Training / Popular-LLM)
- awesome-python-machine-learning - XGBoost - XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. (Uncategorized / Uncategorized)
- awesome-github-star - xgboost
- awesome-list - XGBoost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library. (Machine Learning Framework / General Purpose Framework)
- awesome-machine-learning-resources - XGBoost (Library)
- StarryDivineSky - dmlc/xgboost
- trackawesomelist - XGBoost (⭐26k)
- awesome-production-machine-learning - XGBoost - XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. (Optimized Computation)
- awesome-datascience - XGBoost
- awesome-r - xgboost - Extreme Gradient Boosting, a scalable and fast machine learning library. (Libraries / Machine Learning)
README
eXtreme Gradient Boosting
===========

[![Build Status](https://badge.buildkite.com/aca47f40a32735c00a8550540c5eeff6a4c1d246a580cae9b0.svg?branch=master)](https://buildkite.com/xgboost/xgboost-ci)
[![XGBoost-CI](https://github.com/dmlc/xgboost/workflows/XGBoost-CI/badge.svg?branch=master)](https://github.com/dmlc/xgboost/actions)
[![Documentation Status](https://readthedocs.org/projects/xgboost/badge/?version=latest)](https://xgboost.readthedocs.org)
[![GitHub license](https://dmlc.github.io/img/apache2.svg)](./LICENSE)
[![CRAN Status Badge](https://www.r-pkg.org/badges/version/xgboost)](https://cran.r-project.org/web/packages/xgboost)
[![PyPI version](https://badge.fury.io/py/xgboost.svg)](https://pypi.python.org/pypi/xgboost/)
[![Conda version](https://img.shields.io/conda/vn/conda-forge/py-xgboost.svg)](https://anaconda.org/conda-forge/py-xgboost)
[![Optuna](https://img.shields.io/badge/Optuna-integrated-blue)](https://optuna.org)
[![Twitter](https://img.shields.io/badge/@XGBoostProject--_.svg?style=social&logo=twitter)](https://twitter.com/XGBoostProject)
[![OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/dmlc/xgboost/badge)](https://api.securityscorecards.dev/projects/github.com/dmlc/xgboost)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/comet-examples/blob/master/integrations/model-training/xgboost/notebooks/how_to_use_comet_with_xgboost_tutorial.ipynb)

[Community](https://xgboost.ai/community) |
[Documentation](https://xgboost.readthedocs.org) |
[Resources](demo/README.md) |
[Contributors](CONTRIBUTORS.md) |
[Release Notes](https://xgboost.readthedocs.io/en/latest/changes/index.html)

XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible*** and ***portable***.
It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way.
The same code runs on all major distributed environments (Kubernetes, Hadoop, SGE, Dask, Spark, PySpark) and can scale beyond billions of examples.

License
-------
© Contributors, 2021. Licensed under an [Apache-2](https://github.com/dmlc/xgboost/blob/master/LICENSE) license.

Contribute to XGBoost
---------------------
XGBoost has been developed and used by a group of active community members. Your help is valuable in making the package better for everyone.
Check out the [Community Page](https://xgboost.ai/community).

Reference
---------
- Tianqi Chen and Carlos Guestrin. [XGBoost: A Scalable Tree Boosting System](https://arxiv.org/abs/1603.02754). In Proceedings of the 22nd ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
- XGBoost originates from a research project at the University of Washington.

Sponsors
--------
Become a sponsor and get a logo here. See details at [Sponsoring the XGBoost Project](https://xgboost.ai/sponsors). The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

## Open Source Collective sponsors
[![Backers on Open Collective](https://opencollective.com/xgboost/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/xgboost/sponsors/badge.svg)](#sponsors)

### Sponsors
[[Become a sponsor](https://opencollective.com/xgboost#sponsor)]

### Backers
[[Become a backer](https://opencollective.com/xgboost#backer)]