Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Random Forest or XGBoost? It is Time to Explore LCE
https://github.com/localcascadeensemble/lce
- Host: GitHub
- URL: https://github.com/localcascadeensemble/lce
- Owner: LocalCascadeEnsemble
- License: apache-2.0
- Created: 2022-04-13T08:26:37.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2023-08-15T20:32:20.000Z (over 1 year ago)
- Last Synced: 2024-10-18T13:15:40.509Z (3 months ago)
- Topics: classification, data-science, machine-learning, python, regression, scikit-learn-api
- Language: Python
- Homepage: https://lce.readthedocs.io/
- Size: 300 KB
- Stars: 67
- Watchers: 5
- Forks: 8
- Open Issues: 2
Metadata Files:
- Readme: README.rst
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE
- Code of conduct: .github/CODE_OF_CONDUCT.md
Awesome Lists containing this project
README
| **Local Cascade Ensemble (LCE)** is a *high-performing*, *scalable* and *user-friendly* machine learning method for the general tasks of **Classification** and **Regression**.
| In particular, LCE:
- Enhances the prediction performance of Random Forest and XGBoost by combining their strengths and adopting a complementary diversification approach
- Supports parallel processing to ensure scalability
- Handles missing data by design
- Adopts the scikit-learn API for ease of use
- Adheres to scikit-learn conventions to allow interaction with scikit-learn pipelines and model selection tools (see the sketch after this list)
- Is released in open source and commercially usable - Apache 2.0 license
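A minimal sketch of the scikit-learn interoperability claimed above: ``LCEClassifier`` dropped into a standard pipeline and evaluated with cross-validation. The scaler, the Iris dataset, and the cross-validation settings are illustrative choices, not part of the original README.

.. code-block:: python

    from lce import LCEClassifier
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Build a pipeline: a preprocessing step followed by the LCE classifier
    data = load_iris()
    pipe = make_pipeline(StandardScaler(), LCEClassifier(n_jobs=-1, random_state=0))

    # Standard scikit-learn model selection tooling works on the pipeline as a whole
    scores = cross_val_score(pipe, data.data, data.target, cv=5)
    print("Mean CV accuracy: {:.3f}".format(scores.mean()))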
Getting Started
===============

This section presents a quick start tutorial showing snippets for you to try out LCE.
Installation
------------

You can install LCE from PyPI with ``pip``::

    pip install lcensemble

Or with ``conda``::

    conda install -c conda-forge lcensemble
First Example on Iris Dataset
-----------------------------

LCEClassifier accuracy on an Iris test set:
.. code-block:: python

    from lce import LCEClassifier
    from sklearn.datasets import load_iris
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Load data and generate a train/test split
    data = load_iris()
    X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, random_state=0)

    # Train LCEClassifier with default parameters
    clf = LCEClassifier(n_jobs=-1, random_state=0)
    clf.fit(X_train, y_train)

    # Make prediction and compute accuracy score
    y_pred = clf.predict(X_test)
    accuracy = accuracy_score(y_test, y_pred)
    print("Accuracy: {:.1f}%".format(accuracy*100))
.. code-block::

    Accuracy: 97.4%
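As a follow-up sketch illustrating the missing-data handling mentioned in the feature list, the snippet below assumes the package also exposes ``LCERegressor`` with the same scikit-learn-style API as ``LCEClassifier``; the Diabetes dataset, the roughly 10% missing-value mask, and the metric are illustrative choices, not part of the original README.

.. code-block:: python

    import numpy as np
    from lce import LCERegressor
    from sklearn.datasets import load_diabetes
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # Load data and blank out roughly 10% of the entries to simulate missing values
    data = load_diabetes()
    X, y = data.data.copy(), data.target
    rng = np.random.RandomState(0)
    X[rng.rand(*X.shape) < 0.1] = np.nan

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # No imputation step: LCE is designed to handle the NaNs directly
    reg = LCERegressor(n_jobs=-1, random_state=0)
    reg.fit(X_train, y_train)
    print("Test MSE: {:.0f}".format(mean_squared_error(y_test, reg.predict(X_test))))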
Documentation
=============

LCE documentation, including API documentation and general examples, can be found at https://lce.readthedocs.io/.
Contribute to LCE
=================

Your valuable contribution will help make this package more powerful and better for the community.
There are multiple ways to participate; check out the contributing guide in ``.github/CONTRIBUTING.md``!
Reference Papers
================

LCE originated from research at Inria, France.
Here are the reference papers:

.. [1] Fauvel, K., E. Fromont, V. Masson, P. Faverdin and A. Termier. LCE: An Augmented Combination of Bagging and Boosting in Python. arXiv, 2023
.. [2] Fauvel, K., E. Fromont, V. Masson, P. Faverdin and A. Termier. XEM: An Explainable-by-Design Ensemble Method for Multivariate Time Series Classification. Data Mining and Knowledge Discovery, 36(3):917–957, 2022
.. [3] Fauvel, K., V. Masson, E. Fromont, P. Faverdin and A. Termier. Towards Sustainable Dairy Management - A Machine Learning Enhanced Method for Estrus Detection. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019
If you use LCE, we would appreciate citations.
Contact
=======

If you have any questions, you can contact me (Kevin Fauvel).