Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/grachale/predict_pass_exam
Creating an AdaBoost classifier with decision trees to predict whether a student will pass or fail an exam (classification) based on the number of study hours and their score on the previous exam.
- Host: GitHub
- URL: https://github.com/grachale/predict_pass_exam
- Owner: grachale
- Created: 2024-01-21T19:04:28.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-01-21T19:13:47.000Z (10 months ago)
- Last Synced: 2024-01-28T13:39:53.781Z (10 months ago)
- Topics: adaboost, cross-validation, decision-tree, jupyter-notebook, matplotlib, python, scikit-learn, seaborn
- Language: Jupyter Notebook
- Homepage:
- Size: 127 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Classification
## Data Source
We will focus on predicting whether a student will pass or fail an exam based on the number of study hours and their score on the previous exam. The training data are in the file `data/student_exam_data.csv`.

## Hyperparameters
- `n_estimators`: The number of weak learners (trees) to train. Increasing the number of estimators generally improves the model's performance, but also increases the computational cost.
- `learning_rate`: Shrinks the contribution of each weak learner. Values below 1 slow down training and can help prevent overfitting.

## Model
We will use the AdaBoost classifier with decision trees from scikit-learn. AdaBoost (Adaptive Boosting) is an ensemble learning method that combines the predictions of multiple weak learners (in our case, decision trees) into a single strong learner; each successive tree is trained with more weight on the examples the previous trees misclassified.