https://github.com/philiptitus/stroke-prediction
This project utilizes three decision tree algorithms to build stroke prediction models.
- Host: GitHub
- URL: https://github.com/philiptitus/stroke-prediction
- Owner: philiptitus
- License: mit
- Created: 2025-03-09T10:21:52.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-03-09T10:24:08.000Z (3 months ago)
- Last Synced: 2025-03-09T11:23:10.761Z (3 months ago)
- Topics: decision-tree-classifier, decision-trees, hyperparameter-tuning, random-forest-classifier, sickit-learn, supervised-learning, xgboost, xgboost-algorithm
- Language: Jupyter Notebook
- Homepage:
- Size: 178 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: Readme.md
- License: LICENSE
README
# 🧠 Stroke Prediction Models
This project utilizes Kaggle's stroke prediction dataset to develop and compare three decision tree-based machine learning models:
1. **XGBoost**
2. **Random Forest**
3. **Decision Tree (scikit-learn)**
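The notebook itself is not reproduced in this listing, but the sketch below shows how these three classifiers are typically instantiated with scikit-learn and xgboost. The hyperparameter values are illustrative placeholders, not the tuned settings used in `model.ipynb`:

```python
# Illustrative instantiation of the three decision tree-based models.
# Hyperparameter values are placeholders, not the tuned settings from model.ipynb.
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

models = {
    "Decision Tree": DecisionTreeClassifier(max_depth=5, random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1, eval_metric="logloss"),
}
```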
## 📂 Project Structure

- 📜 **`model.ipynb`** – Jupyter Notebook containing the implementation and comparison of the three models.
- 📜 **`README.md`** – Project documentation.
- 📜 **`requirements.txt`** – List of dependencies required to run the project.
## ⚙️ Installation

Ensure you have Python installed, then install the required dependencies using:
```bash
pip install -r requirements.txt
```
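The exact contents of `requirements.txt` are not shown in this listing; given the libraries the project advertises, it most likely includes at least the following packages (any version pins are omitted here):

```
pandas
numpy
scikit-learn
xgboost
jupyter
```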
## 🚀 Usage

1. Open **`model.ipynb`** in **Jupyter Notebook** or **JupyterLab**.
2. Run the notebook cells to train, evaluate, and compare the models.
3. Analyze the results and accuracy metrics.
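The notebook's cells are not reproduced here; the sketch below illustrates the kind of train/evaluate/compare workflow these steps describe. The file name `healthcare-dataset-stroke-data.csv` and the preprocessing choices are assumptions for illustration, not necessarily what `model.ipynb` actually does:

```python
# Illustrative train/evaluate/compare workflow; not the exact code in model.ipynb.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

# Assumed file name for the Kaggle stroke dataset; adjust to your local copy.
df = pd.read_csv("healthcare-dataset-stroke-data.csv")

# Minimal, assumed preprocessing: drop the id column, fill missing BMI values,
# and one-hot encode the categorical columns.
df = df.drop(columns=["id"])
df["bmi"] = df["bmi"].fillna(df["bmi"].median())
df = pd.get_dummies(df, drop_first=True, dtype=int)

X = df.drop(columns=["stroke"])
y = df["stroke"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1, eval_metric="logloss"),
}

# Train each model and report test-set accuracy for comparison.
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```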
## 📊 Conclusion

Based on the accuracy results:
✅ **XGBoost** provides the highest accuracy.
✅ **Random Forest** performs well but is slightly less accurate than XGBoost.
⚠️ **Decision Tree** has the lowest accuracy among the three models.

## 📁 Dataset
The dataset used in this project is the **Stroke Prediction Dataset** from Kaggle. You can access it [here](https://www.kaggle.com).
## 📜 License
This project is licensed under the **MIT License**.