https://github.com/halacoded/hyperparameter-tuning
Instead of exhaustively checking every combination like Grid Search, or sampling at random like Random Search, Optuna uses Bayesian optimization: it learns from previous trials to find better settings faster. Part of the CODED Data Science Bootcamp.
- Host: GitHub
- URL: https://github.com/halacoded/hyperparameter-tuning
- Owner: halacoded
- Created: 2025-06-05T10:37:13.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-06-05T10:40:04.000Z (8 months ago)
- Last Synced: 2025-06-05T11:37:59.045Z (8 months ago)
- Topics: classification, coded, descision-tree, hyperparameter-tuning, kuwait, kuwait-codes, machine-learning, optun
- Language: Jupyter Notebook
- Homepage: https://colab.research.google.com/drive/1GiVP77A5iIJKOpJ01u2cultWzVezcpHY?usp=sharing
- Size: 10.7 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Hyperparameter-Tuning
Instead of exhaustively checking every combination like Grid Search, or sampling at random like Random Search, Optuna uses Bayesian optimization: it learns from previous trials to find better settings faster.