{"id":30284947,"url":"https://github.com/byteb8/hypertuning","last_synced_at":"2025-08-16T19:38:45.798Z","repository":{"id":305329794,"uuid":"781580198","full_name":"byteB8/HyperTuning","owner":"byteB8","description":"Different techniques to tune the hyperparameter of machine learning models. ","archived":false,"fork":false,"pushed_at":"2024-04-03T17:17:19.000Z","size":73,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-07-19T16:57:11.045Z","etag":null,"topics":["bayesian-optimization","gridsearchcv","hyperopt","hyperparameter-optimization","hyperparameter-tuning","optuna","randomizedsearchcv"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/byteB8.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-04-03T16:46:16.000Z","updated_at":"2024-04-03T17:21:17.000Z","dependencies_parsed_at":"2025-07-19T16:57:16.433Z","dependency_job_id":"c937fc3a-da1c-4191-8453-05731396ae91","html_url":"https://github.com/byteB8/HyperTuning","commit_stats":null,"previous_names":["byteb8/hypertuning"],"tags_count":null,"template":false,"template_full_name":null,"purl":"pkg:github/byteB8/HyperTuning","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/byteB8%2FHyperTuning","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/byteB8%2FHyperTuning/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/byteB8%2FHyperTuning/releases","manifests_url"
:"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/byteB8%2FHyperTuning/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/byteB8","download_url":"https://codeload.github.com/byteB8/HyperTuning/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/byteB8%2FHyperTuning/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":270763346,"owners_count":24641017,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-16T02:00:11.002Z","response_time":91,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["bayesian-optimization","gridsearchcv","hyperopt","hyperparameter-optimization","hyperparameter-tuning","optuna","randomizedsearchcv"],"created_at":"2025-08-16T19:38:40.929Z","updated_at":"2025-08-16T19:38:45.789Z","avatar_url":"https://github.com/byteB8.png","language":"Python","readme":"# Model Hyperparameter Tuning Techniques Comparison\n\nThis project compares various hyperparameter tuning techniques for optimizing machine learning models. It includes implementations of different tuning methods using popular libraries such as scikit-learn, scikit-optimize (skopt), Hyperopt, and Optuna, as well as manual grid search and random search.\n\n## Techniques Implemented\n\n1. 
**Bayesian Optimization with skopt (`bayesian_search.py`)**:\n   - Utilizes Gaussian process-based Bayesian optimization for hyperparameter tuning.\n   - Defines a custom optimization function using `gp_minimize` from skopt.\n   - Searches the parameter space for RandomForestClassifier hyperparameters.\n   - Uses 5-fold cross-validation for evaluation.\n\n2. **Custom Scoring with scikit-learn (`custom_scoring.py`)**:\n   - Implements hyperparameter tuning using scikit-learn's `RandomizedSearchCV`.\n   - Defines a custom parameter grid and scoring metric (`accuracy`) for RandomForestClassifier.\n   - Performs randomized search over the parameter grid using cross-validation.\n\n3. **Grid Search with scikit-learn (`grid_search.py`)**:\n   - Implements hyperparameter tuning using scikit-learn's `GridSearchCV`.\n   - Searches through a predefined grid of hyperparameters for RandomForestClassifier.\n   - Evaluates model performance using 5-fold cross-validation.\n\n4. **Hyperopt Optimization (`hyperopt_search.py`)**:\n   - Utilizes the Tree-structured Parzen Estimator (TPE) algorithm for hyperparameter optimization.\n   - Defines a search space and optimization function using the Hyperopt library.\n   - Searches for optimal hyperparameters for RandomForestClassifier using 5-fold cross-validation.\n\n5. **Optuna Optimization (`optuna_search.py`)**:\n   - Implements hyperparameter optimization using the Optuna library.\n   - Defines an objective function and search space for RandomForestClassifier hyperparameters.\n   - Searches for optimal hyperparameters using Bayesian optimization with the TPE sampler.\n\n6. 
**Random Search with scikit-learn (`random_search.py`)**:\n   - Performs hyperparameter tuning using scikit-learn's `RandomizedSearchCV`.\n   - Searches randomly across the parameter space for RandomForestClassifier.\n   - Utilizes 5-fold cross-validation for evaluation.\n\n## Dataset\n- The project utilizes the `train.csv` dataset for training machine learning models.\n- The dataset contains features and a target variable (`price_range`).\n\n## Getting Started\n1. Clone the repository.\n2. Ensure you have the required dependencies installed (`pandas`, `numpy`, `scikit-learn`, `scikit-optimize`, `hyperopt`, `optuna`, `dlib`).\n3. Run each Python script to see the implementation of a different hyperparameter tuning technique.\n4. Compare the results obtained by the different methods to understand their performance.\n\n## Conclusion\n- Compare the performance and efficiency of each hyperparameter tuning method based on its results.\n- Consider factors such as computational resources, tuning time, and optimization effectiveness when choosing a method for model tuning in practical applications.\n\n\n🙂 Feel free to contribute, provide feedback, or suggest improvements to the project!\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbyteb8%2Fhypertuning","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbyteb8%2Fhypertuning","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbyteb8%2Fhypertuning/lists"}