Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
A hyperparameter optimization framework
https://github.com/optuna/optuna
distributed hacktoberfest hyperparameter-optimization machine-learning parallel python
Last synced: 11 days ago
- Host: GitHub
- URL: https://github.com/optuna/optuna
- Owner: optuna
- License: other
- Created: 2018-02-21T06:12:56.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2024-06-13T08:25:59.000Z (5 months ago)
- Last Synced: 2024-06-13T08:46:26.676Z (5 months ago)
- Topics: distributed, hacktoberfest, hyperparameter-optimization, machine-learning, parallel, python
- Language: Python
- Homepage: https://optuna.org
- Size: 18.4 MB
- Stars: 9,965
- Watchers: 117
- Forks: 971
- Open Issues: 90
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- my-awesome-starred - optuna/optuna - A hyperparameter optimization framework (Python)
- awesome-llmops - Optuna (AutoML / Profiling)
- awesome-sciml - optuna/optuna: A hyperparameter optimization framework
- awesome-list - Optuna - An automatic hyperparameter optimization software framework, particularly designed for machine learning. (Machine Learning Framework / Hyperparameter Search & Gradient-Free Optimization)
- awesome-production-machine-learning - Optuna - Optuna is an automatic hyperparameter optimisation software framework, particularly designed for machine learning. (AutoML)
- StarryDivineSky - optuna/optuna
- my-awesome - optuna/optuna - hyperparameter-optimization, machine-learning, parallel, python (pushed: 2024-10, stars: 10.8k, forks: 1.0k) - A hyperparameter optimization framework (Python)
- awesome-python-machine-learning-resources - GitHub (7% open issues, updated 26.08.2022) (Hyperparameter Optimization and AutoML)
README
# Optuna: A hyperparameter optimization framework
[![Python](https://img.shields.io/badge/python-3.7%20%7C%203.8%20%7C%203.9%20%7C%203.10%20%7C%203.11%20%7C%203.12-blue)](https://www.python.org)
[![pypi](https://img.shields.io/pypi/v/optuna.svg)](https://pypi.python.org/pypi/optuna)
[![conda](https://img.shields.io/conda/vn/conda-forge/optuna.svg)](https://anaconda.org/conda-forge/optuna)
[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/optuna/optuna)
[![Read the Docs](https://readthedocs.org/projects/optuna/badge/?version=stable)](https://optuna.readthedocs.io/en/stable/)
[![Codecov](https://codecov.io/gh/optuna/optuna/branch/master/graph/badge.svg)](https://codecov.io/gh/optuna/optuna)

:link: [**Website**](https://optuna.org/)
| :page_with_curl: [**Docs**](https://optuna.readthedocs.io/en/stable/)
| :gear: [**Install Guide**](https://optuna.readthedocs.io/en/stable/installation.html)
| :pencil: [**Tutorial**](https://optuna.readthedocs.io/en/stable/tutorial/index.html)
| :bulb: [**Examples**](https://github.com/optuna/optuna-examples)
| [**Twitter**](https://twitter.com/OptunaAutoML)
| [**LinkedIn**](https://www.linkedin.com/showcase/optuna/)
| [**Medium**](https://medium.com/optuna)

*Optuna* is an automatic hyperparameter optimization software framework, particularly designed
for machine learning. It features an imperative, *define-by-run* style user API. Thanks to our
*define-by-run* API, code written with Optuna enjoys high modularity, and users can dynamically
construct search spaces for hyperparameters.

## :loudspeaker: News
* **Oct 21, 2024**: We posted [an article](https://medium.com/optuna/an-introduction-to-moea-d-and-examples-of-multi-objective-optimization-comparisons-8630565a4e89) introducing [MOEA/D](https://hub.optuna.org/samplers/moead/) and an example comparison with other optimization methods.
* **Oct 15, 2024**: We posted [an article](https://medium.com/optuna/introducing-a-new-terminator-early-termination-of-black-box-optimization-based-on-expected-9a660774fcdb) about `Terminator`, which is expanded in Optuna 4.0.
* **Sep 18, 2024**: We posted [an article](https://medium.com/optuna/introducing-the-stabilized-journalstorage-in-optuna-4-0-from-mechanism-to-use-case-e320795ffb61) about `JournalStorage`, which is stabilized in Optuna 4.0.
* **Sep 2, 2024**: Optuna 4.0 is available! You can install it by `pip install -U optuna`. Find the latest [here](https://github.com/optuna/optuna/releases) and check [our article](https://medium.com/optuna/optuna-4-0-whats-new-in-the-major-release-3325a8420d10).
* **Aug 30, 2024**: We posted [an article](https://medium.com/optuna/optunahub-a-feature-sharing-platform-for-optuna-now-available-in-official-release-4b99efe9934d) about the official release of [OptunaHub](https://hub.optuna.org/).
* **Aug 28, 2024**: We posted [an article](https://medium.com/optuna/a-natural-gradient-based-optimization-algorithm-registered-on-optunahub-0dbe17cb0f7d) about [implicit natural gradient optimization (`INGO`)](https://hub.optuna.org/samplers/implicit_natural_gradient/), a sampler newly supported in [OptunaHub](https://hub.optuna.org/).

## :fire: Key Features

Optuna has modern functionalities as follows:
- [Lightweight, versatile, and platform agnostic architecture](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/001_first.html)
- Handle a wide variety of tasks with a simple installation that has few requirements.
- [Pythonic search spaces](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/002_configurations.html)
- Define search spaces using familiar Python syntax including conditionals and loops.
- [Efficient optimization algorithms](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/003_efficient_optimization_algorithms.html)
- Adopt state-of-the-art algorithms for sampling hyperparameters and efficiently pruning unpromising trials (a minimal pruning sketch follows this list).
- [Easy parallelization](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/004_distributed.html)
- Scale studies to tens or hundreds of workers with little or no changes to the code.
- [Quick visualization](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/005_visualization.html)
- Inspect optimization histories from a variety of plotting functions.
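To make the pruning mentioned above concrete, here is a minimal sketch: the synthetic training loop is a stand-in for a real model, and the objective reports an intermediate value each step so the pruner can stop unpromising trials early.

```python
import optuna


def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    score = 0.0
    for step in range(100):
        score += lr * (1.0 - score)  # stand-in for one epoch of real training
        trial.report(score, step)  # report an intermediate value to the pruner
        if trial.should_prune():  # stop the trial early if it looks unpromising
            raise optuna.TrialPruned()
    return score


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=30)
```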
## Basic Concepts

We use the terms *study* and *trial* as follows:
- Study: optimization based on an objective function
- Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a *study* is to find the optimal set of
hyperparameter values (e.g., `regressor` and `svr_c`) through multiple *trials* (e.g.,
`n_trials=100`). Optuna is a framework designed for the automation and acceleration of
optimization *studies*.

Sample code with scikit-learn
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](http://colab.research.google.com/github/optuna/optuna-examples/blob/main/quickstart.ipynb)
```python
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm


# Define an objective function to be minimized.
def objective(trial):
    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.


study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
```

> [!NOTE]
> More examples can be found in [optuna/optuna-examples](https://github.com/optuna/optuna-examples).
>
> The examples cover diverse problem setups such as multi-objective optimization, constrained optimization, pruning, and distributed optimization.
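Since the note above mentions distributed optimization, here is a minimal sketch of how parallelization usually works: each worker runs the same script against a shared storage, and Optuna coordinates the trials across processes. The study name `example-study` and the SQLite URL are illustrative placeholders; real multi-node setups typically use a server-based database such as MySQL or PostgreSQL.

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2


# Every worker can run this same script; trials are coordinated through
# the shared storage (the study name and URL below are placeholders).
study = optuna.create_study(
    study_name="example-study",
    storage="sqlite:///example.db",
    load_if_exists=True,
)
study.optimize(objective, n_trials=100)
```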
## Installation

Optuna is available at [the Python Package Index](https://pypi.org/project/optuna/) and on [Anaconda Cloud](https://anaconda.org/conda-forge/optuna).
```bash
# PyPI
$ pip install optuna
```

```bash
# Anaconda Cloud
$ conda install -c conda-forge optuna
```

> [!IMPORTANT]
> Optuna supports Python 3.7 or newer.
>
> Also, we provide Optuna docker images on [DockerHub](https://hub.docker.com/r/optuna/optuna).

## Integrations

Optuna has integration features with various third-party libraries. Integrations can be found in [optuna/optuna-integration](https://github.com/optuna/optuna-integration), and the documentation is available [here](https://optuna-integration.readthedocs.io/en/stable/index.html).

Supported integration libraries:
* [Catboost](https://github.com/optuna/optuna-examples/tree/main/catboost/catboost_pruning.py)
* [Dask](https://github.com/optuna/optuna-examples/tree/main/dask/dask_simple.py)
* [fastai](https://github.com/optuna/optuna-examples/tree/main/fastai/fastai_simple.py)
* [Keras](https://github.com/optuna/optuna-examples/tree/main/keras/keras_integration.py)
* [LightGBM](https://github.com/optuna/optuna-examples/tree/main/lightgbm/lightgbm_integration.py)
* [MLflow](https://github.com/optuna/optuna-examples/tree/main/mlflow/keras_mlflow.py)
* [PyTorch](https://github.com/optuna/optuna-examples/tree/main/pytorch/pytorch_simple.py)
* [PyTorch Ignite](https://github.com/optuna/optuna-examples/tree/main/pytorch/pytorch_ignite_simple.py)
* [PyTorch Lightning](https://github.com/optuna/optuna-examples/tree/main/pytorch/pytorch_lightning_simple.py)
* [TensorBoard](https://github.com/optuna/optuna-examples/tree/main/tensorboard/tensorboard_simple.py)
* [TensorFlow](https://github.com/optuna/optuna-examples/tree/main/tensorflow/tensorflow_estimator_integration.py)
* [tf.keras](https://github.com/optuna/optuna-examples/tree/main/tfkeras/tfkeras_integration.py)
* [Weights & Biases](https://github.com/optuna/optuna-examples/tree/main/wandb/wandb_integration.py)
* [XGBoost](https://github.com/optuna/optuna-examples/tree/main/xgboost/xgboost_integration.py)
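To illustrate how one of these integrations hooks in, here is a minimal sketch using the LightGBM pruning callback; it assumes `lightgbm`, `scikit-learn`, and the integration package are installed, and the breast-cancer dataset split is just a stand-in for real data.

```python
import lightgbm as lgb
import optuna
import sklearn.datasets
import sklearn.model_selection

from optuna.integration import LightGBMPruningCallback


def objective(trial):
    X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)
    dtrain = lgb.Dataset(X_train, label=y_train)
    dvalid = lgb.Dataset(X_val, label=y_val, reference=dtrain)

    params = {
        "objective": "binary",
        "metric": "auc",
        "verbosity": -1,
        "num_leaves": trial.suggest_int("num_leaves", 2, 256),
    }
    # The callback reports each boosting round's validation AUC to Optuna
    # and prunes the trial when it looks unpromising.
    booster = lgb.train(
        params,
        dtrain,
        valid_sets=[dvalid],
        callbacks=[LightGBMPruningCallback(trial, "auc")],
    )
    return booster.best_score["valid_0"]["auc"]


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```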
## Web Dashboard

[Optuna Dashboard](https://github.com/optuna/optuna-dashboard) is a real-time web dashboard for Optuna.
You can check the optimization history, hyperparameter importance, etc. in graphs and tables.
You don't need to create a Python script to call [Optuna's visualization](https://optuna.readthedocs.io/en/stable/reference/visualization/index.html) functions.
Feature requests and bug reports are welcome!

![optuna-dashboard](https://user-images.githubusercontent.com/5564044/204975098-95c2cb8c-0fb5-4388-abc4-da32f56cb4e5.gif)
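For contrast, the script-based route that the dashboard saves you from looks roughly like the minimal sketch below; it assumes `plotly` is installed (Optuna's built-in plotting functions return Plotly figures) and `scikit-learn` for the default importance evaluator.

```python
import optuna


def objective(trial):
    x1 = trial.suggest_float("x1", -100, 100)
    x2 = trial.suggest_float("x2", -100, 100)
    return x1 ** 2 + 0.01 * x2 ** 2


study = optuna.create_study()
study.optimize(objective, n_trials=50)

# Each plotting function returns a Plotly figure.
optuna.visualization.plot_optimization_history(study).show()
optuna.visualization.plot_param_importances(study).show()
```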
`optuna-dashboard` can be installed via pip:
```shell
$ pip install optuna-dashboard
```

> [!TIP]
> Please check out the convenience of Optuna Dashboard using the sample code below.

Sample code to launch Optuna Dashboard
Save the following code as `optimize_toy.py`.
```python
import optuna


def objective(trial):
    x1 = trial.suggest_float("x1", -100, 100)
    x2 = trial.suggest_float("x2", -100, 100)
    return x1 ** 2 + 0.01 * x2 ** 2


study = optuna.create_study(storage="sqlite:///db.sqlite3")  # Create a new study with database.
study.optimize(objective, n_trials=100)
```

Then try the commands below:
```shell
# Run the study specified above
$ python optimize_toy.py

# Launch the dashboard based on the storage `sqlite:///db.sqlite3`
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
```

## OptunaHub

[OptunaHub](https://hub.optuna.org/) is a feature-sharing platform for Optuna.
You can use the registered features and publish your own packages.

### Use registered features
`optunahub` can be installed via pip:
```shell
$ pip install optunahub
```

You can load a registered module with `optunahub.load_module`.
```python
import optuna
import optunahub


def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", 0, 1)
    return x


mod = optunahub.load_module("samplers/simulated_annealing")

study = optuna.create_study(sampler=mod.SimulatedAnnealingSampler())
study.optimize(objective, n_trials=20)

print(study.best_trial.value, study.best_trial.params)
```

For more details, please refer to [the optunahub documentation](https://optuna.github.io/optunahub/).
### Publish your packages
You can publish your package via [optunahub-registry](https://github.com/optuna/optunahub-registry).
See the [OptunaHub tutorial](https://optuna.github.io/optunahub-registry/index.html).

## Communication

- [GitHub Discussions] for questions.
- [GitHub Issues] for bug reports and feature requests.

[GitHub Discussions]: https://github.com/optuna/optuna/discussions
[GitHub Issues]: https://github.com/optuna/optuna/issues

## Contribution
Any contributions to Optuna are more than welcome!
If you are new to Optuna, please check the [good first issues](https://github.com/optuna/optuna/labels/good%20first%20issue). They are relatively simple, well-defined, and often good starting points for you to get familiar with the contribution workflow and other developers.
If you have already contributed to Optuna, we recommend the other [contribution-welcome issues](https://github.com/optuna/optuna/labels/contribution-welcome).
For general guidelines on how to contribute to the project, take a look at [CONTRIBUTING.md](./CONTRIBUTING.md).
## Reference
If you use Optuna in one of your research projects, please cite [our KDD paper](https://doi.org/10.1145/3292500.3330701) "Optuna: A Next-generation Hyperparameter Optimization Framework":
BibTeX
```bibtex
@inproceedings{akiba2019optuna,
title={{O}ptuna: A Next-Generation Hyperparameter Optimization Framework},
author={Akiba, Takuya and Sano, Shotaro and Yanase, Toshihiko and Ohta, Takeru and Koyama, Masanori},
booktitle={The 25th ACM SIGKDD International Conference on Knowledge Discovery \& Data Mining},
pages={2623--2631},
year={2019}
}
```

## License

MIT License (see [LICENSE](./LICENSE)).

Optuna uses code from the SciPy and fdlibm projects (see [LICENSE_THIRD_PARTY](./LICENSE_THIRD_PARTY)).