Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/facebookresearch/nevergrad
A Python toolbox for performing gradient-free optimization
Last synced: 1 day ago
- Host: GitHub
- URL: https://github.com/facebookresearch/nevergrad
- Owner: facebookresearch
- License: MIT
- Created: 2018-11-21T00:33:17.000Z (about 6 years ago)
- Default Branch: main
- Last Pushed: 2024-12-05T17:02:55.000Z (about 1 month ago)
- Last Synced: 2024-12-17T01:37:48.138Z (29 days ago)
- Language: Python
- Homepage: https://facebookresearch.github.io/nevergrad/
- Size: 26.4 MB
- Stars: 3,980
- Watchers: 58
- Forks: 356
- Open Issues: 122
- Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-list - Nevergrad - A Python toolbox for performing gradient-free optimization. (Machine Learning Framework / Hyperparameter Search & Gradient-Free Optimization)
- StarryDivineSky - facebookresearch/nevergrad
- awesome-python-machine-learning-resources - nevergrad (GitHub · 30% open issues · ⏱️ 10.08.2022) (Hyperparameter Optimization and AutoML)
- awesome-production-machine-learning - Nevergrad - Nevergrad is a gradient-free optimisation platform. (Optimized Computation)
README
[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
# Nevergrad - A gradient-free optimization platform
![Nevergrad](docs/resources/Nevergrad-LogoMark.png)
`nevergrad` is a Python 3.8+ library. It can be installed with:
```
pip install nevergrad
```

More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).
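After installing, a quick sanity check (a minimal sketch, assuming a standard install) is to import the package and print its version:

```python
import nevergrad as ng

# print the installed nevergrad version to confirm the install worked
print(ng.__version__)
```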
You can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).
Minimizing a function using an optimizer (here `NGOpt`) is straightforward:
```python
import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value) # recommended value
>>> [0.49971112 0.5002944]
```
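For finer-grained control, the same optimization can also be driven step by step through the optimizer's ask-and-tell interface. Below is a minimal sketch reusing the `square` function and `NGOpt` optimizer from above:

```python
import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()      # draw a new candidate point
    loss = square(candidate.value)   # evaluate it ourselves
    optimizer.tell(candidate, loss)  # report the loss back to the optimizer

recommendation = optimizer.provide_recommendation()
print(recommendation.value)
```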
`nevergrad` can also support bounded continuous variables as well as discrete variables, and a mixture of the two. To do this, one can specify the input space:

```python
import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2) ** 2 + (batch_size - 4) ** 2 + (0 if architecture == "conv" else 10)

# Instrumentation class is used for functions with multiple inputs
# (positional and/or keywords)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)

# show the recommended keyword arguments of the function
print(recommendation.kwargs)
>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
```

Learn more about parametrization in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!
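As a further illustration (a minimal sketch, not taken from the original README), a bounded continuous array can also be declared directly, without wrapping it in `Instrumentation`:

```python
import nevergrad as ng

# a 2-dimensional continuous array, each coordinate bounded in [0, 1]
param = ng.p.Array(shape=(2,)).set_bounds(lower=0.0, upper=1.0)

optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(lambda x: sum((x - 0.5) ** 2))
print(recommendation.value)  # expected to be close to [0.5, 0.5]
```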
![Example of optimization](docs/resources/TwoPointsDE.gif)
*Convergence of a population of points to the minima with two-points DE.*
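The animation uses the `TwoPointsDE` optimizer. As a rough sketch (reusing the `square` example from earlier), it can be selected explicitly in place of `NGOpt`:

```python
import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

# explicitly pick the two-points differential evolution optimizer shown above
optimizer = ng.optimizers.TwoPointsDE(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)
```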
## Documentation
Check out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress, so don't hesitate to submit issues and/or pull requests (PRs) to update it and make it clearer!
The latest version of our [**data**](https://drive.google.com/file/d/1p8d1bMCDlvWrDIMXP7fT9pJa1cgjH3NM/view?usp=sharing) and the latest version of our [**PDF report**](https://tinyurl.com/dagstuhloid) are also available.

## Citing
```bibtex
@misc{nevergrad,
author = {J. Rapin and O. Teytaud},
title = {{Nevergrad - A gradient-free optimization platform}},
year = {2018},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}
```

## License
`nevergrad` is released under the MIT license. See [LICENSE](LICENSE) for additional details about it.
See also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).