Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Automatic differentiation + optimization
- Host: GitHub
- URL: https://github.com/pierreablin/autoptim
- Owner: pierreablin
- License: mit
- Created: 2019-02-16T11:39:49.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2019-03-21T15:39:40.000Z (almost 6 years ago)
- Last Synced: 2024-11-09T10:48:03.424Z (2 months ago)
- Topics: autodiff, numpy, optimization
- Language: Python
- Homepage:
- Size: 24.4 KB
- Stars: 104
- Watchers: 2
- Forks: 9
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# autoptim: automatic differentiation + optimization
Do you have a new machine learning model that you want to optimize, but do not want to bother computing the gradients? Autoptim is for you.
### Warning:
As of version 0.3, `pytorch` has been replaced with `autograd` for automatic differentiation. This makes interfacing with NumPy even simpler.

## Short presentation
Autoptim is a small Python package that blends `autograd`'s automatic differentiation into `scipy.optimize.minimize`. The gradients are computed under the hood using automatic differentiation; the user only provides the objective function:
```python
import numpy as np
from autoptim import minimize


def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2


x0 = np.zeros(2)

x_min, _ = minimize(rosenbrock, x0)
print(x_min)  # >>> [0.99999913 0.99999825]
```

It comes with the following features:
- **Natural interfacing with NumPy**: The objective function is written in standard NumPy. The inputs and outputs of `autoptim.minimize` are NumPy arrays.
- **Smart input processing**: `scipy.optimize.minimize` is only meant to deal with one-dimensional arrays as input. In `autoptim`, variables can be multi-dimensional arrays or lists of arrays.
- **Preconditioning**: Preconditioning is a simple way to accelerate minimization through a change of variables. `autoptim` makes preconditioning straightforward.

### Disclaimer
This package is meant to be as easy to use as possible. As such, some compromises on minimization speed are made.
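To illustrate the "smart input processing" feature described above, here is a minimal sketch of how multi-dimensional variables can be mapped onto the flat vector that `scipy.optimize.minimize` expects. This is not autoptim's actual code; `minimize_shaped` is a hypothetical helper written with plain NumPy and SciPy for illustration only:

```python
import numpy as np
from scipy.optimize import minimize as sp_minimize


def minimize_shaped(objective, x0_list):
    """Minimize an objective whose variables are a list of arrays.

    Hypothetical helper: flattens the variables into one 1-D vector for
    scipy, then reshapes the solution back to the original shapes.
    """
    shapes = [np.asarray(x).shape for x in x0_list]
    sizes = [int(np.prod(s)) for s in shapes]

    def unpack(flat):
        out, i = [], 0
        for shape, size in zip(shapes, sizes):
            out.append(flat[i:i + size].reshape(shape))
            i += size
        return out

    flat0 = np.concatenate([np.ravel(x) for x in x0_list])
    res = sp_minimize(lambda flat: objective(*unpack(flat)), flat0)
    return unpack(res.x)


# Example: jointly optimize a 2x2 matrix W and a length-2 vector b.
rng = np.random.RandomState(0)
A = rng.randn(2, 2)
target = rng.randn(2)


def loss(W, b):
    return np.sum((W @ A @ b - target) ** 2) + np.sum(W ** 2)


W_min, b_min = minimize_shaped(loss, [np.eye(2), np.zeros(2)])
print(W_min.shape, b_min.shape)  # (2, 2) (2,)
```

The same reshaping idea extends to any number of variables of any shape, which is why the user-facing objective never has to deal with a flat parameter vector.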
## Installation
To install, use `pip`:
```
pip install autoptim
```
## Dependencies
- numpy>=1.12
- scipy>=0.18.0
- autograd>=1.2

## Examples
Several examples can be found in `autoptim/tutorials`.
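As a sketch of the preconditioning idea mentioned in the features above: rescaling badly conditioned variables before optimization can sharply reduce the number of solver iterations. The snippet below uses plain SciPy (not autoptim's preconditioning API, which may differ) on an ill-conditioned quadratic:

```python
import numpy as np
from scipy.optimize import minimize as sp_minimize

# Curvatures spanning four orders of magnitude: a badly scaled problem.
scales = np.logspace(0, 2, 10)


def f(x):
    return 0.5 * np.sum((scales * x) ** 2)


x0 = np.ones(10)

# Plain minimization in the original variables.
res_plain = sp_minimize(f, x0, method="CG")

# Preconditioned: change of variables x = z / scales, so the solver
# sees the perfectly conditioned objective 0.5 * ||z||^2.
res_pre = sp_minimize(lambda z: f(z / scales), x0 * scales, method="CG")

# The preconditioned run should need no more iterations than the plain one.
print(res_plain.nit, res_pre.nit)
```

The change of variables costs one multiplication per evaluation but makes the landscape seen by the solver round instead of elongated, which is exactly what preconditioning buys.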