Optimization algorithms written in python
- Host: GitHub
- URL: https://github.com/andydevs/optipy
- Owner: andydevs
- License: gpl-3.0
- Created: 2016-08-11T06:32:44.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2017-02-24T21:16:53.000Z (over 7 years ago)
- Last Synced: 2024-01-28T23:34:38.943Z (10 months ago)
- Topics: numpy, optimization, optimization-algorithms
- Language: Python
- Homepage:
- Size: 28.3 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# OptiPy
An optimization library written using only numpy.
```python
optipy.minimize(lambda x: 0.5*numpy.sum(x**2), 2)
```

## Quick Start Guide
Make sure you have numpy installed first.
Define a function (using `lambda` or `def`). The function must take a single argument (a numpy array) and return a scalar.
```python
import numpy

function = lambda x: 0.5*numpy.sum(x**2)
```

Use `optipy.minimize` to return the minimum of the function. Pass your function and the number of input variables (the length of the input array).
```python
import numpy
import optipy

function = lambda x: 0.5*numpy.sum(x**2)
minimum = optipy.minimize(function, 2)  # minimum = approx. [0., 0.]
```

Likewise, use `optipy.maximize` to return the maximum of the function.
```python
import numpy
import optipy

function = lambda x: -0.5*numpy.sum(x**2)
maximum = optipy.maximize(function, 2)  # maximum = approx. [0., 0.]
```

## Optional Parameters
You can pass additional keyword arguments to the minimize/maximize functions to control their behaviour.
```python
optipy.minimize(func, 2, epsilon=1e-11, delta=1e-3)
```

The following parameters are valid:
| Argument | Description | Defaults |
|:---------|:---------------------------------------------|:-----------|
| epsilon | Zero slope threshold | 1e-10 |
| delta | Delta value used to compute gradient/hessian | 1e-6 |
| alpha | Step size, or learning rate | 1e-4 |
| maxiter  | Maximum iterations                           | sys.maxint |

## Optimization Algorithms
By default, optipy uses the batch gradient descent algorithm. Other algorithms are also available: either call them directly (they share the same interface as `minimize` and `maximize`), or set `default_optimizer` to the algorithm that `minimize`/`maximize` should use.
```python
optipy.default_optimizer = optipy.gradient_descent.stochastic
```

Here are the available algorithms:
| Function | Description |
|:-----------------------------------|:-------------------------------------------------------------|
| optipy.gradient_descent.batch | Batch Gradient Descent |
| optipy.gradient_descent.stochastic | Stochastic Gradient Descent |
| optipy.newtonian.pure | Newton's Method. Computes the full Hessian in each iteration |
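To illustrate how the `epsilon`, `delta`, `alpha`, and `maxiter` parameters interact, here is a minimal sketch of batch gradient descent with central-difference gradients, written with only numpy. This is a hypothetical illustration of the technique, not optipy's actual source; the function name `gradient_descent_sketch` is invented for this example.

```python
import numpy

def gradient_descent_sketch(func, n, epsilon=1e-10, delta=1e-6, alpha=1e-4, maxiter=100000):
    """Minimize func over an n-dimensional numpy array by batch gradient descent.

    Hypothetical sketch (not optipy's code): the gradient is approximated with
    central differences of step `delta`; iteration stops once every gradient
    component has magnitude below `epsilon`, or after `maxiter` steps.
    """
    x = numpy.zeros(n)
    for _ in range(maxiter):
        grad = numpy.empty(n)
        for i in range(n):
            step = numpy.zeros(n)
            step[i] = delta
            # Central difference: (f(x + d) - f(x - d)) / (2 * d)
            grad[i] = (func(x + step) - func(x - step)) / (2 * delta)
        if numpy.all(numpy.abs(grad) < epsilon):
            break  # slope is effectively zero: treat as converged
        x = x - alpha * grad
    return x

# Example: minimize a quadratic bowl centered at [1, 1].
# A larger alpha than the default is used here so it converges quickly.
result = gradient_descent_sketch(
    lambda v: 0.5 * numpy.sum((v - 1.0) ** 2), 2,
    epsilon=1e-8, alpha=0.1, maxiter=10000,
)
```

Note the roles of the defaults: a tiny `alpha` trades speed for stability, `delta` controls finite-difference accuracy, and `epsilon` decides when the slope is flat enough to stop.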