https://github.com/andrew-torda/simplex
Nelder Mead simplex optimization method
- Host: GitHub
- URL: https://github.com/andrew-torda/simplex
- Owner: andrew-torda
- License: gpl-3.0
- Created: 2020-01-17T10:41:20.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2020-10-05T07:58:10.000Z (over 5 years ago)
- Last Synced: 2024-06-20T11:55:19.221Z (almost 2 years ago)
- Topics: simplex
- Language: Go
- Homepage:
- Size: 60.5 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# simplex
Nelder Mead simplex optimization method in go.
The basic interface is simple: you ask for a new object and then call Run. There is an absurd number of options. You can specify upper and lower bounds for each parameter individually, set convergence requirements for each parameter, and choose between two different methods for setting up the initial vertices. The most complicated option invokes a second cost function on the data each time a new best point is found. If you are doing an optimisation for some kind of fitting, you can have a standard cost function that acts on most of the data and a second function that acts on part of the data for testing (to see recall versus generalisation, or overfitting).
## Introduction
There are lots of tests which give a flavour of what one should do. The example_xx_test.go files are real examples. The simplest start is to define your cost function. It takes a slice of float32s as its only argument and returns a float32 and an error. A quadratic function might look like this:
```go
func cost(x []float32) (float32, error) {
	a := x[0] - 5
	b := x[1] - 2
	return a*a + b*b, nil
}
```
This has its minimum at `x[0] = 5` and `x[1] = 2`. It cannot fail, so it returns nil as the second result.
If you need additional arguments, you will need some kind of wrapper. Maybe you have some fixed arguments which are not to be optimized. A closure might be the best approach. There are lots of closures in the tests.
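Such a closure might look like this (a minimal sketch; `costWithTarget` and its target argument are hypothetical names for illustration, not part of the package):

```go
package main

import "fmt"

// costWithTarget captures fixed data (here, a target point) in a closure,
// so the returned function still matches the expected signature,
// func([]float32) (float32, error).
func costWithTarget(target []float32) func([]float32) (float32, error) {
	return func(x []float32) (float32, error) {
		var sum float32
		for i, xi := range x {
			d := xi - target[i]
			sum += d * d
		}
		return sum, nil
	}
}

func main() {
	cost := costWithTarget([]float32{5, 2})
	v, err := cost([]float32{5, 2}) // at the target, the cost is zero
	fmt.Println(v, err)
}
```

The fixed arguments live in the closure, so nothing global is needed and different wrapped cost functions can coexist.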
You then need to make a simplex control structure:
```go
s := NewSplxCtrl(cost, iniPrm, maxsteps)
```
where `cost` is the cost function and `iniPrm` is a slice with the initial parameters. It might be set by a line like `iniPrm := []float32{10, 9}`, which would set 10 and 9 as the initial values. `maxsteps` is the maximum number of steps. You then run the simplex with a line like
```go
result, err := s.Run(2)
```
which would run the simplex two times with `maxsteps` steps each time. To get at the optimized parameters, you would `fmt.Println(result.BestPrm)`. To check whether convergence was reached, you might say
```go
if result.StopReason != simplex.Converged {
	fmt.Println("Did not converge")
}
```
Really, one should just `go doc -all` to see many more options.
## Differences compared to literature
### errors
Every time we call the cost function, we check for errors. This is a bit unusual, since most people's cost functions do not break. We have, however, had cost functions which involve opening and playing with lots of files and doing things that occasionally explode. In case of an error, it bubbles back up to the caller. Most simple numerical functions do not need this, so they can return `x, nil` instead of just `x`.
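As an illustration, a cost function that can genuinely fail might look like the following (a hypothetical example, not taken from the package; the log of a non-positive number is treated as an error rather than a NaN):

```go
package main

import (
	"errors"
	"fmt"
	"math"
)

// cost returns an error for non-positive x[0]. The simplex checks the
// error on every evaluation and would pass it back to the caller of Run.
func cost(x []float32) (float32, error) {
	if x[0] <= 0 {
		return 0, errors.New("x[0] must be positive")
	}
	return float32(math.Log(float64(x[0]))), nil
}

func main() {
	if _, err := cost([]float32{-1}); err != nil {
		fmt.Println("error:", err)
	}
	v, _ := cost([]float32{1})
	fmt.Println(v)
}
```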
### Initialisation
The default is to initialise as per the literature. You have a starting point and each subsequent point has a typical length added in exactly one dimension. Alternatively, you can have the simplex surround the initial point. If you have two parameters, this would be a triangle with the start point at its centre.
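The default scheme can be sketched as follows (an illustrative reimplementation under stated assumptions, not the package's actual code): each extra vertex is the starting point with a typical step added along exactly one axis.

```go
package main

import "fmt"

// axisVertices builds the classic initial simplex: the starting point
// plus one extra vertex per dimension, each offset by step along a
// single axis, giving n+1 vertices for n parameters.
func axisVertices(start []float32, step float32) [][]float32 {
	n := len(start)
	v := make([][]float32, n+1)
	v[0] = append([]float32(nil), start...)
	for i := 0; i < n; i++ {
		p := append([]float32(nil), start...)
		p[i] += step
		v[i+1] = p
	}
	return v
}

func main() {
	// With two parameters this yields a triangle with the start
	// point as one of its corners.
	for _, p := range axisVertices([]float32{10, 9}, 1) {
		fmt.Println(p)
	}
}
```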
From `go doc -all`, look for the paragraph on initialisation.