# mize
[AppVeyor build status](https://ci.appveyor.com/project/jlmelville/mize)
[Coverage status](https://codecov.io/github/jlmelville/mize?branch=master)
[GitHub Actions](https://github.com/jlmelville/mize/actions)
[CRAN](https://cran.r-project.org/package=mize)

Unconstrained Numerical Optimization Algorithms.

`mize` can be used as a standalone function like the `stats::optim` function,
or can be integrated into other packages by creating a stateful optimizer and
handling the iterations, convergence, logging and so on externally.

`mize` knows how to do Broyden-Fletcher-Goldfarb-Shanno (BFGS),
the limited-memory BFGS (L-BFGS), various flavors of Conjugate Gradient (CG),
Nesterov Accelerated Gradient (NAG) and momentum-based methods, among others.
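As a rough sketch of switching between these algorithms on a simple quadratic
(the method names below are assumed to match the algorithms listed above; see
`?mize` for the full list of accepted `method` strings):

```R
library(mize)

# A simple quadratic objective and its gradient, in the fn/gr list format
# that mize expects (the same format used in the Examples section below).
quad_fg <- list(
  fn = function(x) sum(x ^ 2),
  gr = function(x) 2 * x
)

# The same problem minimized with two of the methods listed above
res_cg  <- mize(c(3, -4), quad_fg, method = "CG")
res_nag <- mize(c(3, -4), quad_fg, method = "NAG")
```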
## Installing

```R
# Install from CRAN:
install.packages("mize")

# Or install the development version from GitHub:
# install.packages("devtools")
devtools::install_github("jlmelville/mize")
```

## Documentation
```R
?mize
```

There are also some vignettes:
* `mize.Rmd`, which goes through many of the options available.
* `mmds.Rmd`, which does a simple, but non-trivial, application of `mize` to
carry out metric Multi-Dimensional Scaling on the `eurodist` data set.
* `stateful.Rmd`, which demonstrates how to use `mize` statefully, so you can
manually and externally invoke each iteration step.
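If the package was installed with its vignettes built, they can be opened from
the R console; a small sketch using base R's vignette tools (the vignette name
is assumed to match the file name above):

```R
# List the vignettes shipped with the installed package
browseVignettes("mize")

# Open a specific vignette by name (assumes it was built at install time)
vignette("mize", package = "mize")
```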
## Examples

```R
# Make a list containing the function and gradient:
rosenbrock_fg <- list(
  fn = function(x) { 100 * (x[2] - x[1] * x[1]) ^ 2 + (1 - x[1]) ^ 2 },
  gr = function(x) {
    c(-400 * x[1] * (x[2] - x[1] * x[1]) - 2 * (1 - x[1]),
      200 * (x[2] - x[1] * x[1]))
  }
)
# Starting point:
rb0 <- c(-1.2, 1)

# Minimize using L-BFGS
res <- mize(rb0, rosenbrock_fg, method = "L-BFGS")
# Optimized parameters are in res$par

# Or create an optimizer and then loop manually
opt <- make_mize(method = "L-BFGS")
opt <- mize_init(opt, rb0, rosenbrock_fg)

par <- rb0
done <- FALSE
iter <- 0
while (!done) {
  iter <- iter + 1
  res <- mize_step(opt, par, rosenbrock_fg)
  par <- res$par
  opt <- res$opt
  # Look at res$f for the current function value
  # You get to (i.e. have to) decide when to stop
  done <- iter > 30
}
```
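The loop above stops after a fixed number of iterations purely for
illustration. A sketch of a slightly more realistic stopping rule, reusing
`rb0` and `rosenbrock_fg` from above and watching the function value that
`mize_step` reports in `res$f` (the tolerance and iteration cap are
illustrative choices, not mize defaults):

```R
opt <- make_mize(method = "L-BFGS")
opt <- mize_init(opt, rb0, rosenbrock_fg)

par <- rb0
f_old <- rosenbrock_fg$fn(par)
max_iter <- 1000
tol <- 1e-8

for (iter in seq_len(max_iter)) {
  res <- mize_step(opt, par, rosenbrock_fg)
  par <- res$par
  opt <- res$opt

  # Stop when the relative change in the function value becomes tiny
  if (abs(f_old - res$f) < tol * (abs(f_old) + tol)) {
    break
  }
  f_old <- res$f
}
```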
## See also

The Wolfe line searches use conversions of Mark Schmidt's
[minFunc routines](http://www.cs.ubc.ca/~schmidtm/Software/minFunc.html),
Carl Edward Rasmussen's
[Matlab code](http://learning.eng.cam.ac.uk/carl/code/minimize/) and Dianne
O'Leary's Matlab translation of the
[Moré-Thuente line search](http://www.cs.umd.edu/users/oleary/software/)
algorithm from [MINPACK](http://www.netlib.org/minpack/).

I also maintain the [funconstrain](https://github.com/jlmelville/funconstrain)
package, which contains a large number of test problems for numerical
optimization. See this
[gist](https://gist.github.com/jlmelville/2cb8905edd0dbc23806d3122a7a05c5d)
for functions to use `mize` with `funconstrain`.

## License
[BSD 2-Clause](https://opensource.org/licenses/BSD-2-Clause).
## Acknowledgments
I am grateful to Hans Werner Borchers, who provided assistance and
encouragement in getting mize onto CRAN.