# OptimViz - Optimizer visualization demo for MATLAB

This demo visualizes several MATLAB derivative-free optimizers at work on standard test functions. This is purely for demonstration purposes. For a proper benchmark of different MATLAB optimizers, see [[1](https://github.com/lacerbi/optimviz#reference)].

[Follow me on Twitter](https://twitter.com/AcerbiLuigi) for updates about other projects I am involved with, or drop me an email at <luigi.acerbi@helsinki.fi> to talk about computational modeling, optimization, and (approximate) Bayesian inference.

I have been giving seminars and tutorials on optimization, model fitting, and model comparison around the world (see [here](https://www2.helsinki.fi/en/researchgroups/machine-and-human-intelligence/teaching)). If you are interested in this research, find out more on [my group webpage](https://www2.helsinki.fi/en/researchgroups/machine-and-human-intelligence) at the Department of Computer Science of the University of Helsinki, Finland.

## Optimizers

The optimization algorithms visualized here are:

- BADS (*Bayesian adaptive direct search*), a novel algorithm that combines a direct search approach with local Bayesian
optimization ([link](https://github.com/lacerbi/bads));
- `fminsearch` (Nelder-Mead), the standard simplex method for nonlinear optimization;
- `fmincon`, a powerful method for constrained optimization based on numerical approximation of the gradient;
- `ga` (genetic algorithms), a heuristic population-based method for global optimization;
- MCS (*Multi-level coordinate search*), an advanced method for global optimization ([link](http://www.mat.univie.ac.at/~neum/software/mcs/));
- CMA-ES (*Covariance matrix adaptation - evolution strategies*), a state-of-the-art method for nonconvex optimization ([link](https://www.lri.fr/~hansen/cmaesintro.html)).

## Examples

Here is an example on the Rosenbrock banana function:

![demo_opt](https://github.com/lacerbi/optimviz/blob/master/gifs/optimviz-rosenbrock.gif)

We can see how the algorithms react to noise by adding unit Gaussian noise at each function evaluation:

![demo_opt](https://github.com/lacerbi/optimviz/blob/master/gifs/optimviz-rosenbrock-noisy.gif)

Here is another noiseless example, on the Ackley function:

![demo_opt](https://github.com/lacerbi/optimviz/blob/master/gifs/optimviz-ackley.gif)


## Comments

- BADS works well on these examples, which were chosen to show how different algorithms explore the space. More generally, BADS is best suited for functions with a noisy or jagged landscape and a non-negligible computational cost (see [here](https://github.com/lacerbi/bads/wiki#which-kind-of-problems-is-bads-suited-for)). BADS is available as a ready-to-use MATLAB toolbox [here](https://github.com/lacerbi/bads).
- `fminsearch` is a generic optimizer that can deal with simple functions, but it should never be the main choice, as there are always better alternatives.
- `fmincon` is generally superior to most optimizers (and in particular to `fminsearch`) on smooth functions. However, `fmincon` deals *very badly* with jagged or noisy landscapes.
- We are not aware of scenarios in which `ga` is a good off-the-shelf choice for continuous-valued optimization. It is often only barely better than random search.
- MCS can be a great optimizer, but it is somewhat idiosyncratic (it might converge very quickly to a solution).
- CMA-ES, despite the poor performance shown here, is a good optimizer *if* allowed a very large number of function evaluations.

## Code

These animated gifs can be generated with the `optimviz.m` function. You can easily test different optimizers and add other functions.

The generated animated gifs are uncompressed. We recommend compressing them before using them in any form (e.g., via some [online tool](https://ezgif.com/optimize)).

To run some of these algorithms you will need MATLAB's [Optimization Toolbox](http://www.mathworks.com/products/optimization/) and [Global Optimization Toolbox](http://www.mathworks.com/products/global-optimization/).

## References

For more details about the benchmark comparing different MATLAB optimizers on artificial and real applied problems (fitting of computational models), see the following reference:

1. Acerbi, L. & Ma, W. J. (2017). Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search. In *Advances in Neural Information Processing Systems 30*, pages 1834-1844.
([link](https://papers.nips.cc/paper/6780-practical-bayesian-optimization-for-model-fitting-with-bayesian-adaptive-direct-search), [arXiv preprint](https://arxiv.org/abs/1705.04405))

For more info about my work in machine learning and computational neuroscience, follow me on Twitter: https://twitter.com/AcerbiLuigi


### License

*OptimViz* is released under the terms of the [GNU General Public License v3.0](https://github.com/lacerbi/optimviz/blob/master/LICENSE.txt).
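The core idea behind these visualizations — recording the sequence of points an optimizer visits and overlaying it on the objective landscape — can be sketched in a few lines of MATLAB. This is a minimal illustrative sketch, not the actual `optimviz.m` implementation: it traces `fminsearch` (Nelder-Mead) on the Rosenbrock function using an `OutputFcn` callback, and the variable names and plot settings are arbitrary choices.

```matlab
function optimviz_sketch()
% Trace fminsearch (Nelder-Mead) on the Rosenbrock banana function and
% plot the visited iterates over a contour plot of the landscape.
% Illustrative sketch only; optimviz.m does considerably more.

history = [];   % iterates collected by the output function (shared variable)

% Rosenbrock banana function, global minimum at (1,1)
rosen = @(x) 100*(x(2) - x(1).^2).^2 + (1 - x(1)).^2;

opts = optimset('OutputFcn', @store, 'MaxFunEvals', 500);
x0 = [-1; 1];                                 % starting point
[~, fval] = fminsearch(rosen, x0, opts);

% Contour of the (log-scaled) landscape, plus the optimizer's path
[X, Y] = meshgrid(linspace(-2, 2, 200), linspace(-1, 3, 200));
Z = 100*(Y - X.^2).^2 + (1 - X).^2;
contour(X, Y, log1p(Z), 30); hold on;
plot(history(:,1), history(:,2), 'r.-');      % trajectory of iterates
plot(1, 1, 'k*', 'MarkerSize', 10);           % true minimum
title(sprintf('fminsearch on Rosenbrock: f = %.3g', fval));

    function stop = store(x, ~, ~)
        % Output function: append the current iterate, never stop early
        history(end+1, :) = x(:)';
        stop = false;
    end
end
```

Saving each frame of the trajectory (e.g., via `getframe` and `imwrite`) instead of plotting the full path at once is what turns such a trace into the animated gifs shown above.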