https://github.com/stockeh/mlx-optimizers
A collection of optimizers for MLX
- Host: GitHub
- URL: https://github.com/stockeh/mlx-optimizers
- Owner: stockeh
- License: apache-2.0
- Created: 2024-11-07T14:50:23.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-12-12T01:11:47.000Z (3 months ago)
- Last Synced: 2025-12-15T22:22:26.218Z (3 months ago)
- Topics: mlx, optimization
- Language: Python
- Homepage: https://stockeh.github.io/mlx-optimizers/
- Size: 31 MB
- Stars: 54
- Watchers: 3
- Forks: 3
- Open Issues: 18
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE
README
# mlx-optimizers
[**Documentation**](https://stockeh.github.io/mlx-optimizers/build/html/index.html) |
[**Install**](#install) |
[**Usage**](#usage) |
[**Examples**](#examples) |
[**Contributing**](#contributing)
A library to experiment with new optimization algorithms in [MLX](https://github.com/ml-explore/mlx).
- **Diverse Exploration**: includes proven and experimental optimizers like DiffGrad, QHAdam, and Muon ([docs](https://stockeh.github.io/mlx-optimizers/build/html/optimizers.html)).
- **Easy Integration**: fully compatible with MLX for straightforward experimentation and downstream adoption.
- **Benchmark Examples**: enables quick testing on classic optimization and machine learning tasks.
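To give a flavor of what these optimizers compute, here is a minimal, framework-free sketch of the DiffGrad idea: Adam-style moment estimates whose step is scaled by a "friction" coefficient derived from how much the gradient changed between steps. The function name, hyperparameter defaults, and scalar-parameter setup below are illustrative only and do not reflect the library's actual API.

```python
import math

def diffgrad_step(theta, g, g_prev, m, v, t,
                  lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One DiffGrad-style update for a scalar parameter (illustrative sketch)."""
    m = beta1 * m + (1 - beta1) * g        # first moment, as in Adam
    v = beta2 * v + (1 - beta2) * g * g    # second moment, as in Adam
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Friction coefficient: sigmoid of the absolute change in gradient,
    # which damps the step when the gradient is changing slowly.
    xi = 1.0 / (1.0 + math.exp(-abs(g_prev - g)))
    theta = theta - lr * xi * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(x) = x^2 starting from x = 1.0
x, m, v, g_prev = 1.0, 0.0, 0.0, 0.0
for t in range(1, 501):
    g = 2 * x                               # gradient of x^2
    x, m, v = diffgrad_step(x, g, g_prev, m, v, t)
    g_prev = g
```

In the library itself this bookkeeping is handled per-parameter by the optimizer class; the sketch only shows the per-step arithmetic.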
The design of mlx-optimizers is largely inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer/tree/master).
## Install
The recommended way to install mlx-optimizers is through the latest stable release on [PyPI](https://pypi.org/project/mlx-optimizers/):
```bash
pip install mlx-optimizers
```
To install mlx-optimizers from source, first clone [the repository](https://github.com/stockeh/mlx-optimizers.git):
```bash
git clone https://github.com/stockeh/mlx-optimizers.git
cd mlx-optimizers
```
Then run
```bash
pip install -e .
```
## Usage
There are a variety of optimizers to choose from (see [docs](https://stockeh.github.io/mlx-optimizers/build/html/optimizers.html)). Each inherits from the [`mlx.optimizers`](https://ml-explore.github.io/mlx/build/html/python/optimizers.html) base class in MLX, so the core functionality remains the same. An optimizer can be used as follows:
```python
import mlx_optimizers as optim

# ... define model and compute grads, e.g. via mlx.nn.value_and_grad
optimizer = optim.DiffGrad(learning_rate=0.001)
optimizer.update(model, grads)
```
## Examples
The [examples](examples) folder offers a non-exhaustive set of demonstrative use cases for mlx-optimizers. This includes classic optimization benchmarks on the Rosenbrock function and training a simple neural net classifier on MNIST.
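For context, the Rosenbrock function used in those benchmarks has a narrow curved valley with its global minimum at (1, 1), which makes it a stress test for optimizers. The sketch below defines it in plain Python with hand-derived gradients and runs vanilla gradient descent as a baseline; it deliberately does not use mlx-optimizers, and the step count and learning rate are arbitrary choices for illustration.

```python
def rosenbrock(x, y, a=1.0, b=100.0):
    """Classic banana-shaped benchmark; global minimum at (a, a^2)."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

def rosenbrock_grad(x, y, a=1.0, b=100.0):
    """Analytic partial derivatives of the Rosenbrock function."""
    dx = -2 * (a - x) - 4 * b * x * (y - x ** 2)
    dy = 2 * b * (y - x ** 2)
    return dx, dy

# Plain gradient descent baseline from a standard starting point (-1, 1);
# the point slowly creeps along the curved valley toward (1, 1).
x, y, lr = -1.0, 1.0, 1e-3
for _ in range(20000):
    dx, dy = rosenbrock_grad(x, y)
    x -= lr * dx
    y -= lr * dy
```

The benchmark scripts in [examples](examples) compare library optimizers on this kind of objective, where adaptive methods typically traverse the valley in far fewer iterations than plain gradient descent.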
## Contributing
Interested in adding a new optimizer? Start by verifying that it is not already implemented or in development, then open a new [feature request](https://github.com/stockeh/mlx-optimizers/issues/new?assignees=&labels=&projects=&template=feature_request.md&title=)! If you spot a bug, please open a [bug report](https://github.com/stockeh/mlx-optimizers/issues/new?assignees=&labels=&projects=&template=bug_report.md&title=).
Developer? See our [contributing](.github/CONTRIBUTING.md) guide.