# mlx-optimizers

[**Documentation**](https://stockeh.github.io/mlx-optimizers/build/html/index.html) |
[**Install**](#install) |
[**Usage**](#usage) |
[**Examples**](#examples) |
[**Contributing**](#contributing)

[![ci](https://github.com/stockeh/mlx-optimizers/workflows/Main/badge.svg)](https://github.com/stockeh/mlx-optimizers/actions)
[![PyPI](https://img.shields.io/pypi/v/mlx-optimizers)](https://pypi.org/project/mlx-optimizers/)

A library to experiment with new optimization algorithms in [MLX](https://github.com/ml-explore/mlx).

- **Diverse Exploration**: includes proven and experimental optimizers like DiffGrad, QHAdam, and Muon ([docs](https://stockeh.github.io/mlx-optimizers/build/html/optimizers.html)).
- **Easy Integration**: fully compatible with MLX for straightforward experimentation and downstream adoption.
- **Benchmark Examples**: enables quick testing on classic optimization and machine learning tasks.

The design of mlx-optimizers is largely inspired by [pytorch-optimizer](https://github.com/jettify/pytorch-optimizer/tree/master).

## Install

The recommended way to install mlx-optimizers is through the latest stable release on [PyPI](https://pypi.org/project/mlx-optimizers/):

```bash
pip install mlx-optimizers
```

To install mlx-optimizers from source, first clone [the repository](https://github.com/stockeh/mlx-optimizers.git):

```bash
git clone https://github.com/stockeh/mlx-optimizers.git
cd mlx-optimizers
```
Then run

```bash
pip install -e .
```

## Usage

There are a variety of optimizers to choose from (see [docs](https://stockeh.github.io/mlx-optimizers/build/html/optimizers.html)). Each of these inherits from MLX's [`mlx.optimizers`](https://ml-explore.github.io/mlx/build/html/python/optimizers.html) base class, so the core functionality remains the same. We can simply use an optimizer as follows:

```python
import mlx_optimizers as optim

#... model, grads, etc.
optimizer = optim.DiffGrad(learning_rate=0.001)
optimizer.update(model, grads)
```
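To see what a scheme like DiffGrad actually does, its published update rule can be sketched in plain Python. This is a scalar illustration of the rule from the DiffGrad paper (Adam scaled by a gradient-change "friction" term), not the library's actual vectorized implementation, and the function name is hypothetical:

```python
import math

def diffgrad_step(theta, g, g_prev, m, v, t, lr=1e-3,
                  b1=0.9, b2=0.999, eps=1e-8):
    """One scalar DiffGrad step: an Adam update scaled by friction."""
    m = b1 * m + (1 - b1) * g          # first moment, as in Adam
    v = b2 * v + (1 - b2) * g * g      # second moment, as in Adam
    m_hat = m / (1 - b1 ** t)          # bias correction
    v_hat = v / (1 - b2 ** t)
    # DiffGrad friction coefficient: sigmoid of the gradient change.
    # ~0.5 when the gradient is flat, approaching 1 when it changes sharply,
    # which damps the step size in low-information regions.
    xi = 1.0 / (1.0 + math.exp(-abs(g_prev - g)))
    theta = theta - lr * xi * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v
```

With an unchanged gradient the friction term is exactly 0.5, so the step is half of the corresponding Adam step.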

## Examples

The [examples](examples) folder offers a non-exhaustive set of demonstrative use cases for mlx-optimizers. This includes classic optimization benchmarks on the Rosenbrock function and training a simple neural net classifier on MNIST.
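The Rosenbrock benchmark mentioned above is easy to reproduce. Here is a minimal plain-Python sketch (vanilla gradient descent stands in for an mlx-optimizers optimizer; in the MLX examples the update line would be an `optimizer.update(...)` call instead):

```python
def rosenbrock(x, y, a=1.0, b=100.0):
    """Classic banana-valley benchmark; global minimum of 0 at (a, a**2)."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

def rosenbrock_grad(x, y, a=1.0, b=100.0):
    """Analytic partial derivatives of the Rosenbrock function."""
    dx = -2 * (a - x) - 4 * b * x * (y - x ** 2)
    dy = 2 * b * (y - x ** 2)
    return dx, dy

# Plain gradient descent from a standard starting point; the narrow
# curved valley is what makes this a good stress test for optimizers.
x, y = -1.0, 1.0
lr = 1e-3
for _ in range(5000):
    dx, dy = rosenbrock_grad(x, y)
    x, y = x - lr * dx, y - lr * dy
```

Vanilla gradient descent makes slow progress along the valley floor, which is precisely the behavior the adaptive optimizers in this library aim to improve on.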

## Contributing

Interested in adding a new optimizer? Start with verifying it is not already implemented or in development, then open a new [feature request](https://github.com/stockeh/mlx-optimizers/issues/new?assignees=&labels=&projects=&template=feature_request.md&title=)! If you spot a bug, please open a [bug report](https://github.com/stockeh/mlx-optimizers/issues/new?assignees=&labels=&projects=&template=bug_report.md&title=).

Developer? See our [contributing](.github/CONTRIBUTING.md) guide.