https://github.com/facebookresearch/optimizers
For optimization algorithm research and development.
- Host: GitHub
- URL: https://github.com/facebookresearch/optimizers
- Owner: facebookresearch
- License: other
- Created: 2022-04-04T23:08:20.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2025-03-31T15:54:57.000Z (3 months ago)
- Last Synced: 2025-03-31T20:05:31.086Z (3 months ago)
- Language: Python
- Homepage:
- Size: 634 KB
- Stars: 502
- Watchers: 17
- Forks: 39
- Open Issues: 10
- Metadata Files:
  - Readme: README.md
  - Contributing: CONTRIBUTING.md
  - License: LICENSE
  - Code of conduct: CODE_OF_CONDUCT.md
# Optimizers
[Python](https://www.python.org/downloads/)
[Tests](https://github.com/facebookresearch/optimizers/actions/workflows/tests.yaml?query=branch%3Amain)
[GPU Tests](https://github.com/facebookresearch/optimizers/actions/workflows/gpu-tests.yaml?query=branch%3Amain)
[Pre-commit](https://github.com/facebookresearch/optimizers/actions/workflows/pre-commit.yaml?query=branch%3Amain)
[Type Check](https://github.com/facebookresearch/optimizers/actions/workflows/type-check.yaml?query=branch%3Amain)
[Examples](https://github.com/facebookresearch/optimizers/actions/workflows/examples.yaml?query=branch%3Amain)
[License](./LICENSE)

*Copyright (c) Meta Platforms, Inc. and affiliates.
All rights reserved.*

## Description
Optimizers is a GitHub repository of PyTorch optimization algorithms. It is designed for external collaboration and development. It currently includes the following optimizers:
- Distributed Shampoo

See the [CONTRIBUTING](CONTRIBUTING.md) file for how to help out.
## License
Optimizers is released under the [BSD license](LICENSE).

## Installation and Dependencies
This code requires `python>=3.12` and `torch>=2.7.0`.
Install `distributed_shampoo` with all dependencies:
```bash
git clone [email protected]:facebookresearch/optimizers.git
cd optimizers
pip install .
```
If you also want to try the [examples](./distributed_shampoo/examples/), replace the last line with `pip install ".[examples]"`.
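A quick way to confirm the installation succeeded is to import the package (a minimal sanity check; the import path matches the usage example below):

```python
# Minimal post-install check: the import should succeed without errors.
from distributed_shampoo import DistributedShampoo

print(DistributedShampoo)
```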
## Usage

After installation, basic usage looks like:
```python
import torch
from distributed_shampoo import AdamGraftingConfig, DistributedShampoo

model = ...  # Instantiate model
optim = DistributedShampoo(
model.parameters(),
lr=1e-3,
betas=(0.9, 0.999),
epsilon=1e-8,
grafting_config=AdamGraftingConfig(
beta2=0.999,
epsilon=1e-8,
),
)
```

For more, please see the [additional documentation here](./distributed_shampoo/README.md) and especially the [How to Use](./distributed_shampoo/README.md#how-to-use) section.
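Since `DistributedShampoo` follows the standard PyTorch optimizer interface, it drops into an ordinary training loop. A minimal single-process sketch, assuming a toy model and synthetic data (the model, loss, and tensors here are placeholders, not part of the library):

```python
import torch
from distributed_shampoo import AdamGraftingConfig, DistributedShampoo

# Hypothetical toy setup; replace with your own model and data.
model = torch.nn.Linear(10, 1)
optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamGraftingConfig(beta2=0.999, epsilon=1e-8),
)

inputs, targets = torch.randn(8, 10), torch.randn(8, 1)
for _ in range(100):
    optim.zero_grad()  # clear gradients from the previous step
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
    loss.backward()    # compute gradients
    optim.step()       # apply the Shampoo update
```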