https://github.com/iceychris/sps-pytorch
:eight_spoked_asterisk: An unofficial implementation of Stochastic Polyak Step-size in PyTorch
- Host: GitHub
- URL: https://github.com/iceychris/sps-pytorch
- Owner: iceychris
- License: MIT
- Created: 2020-04-16T12:06:12.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-04-16T14:39:30.000Z (over 5 years ago)
- Last Synced: 2024-05-22T03:25:42.576Z (over 1 year ago)
- Topics: optimizer, pytorch, reproduction, sota, sps
- Language: Python
- Homepage: https://arxiv.org/abs/2002.10542
- Size: 169 KB
- Stars: 2
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# sps-pytorch
[The authors' official implementation can be found here](https://github.com/IssamLaradji/sps).
This repo aims to reproduce and extend the `SPS` optimizer by Nicolas Loizou, Sharan Vaswani, Issam Laradji, and Simon Lacoste-Julien,
presented in their paper [Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence](https://arxiv.org/abs/2002.10542).

## Results

| RNN-T |
|:-----:|
| *(result plots are not rendered in this mirror; see the repository for the images)* |

## Run
- Clone and patch fastai:
```bash
# clone
git clone https://github.com/fastai/fastai
cd fastai
git checkout 1.0.60
pip install -e ".[dev]"

# patch
git apply fastai.patch
```

- `python train.py --opt sps --epochs 5`
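As background on what `--opt sps` selects: the paper's SPS_max rule sets the step size to γ_k = min( f_i(x_k) / (c · ‖∇f_i(x_k)‖²), γ_b ), assuming the interpolation case f_i* = 0. Below is a minimal NumPy sketch of that rule on a toy least-squares problem; the function name, defaults, and setup are illustrative, not this repo's or the official implementation's API.

```python
import numpy as np

def sps_max_step(loss, grad, c=0.5, gamma_b=1.0, eps=1e-10):
    """SPS_max step size: min(f_i(x) / (c * ||grad f_i(x)||^2), gamma_b).

    Assumes f_i^* = 0 (interpolation), as in the paper's
    over-parameterized setting. Defaults here are illustrative.
    """
    return min(loss / (c * grad @ grad + eps), gamma_b)

# Toy consistent least-squares problem: f_i(x) = 0.5 * (a_i @ x - b_i)^2
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
x_star = rng.normal(size=5)
b = A @ x_star  # consistent system, so interpolation (f_i^* = 0) holds

x = np.zeros(5)
for _ in range(300):
    i = rng.integers(len(A))   # sample one data point
    r = A[i] @ x - b[i]        # residual
    loss = 0.5 * r * r         # f_i(x)
    grad = r * A[i]            # grad of f_i at x
    x -= sps_max_step(loss, grad) * grad

print(np.linalg.norm(x - x_star))  # error shrinks toward 0
```

Note that on this quadratic with c = 0.5, the SPS step reduces to 1/‖a_i‖², i.e., a randomized Kaczmarz-style projection, which is one way to see why the adaptive step needs no learning-rate tuning.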
## Resources
- [Official implementation of `SPS`](https://github.com/IssamLaradji/sps)
- [mgrankin/over9000](https://github.com/mgrankin/over9000) (code snippets)