Polynomial Learning Rate Decay Scheduler for PyTorch
https://github.com/cmpark0126/pytorch-polynomial-lr-decay
deep-learning learning-rate-decay learning-rate-scheduling pytorch pytorch-extension
- Host: GitHub
- URL: https://github.com/cmpark0126/pytorch-polynomial-lr-decay
- Owner: cmpark0126
- License: MIT
- Created: 2019-07-18T04:21:42.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2021-12-25T12:34:59.000Z (about 3 years ago)
- Last Synced: 2023-03-03T04:17:35.717Z (almost 2 years ago)
- Topics: deep-learning, learning-rate-decay, learning-rate-scheduling, pytorch, pytorch-extension
- Language: Python
- Size: 10.7 KB
- Stars: 53
- Watchers: 0
- Forks: 12
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# pytorch-polynomial-lr-decay
Polynomial Learning Rate Decay Scheduler for PyTorch.

This scheduler is frequently used in many deep learning papers, but there is no official implementation in PyTorch, so I propose this code.
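For reference, the schedule follows the standard polynomial-decay rule. The sketch below shows that formula, not the package's exact code; the helper name `poly_lr` and the `base_lr` argument are illustrative, while the remaining parameter names mirror the constructor arguments in the Usage section:

```python
def poly_lr(base_lr, end_learning_rate, step, max_decay_steps, power):
    # Illustrative helper: decay polynomially from base_lr down to
    # end_learning_rate over max_decay_steps steps, then hold constant.
    step = min(step, max_decay_steps)
    decay = (1 - step / max_decay_steps) ** power
    return (base_lr - end_learning_rate) * decay + end_learning_rate
```

With `base_lr=0.1`, `end_learning_rate=0.0001`, `power=2.0`, and `max_decay_steps=100`, the rate falls quadratically from 0.1 at step 0 to 0.0001 at step 100 and stays there.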
## Install
```
$ pip install git+https://github.com/cmpark0126/pytorch-polynomial-lr-decay.git
```

## Usage
```python
from torch_poly_lr_decay import PolynomialLRDecay

# optim is an existing torch.optim optimizer instance
scheduler_poly_lr_decay = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)
for epoch in range(train_epoch):
scheduler_poly_lr_decay.step() # you can handle step as epoch number
...
```

or
```python
from torch_poly_lr_decay import PolynomialLRDecay

scheduler_poly_lr_decay = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)
...
for batch_idx, (inputs, targets) in enumerate(trainloader):
scheduler_poly_lr_decay.step() # also, you can handle step as each iter number
```
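Putting the pieces together, here is a minimal end-to-end sketch using per-epoch stepping; the model, optimizer settings, and random data are placeholders, not part of the original README:

```python
import torch
import torch.nn.functional as F
from torch_poly_lr_decay import PolynomialLRDecay

model = torch.nn.Linear(10, 1)                       # placeholder model
optim = torch.optim.SGD(model.parameters(), lr=0.1)  # placeholder optimizer
scheduler = PolynomialLRDecay(optim, max_decay_steps=100,
                              end_learning_rate=0.0001, power=2.0)

for epoch in range(100):
    scheduler.step()                  # one schedule step per epoch
    inputs = torch.randn(32, 10)      # placeholder batch
    targets = torch.randn(32, 1)
    loss = F.mse_loss(model(inputs), targets)
    optim.zero_grad()
    loss.backward()
    optim.step()
```

Whether `step()` is treated as an epoch counter or an iteration counter only changes where in the loop it is called, as the two examples above show.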