Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/sooftware/pytorch-lr-scheduler
PyTorch implementation of some learning rate schedulers for deep learning researchers.
learning-rate lr plateau reduce scheduler transformer tri-stage warmup
- Host: GitHub
- URL: https://github.com/sooftware/pytorch-lr-scheduler
- Owner: sooftware
- License: mit
- Created: 2021-05-20T17:19:27.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2022-09-06T00:07:36.000Z (over 2 years ago)
- Last Synced: 2024-05-02T02:50:25.169Z (9 months ago)
- Topics: learning-rate, lr, plateau, reduce, scheduler, transformer, tri-stage, warmup
- Language: Python
- Homepage:
- Size: 123 KB
- Stars: 84
- Watchers: 2
- Forks: 17
- Open Issues: 2
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# pytorch-lr-scheduler
PyTorch implementation of some learning rate schedulers for deep learning researchers.
## Usage
### [`WarmupReduceLROnPlateauScheduler`](https://github.com/sooftware/pytorch-lr-scheduler/blob/main/lr_scheduler/warmup_reduce_lr_on_plateau_scheduler.py)
- Visualize
- Example code
```python
import torch

from lr_scheduler.warmup_reduce_lr_on_plateau_scheduler import WarmupReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000
    warmup_steps = 30000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupReduceLROnPlateauScheduler(
        optimizer,
        init_lr=1e-10,
        peak_lr=1e-4,
        warmup_steps=warmup_steps,
        patience=1,
        factor=0.3,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            if timestep < warmup_steps:
                scheduler.step()  # warmup phase: step once per training step

        val_loss = validate()  # placeholder for your validation loop
        scheduler.step(val_loss)  # plateau phase: step with the validation metric
```
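
The "Visualize" bullets in each section correspond to learning-rate curve plots in the original README. A minimal sketch for reproducing such a curve yourself is shown below; it is not part of the library, and it only assumes that the scheduler updates the optimizer's learning rate on each `step()` call (as in the example above) and that `matplotlib` is installed.

```python
import matplotlib.pyplot as plt  # assumption: matplotlib is available for plotting
import torch

from lr_scheduler.warmup_reduce_lr_on_plateau_scheduler import WarmupReduceLROnPlateauScheduler

model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
optimizer = torch.optim.Adam(model, 1e-10)
scheduler = WarmupReduceLROnPlateauScheduler(
    optimizer, init_lr=1e-10, peak_lr=1e-4, warmup_steps=30000, patience=1, factor=0.3,
)

lrs = []
for step in range(30000):
    scheduler.step()                             # warmup phase: one call per training step
    lrs.append(optimizer.param_groups[0]["lr"])  # record the learning rate the scheduler just set

plt.plot(lrs)
plt.xlabel("step")
plt.ylabel("learning rate")
plt.savefig("warmup_curve.png")
```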
### [`TransformerLRScheduler`](https://github.com/sooftware/pytorch-lr-scheduler/blob/main/lr_scheduler/transformer_lr_scheduler.py)
- Visualize
- Example code
```python
import torch

from lr_scheduler.transformer_lr_scheduler import TransformerLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TransformerLRScheduler(
        optimizer=optimizer,
        init_lr=1e-10,
        peak_lr=0.1,
        final_lr=1e-4,
        final_lr_scale=0.05,
        warmup_steps=3000,
        decay_steps=17000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            scheduler.step()
```
### [`TriStageLRScheduler`](https://github.com/sooftware/pytorch-lr-scheduler/blob/main/lr_scheduler/tri_stage_lr_scheduler.py)
- Visualize
- Example code
```python
import torch

from lr_scheduler.tri_stage_lr_scheduler import TriStageLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TriStageLRScheduler(
        optimizer,
        init_lr=1e-10,
        peak_lr=1e-4,
        final_lr=1e-7,
        init_lr_scale=0.01,
        final_lr_scale=0.05,
        warmup_steps=30000,
        hold_steps=70000,
        decay_steps=100000,
        total_steps=200000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            scheduler.step()
```
### [`ReduceLROnPlateauScheduler`](https://github.com/sooftware/pytorch-lr-scheduler/blob/main/lr_scheduler/reduce_lr_on_plateau_lr_scheduler.py)
- Visualize
- Example code
```python
import torch

from lr_scheduler.reduce_lr_on_plateau_lr_scheduler import ReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-4)

    scheduler = ReduceLROnPlateauScheduler(optimizer, patience=1, factor=0.3)

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...

        val_loss = validate()
        scheduler.step(val_loss)
```
### [`WarmupLRScheduler`](https://github.com/sooftware/pytorch-lr-scheduler/blob/main/lr_scheduler/warmup_lr_scheduler.py)
- Visualize
- Example code
```python
import torch

from lr_scheduler.warmup_lr_scheduler import WarmupLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupLRScheduler(
        optimizer,
        init_lr=1e-10,
        peak_lr=1e-4,
        warmup_steps=4000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            scheduler.step()
```
## Installation
```bash
git clone [email protected]:sooftware/pytorch-lr-scheduler.git
cd pytorch-lr-scheduler
pip install .
```
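
A quick way to check that the install worked is to import one of the schedulers and take a step; this is a sketch, assuming the installed package exposes the same module paths used in the examples above.

```python
import torch
from lr_scheduler.warmup_lr_scheduler import WarmupLRScheduler

optimizer = torch.optim.Adam([torch.nn.Parameter(torch.randn(2, 2))], lr=1e-10)
scheduler = WarmupLRScheduler(optimizer, init_lr=1e-10, peak_lr=1e-4, warmup_steps=4000)
scheduler.step()
print(optimizer.param_groups[0]['lr'])  # expect a value between init_lr and peak_lr during warmup
```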
## Troubleshooting and Contributing
If you have any questions, bug reports, or feature requests, please [open an issue](https://github.com/sooftware/pytorch-lr-scheduler/issues) on GitHub.
I appreciate any kind of feedback or contribution. Feel free to send pull requests for small issues such as bug fixes or documentation improvements. For major contributions and new features, please discuss them with the collaborators in the corresponding issues.
## Code Style
I follow [PEP-8](https://www.python.org/dev/peps/pep-0008/) for code style. The docstring style is especially important, because the documentation is generated from docstrings.
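
For illustration only (this helper is not from the repository), a function documented in a Google-style docstring layout, which tools such as Sphinx with the napoleon extension can turn into documentation, might look like this:

```python
def get_lr(optimizer):
    """Return the current learning rate of the first parameter group.

    Args:
        optimizer (torch.optim.Optimizer): Optimizer whose learning rate is read.

    Returns:
        float: The learning rate currently set on ``optimizer``.
    """
    return optimizer.param_groups[0]['lr']
```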
## License
This project is licensed under the MIT License - see the [LICENSE.md](https://github.com/sooftware/pytorch-lr-scheduler/blob/master/LICENSE) file for details.