https://github.com/kozistr/pytorch_optimizer
Optimizer, LR scheduler, and loss function collections in PyTorch
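As a quick orientation before the metadata below, here is a minimal usage sketch. It assumes the package installs as `pip install pytorch-optimizer` and exports optimizer classes such as `AdamP` at the top level (AdamP appears among the repo topics); the tiny model and training loop are illustrative stand-ins, not code from the repository.

```python
# Minimal sketch: assumes `pip install pytorch-optimizer` and that AdamP is
# exported at the package top level (it is listed among the repo topics).
import torch
from pytorch_optimizer import AdamP

model = torch.nn.Linear(10, 2)                  # stand-in for any torch.nn.Module
optimizer = AdamP(model.parameters(), lr=1e-3)  # drop-in for a torch.optim optimizer

# These optimizers follow the torch.optim interface, so standard
# PyTorch LR schedulers compose with them as usual.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

for _ in range(3):                              # toy training steps
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
    scheduler.step()
```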
- Host: GitHub
- URL: https://github.com/kozistr/pytorch_optimizer
- Owner: kozistr
- License: apache-2.0
- Created: 2021-09-21T05:52:24.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2025-03-24T06:27:03.000Z (about 1 month ago)
- Last Synced: 2025-03-24T07:25:36.622Z (about 1 month ago)
- Topics: adabelief, adabound, adai, adamd, adamp, adan, ademamix, deep-learning, diffgrad, gradient-centralization, learning-rate-scheduling, lookahead, loss-functions, madgrad, nero, optimizer, pytorch, radam, ranger, sam
- Language: Python
- Homepage: https://pytorch-optimizers.readthedocs.io/en/latest/
- Size: 285 MB
- Stars: 280
- Watchers: 6
- Forks: 24
- Open Issues: 8
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Citation: CITATION.cff
- Codeowners: .github/CODEOWNERS
- Security: SECURITY.md