https://github.com/evanatyourservice/rectified-diffgrad-tf
Rectified Adam optimizer with diffGrad Optimizer modification
- Host: GitHub
- URL: https://github.com/evanatyourservice/rectified-diffgrad-tf
- Owner: evanatyourservice
- Created: 2020-01-02T03:18:29.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-01-04T01:24:20.000Z (over 5 years ago)
- Last Synced: 2025-01-27T06:44:41.800Z (6 months ago)
- Language: Python
- Homepage:
- Size: 2.93 KB
- Stars: 2
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Rectified-diffGrad-tf
This is a mix of the diffGrad optimizer from the paper [diffGrad: An Optimization Method for
Convolutional Neural Networks](https://arxiv.org/abs/1909.11015) and the Rectified Adam optimizer from the paper
[On the Variance of the Adaptive Learning Rate and Beyond](https://arxiv.org/abs/1908.03265), implemented in TensorFlow.

This uses version one of diffGrad from their paper. If you'd like version two, simply remove `tf.math.abs` from line 98 (making it
`1.0 / (1.0 + tf.math.exp(-(prev_g - grad)))`). Versions 3-5 are not implemented, but contributions are welcome.
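For reference, here is a minimal sketch (not this repository's code) of how the diffGrad friction coefficient modifies an Adam-style step. The function and variable names (`diffgrad_update`, `prev_grad`, `m_hat`, `v_hat`, `lr`) are illustrative only:

```python
import tensorflow as tf

# Sketch of the diffGrad friction coefficient applied to an Adam-style update.
# m_hat and v_hat are assumed to be the bias-corrected first and second moments.
def diffgrad_update(param, grad, prev_grad, m_hat, v_hat, lr=1e-3, eps=1e-8):
    # diffGrad "version 1": a sigmoid of the absolute change in the gradient
    # between consecutive steps.
    dfc = 1.0 / (1.0 + tf.math.exp(-tf.math.abs(prev_grad - grad)))
    # "Version 2" would drop the abs:
    # dfc = 1.0 / (1.0 + tf.math.exp(-(prev_grad - grad)))
    # The friction coefficient damps the first moment, so the step shrinks
    # (toward half the Adam step) when consecutive gradients are similar.
    return param - lr * dfc * m_hat / (tf.math.sqrt(v_hat) + eps)

# Example usage with dummy tensors:
p = tf.constant([1.0, 2.0])
new_p = diffgrad_update(p,
                        grad=tf.constant([0.1, -0.2]),
                        prev_grad=tf.constant([0.05, -0.1]),
                        m_hat=tf.constant([0.08, -0.15]),
                        v_hat=tf.constant([0.01, 0.02]))
```

In the rectified variant, the adaptive denominator above is further gated by RAdam's variance rectification term; see the RAdam paper linked above for that derivation.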