https://github.com/uranusx86/addernet-on-tensorflow
An implementation of a neural network with almost no multiplications, in TensorFlow 2 (AdderNet)
- Host: GitHub
- URL: https://github.com/uranusx86/addernet-on-tensorflow
- Owner: uranusx86
- License: mit
- Created: 2023-01-28T13:58:36.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-02-19T10:58:29.000Z (about 2 years ago)
- Last Synced: 2025-01-21T08:23:43.326Z (4 months ago)
- Topics: adder, addernet, ai, artificial-intelligence, cnn, custom-layer, deep-learning, deep-neural-networks, image-classification, implementation, implementation-of-research-paper, keras, lightweight, mnist, neural-network, tensorflow, tensorflow2
- Language: Python
- Homepage:
- Size: 8.79 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# AdderNet-on-tensorflow
An implementation of L1-norm Convolution and Dense layers in TensorFlow 2 (AdderNet). This project is still under development.
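For context, an adder-style layer replaces the dot product with a negative L1 distance between inputs and weights. A minimal sketch of such a Dense layer, assuming standard Keras conventions (the layer name and details below are illustrative, not taken from this repository):

```python
import tensorflow as tf

class AdderDense(tf.keras.layers.Layer):
    """Sketch of an AdderNet-style dense layer: output is the negative
    L1 distance between the input and each weight column, instead of
    a dot product."""

    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        self.w = self.add_weight(
            name="w",
            shape=(input_shape[-1], self.units),
            initializer="glorot_uniform",
            trainable=True,
        )

    def call(self, x):
        # Broadcast x (batch, in, 1) against w (in, units), then sum
        # |x_i - w_ij| over the input dimension and negate.
        diff = tf.abs(tf.expand_dims(x, -1) - self.w)  # (batch, in, units)
        return -tf.reduce_sum(diff, axis=1)            # (batch, units)
```

Because the output is a negated sum of absolute values, every activation is non-positive, which is why AdderNet pairs these layers with batch normalization.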
# Requirement
* tensorflow 2
* numpy
# Work log
Initially, the training accuracy was 96% and the testing accuracy was 95.6% on the MNIST dataset.
[2023-02-03] Added the full-precision gradient; the training accuracy is 97.4% and the testing accuracy is 96.9% on the MNIST dataset.
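The full-precision gradient follows the AdderNet paper: the forward pass computes the negative L1 distance, but the backward pass replaces the sign-based derivative with the raw difference (clipped for the input path). A sketch using `tf.custom_gradient`; the function name is hypothetical and elementwise for clarity:

```python
import tensorflow as tf

# Forward: y = -|x - w|. Backward: instead of the sign-based derivative,
# use the full-precision difference (x - w) for w, and its clipped
# counterpart for x, as proposed in the AdderNet paper.
@tf.custom_gradient
def l1_match(x, w):
    y = -tf.abs(x - w)
    def grad(dy):
        dx = dy * tf.clip_by_value(w - x, -1.0, 1.0)  # clipped full-precision
        dw = dy * (x - w)                             # full-precision
        return dx, dw
    return y, grad
```

Using the raw difference gives the weight update a magnitude signal that the sign function discards.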
[2023-02-04] Added gradient clipping; the training accuracy is 97.1% and the testing accuracy is 96.4% on the MNIST dataset.
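Gradient clipping can be done per-tensor in TensorFlow with `tf.clip_by_norm`; the threshold below is illustrative, not the repository's actual setting:

```python
import tensorflow as tf

grad = tf.constant([3.0, 4.0])        # L2 norm = 5
clipped = tf.clip_by_norm(grad, 1.0)  # rescaled so the norm is at most 1
```

The same effect can be wired into a Keras optimizer via its `clipnorm` argument.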
[2023-02-16] Added a cosine learning-rate schedule; the training accuracy is 99.97% and the testing accuracy is 98.87% on the MNIST dataset.
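A cosine learning-rate schedule is available in Keras out of the box; the initial rate and decay steps below are placeholders, not the repository's settings:

```python
import tensorflow as tf

# Decays from 0.1 toward 0 over 1000 steps along a half cosine.
schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=0.1, decay_steps=1000)
```

The schedule object can be passed directly as the `learning_rate` of any Keras optimizer.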
[2023-02-18] Added adaptive gradient scaling, changed the optimizer from Adam to NAG, and increased the number of epochs to 50; the training accuracy is 99.99% and the testing accuracy is 98.94% on the MNIST dataset.
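NAG in Keras is plain SGD with Nesterov momentum enabled, and the adaptive gradient scaling from the AdderNet paper rescales each adder layer's gradient so its L2 norm equals sqrt(k), where k is the element count. The hyperparameter values and helper name below are illustrative, not taken from this repository:

```python
import tensorflow as tf

# NAG: SGD with Nesterov momentum (values here are common defaults).
opt = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9, nesterov=True)

# Adaptive gradient scaling as in the AdderNet paper: normalize the
# gradient's L2 norm to sqrt(k), k = number of elements in the tensor.
def adaptive_scale(grad, eps=1e-12):
    k = tf.cast(tf.size(grad), grad.dtype)
    return grad * tf.sqrt(k) / (tf.norm(grad) + eps)
```

Applied per adder layer, this keeps gradient magnitudes comparable across layers of different sizes.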