Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Unofficial PyTorch Implementation of EvoNorm
https://github.com/digantamisra98/evonorm
computer-vision deep-learning nas neural-architecture-search neural-networks normalization
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/digantamisra98/evonorm
- Owner: digantamisra98
- License: MIT
- Created: 2020-04-08T09:21:35.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2021-08-29T16:48:01.000Z (over 3 years ago)
- Last Synced: 2024-12-24T13:42:11.637Z (about 2 months ago)
- Topics: computer-vision, deep-learning, nas, neural-architecture-search, neural-networks, normalization
- Language: Python
- Homepage: https://arxiv.org/abs/2004.02967
- Size: 381 KB
- Stars: 121
- Watchers: 5
- Forks: 17
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Evolving Normalization-Activation Layers
*Google AI and DeepMind*
- [x] Implement EvoNorm S0 and B0 with training-mode support
- [x] Fix shape error in the group_std and instance_std functions
- [x] Fix NaN issue in S0
- [x] Fix shape error in the running-variance calculation of EvoNorm B0
- [x] Fix NaN issue in B0
Figure 1. Left: Computation graph of a searched normalization activation layer that is batch-independent, named EvoNorm-S0. Right: ResNet-50 results with EvoNorm-S0 as the batch size over 8 workers varies from 1024 to 32 on ImageNet. EvoNorm-S0 also outperforms both BN and GN-based layers on MobileNetV2 and Mask R-CNN.
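In the paper, EvoNorm-S0 is the batch-independent layer `y = x * sigmoid(v * x) / group_std(x) * gamma + beta`, where the standard deviation is taken over groups of channels and all spatial positions. As a rough illustration (not this repository's exact code), the sketch below writes that computation with plain PyTorch ops; the group count, epsilon, and tensor shapes are assumptions made for the example.

```python
# Illustrative sketch of the EvoNorm-S0 computation from the paper
# (not this repository's exact implementation). Assumes NCHW tensors and
# a channel count divisible by the group count.
import torch

def group_std(x, groups=32, eps=1e-5):
    # Standard deviation over each channel group and all spatial positions.
    n, c, h, w = x.shape
    xg = x.view(n, groups, c // groups, h, w)
    var = xg.var(dim=(2, 3, 4), keepdim=True)      # (n, groups, 1, 1, 1)
    std = torch.sqrt(var + eps).expand_as(xg)      # broadcast back over each group
    return std.reshape(n, c, h, w)

def evonorm_s0(x, gamma, beta, v, groups=32):
    # EvoNorm-S0: y = x * sigmoid(v * x) / group_std(x) * gamma + beta
    return x * torch.sigmoid(v * x) / group_std(x, groups) * gamma + beta

# gamma, beta and v are learnable per-channel parameters in the real layer;
# plain tensors are used here to keep the sketch short.
x = torch.randn(8, 64, 32, 32)
gamma, beta, v = torch.ones(1, 64, 1, 1), torch.zeros(1, 64, 1, 1), torch.ones(1, 64, 1, 1)
y = evonorm_s0(x, gamma, beta, v)                  # same shape as x
```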
## Usage:
```python
from evonorm2d import EvoNorm2D

# `input` is the number of channels of the incoming feature map
input = 64

# For B0 version
evoB0 = EvoNorm2D(input, affine=True, version='B0', training=True)

# For S0 version
evoS0 = EvoNorm2D(input)
```
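Assuming the first constructor argument is the channel count of the incoming feature map (as in the snippet above) and that `EvoNorm2D` behaves like any other `nn.Module`, the layer can be dropped into a model in place of the usual BatchNorm2d + ReLU pair, for example:

```python
import torch
from torch import nn
from evonorm2d import EvoNorm2D

# Hypothetical conv block where EvoNorm2D replaces BatchNorm2d + ReLU.
block = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1, bias=False),
    EvoNorm2D(64, version='S0'),                      # 64 channels from the conv above
    nn.Conv2d(64, 64, kernel_size=3, padding=1, bias=False),
    EvoNorm2D(64, version='B0', affine=True, training=True),
)

out = block(torch.randn(4, 3, 32, 32))               # -> (4, 64, 32, 32)
```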