Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/titu1994/keras-global-context-networks
Keras implementation of Global Context Attention blocks
global-context keras tensorflow
- Host: GitHub
- URL: https://github.com/titu1994/keras-global-context-networks
- Owner: titu1994
- License: mit
- Created: 2019-04-28T18:52:42.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2019-04-29T00:47:03.000Z (over 5 years ago)
- Last Synced: 2024-10-13T14:17:11.722Z (24 days ago)
- Topics: global-context, keras, tensorflow
- Language: Python
- Size: 351 KB
- Stars: 46
- Watchers: 7
- Forks: 12
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Keras Global Context Attention Blocks
Keras implementation of the Global Context block from the paper [GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond](https://arxiv.org/abs/1904.11492).
Supports Conv1D, Conv2D and Conv3D directly with no modifications.
# Usage
Import `global_context_block` from `gc.py` and provide it a tensor as input.
```python
from gc import global_context_block

ip = Input(...)
x = ConvND(...)(ip)

# apply Global Context
x = global_context_block(x, reduction_ratio=16, transform_activation='linear')
...
```

# Parameters
There are just two parameters to manage:
```
- reduction_ratio: The ratio to scale the transform block.
- transform_activation: The activation function prior to addition of the input with the context.
The paper uses no activation, but `sigmoid` may do better.
```

# Requirements
- Keras 2.2.4+
- Tensorflow (1.13+) or CNTK
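For reference, here is a minimal, hedged sketch of a 2D global context block as described in the GCNet paper, written against TF 2.x Keras (the repo itself targets Keras 2.2.4 / TF 1.x, so its `gc.py` will differ in detail). The function name `global_context_block_2d` is illustrative, not the repo's API; `reduction_ratio` and `transform_activation` play the roles described above:

```python
import tensorflow as tf
from tensorflow.keras import layers

def global_context_block_2d(x, reduction_ratio=16, transform_activation='linear'):
    """Sketch of a 2D global context block (GCNet, arXiv:1904.11492)."""
    channels = x.shape[-1]
    # Context modelling: a 1x1 conv scores each position, softmax over H*W
    attn = layers.Conv2D(1, 1)(x)                        # (B, H, W, 1)
    attn = layers.Reshape((-1, 1))(attn)                 # (B, HW, 1)
    attn = layers.Softmax(axis=1)(attn)
    feats = layers.Reshape((-1, channels))(x)            # (B, HW, C)
    # Attention-weighted sum over positions -> one global context vector
    context = layers.Dot(axes=(1, 1))([feats, attn])     # (B, C, 1)
    context = layers.Reshape((1, 1, channels))(context)  # (B, 1, 1, C)
    # Transform: channel bottleneck (scaled by reduction_ratio) with
    # LayerNorm + ReLU in between, then restore the channel count
    t = layers.Conv2D(channels // reduction_ratio, 1)(context)
    t = layers.LayerNormalization()(t)
    t = layers.Activation('relu')(t)
    t = layers.Conv2D(channels, 1, activation=transform_activation)(t)
    # Fusion: broadcast-add the transformed context onto every position
    return layers.Add()([x, t])
```

The repo's `global_context_block` additionally handles 1D and 3D inputs; the sketch above fixes 2D for brevity. Note that the output shape matches the input, so the block can be dropped between any two ConvND layers.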