Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Keras implementation of Attention Augmented Convolutional Neural Networks
https://github.com/titu1994/keras-attention-augmented-convs
attention-augmented-conv keras-tensorflow tensorflow
- Host: GitHub
- URL: https://github.com/titu1994/keras-attention-augmented-convs
- Owner: titu1994
- License: mit
- Created: 2019-04-28T16:37:35.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-03-06T16:03:01.000Z (over 4 years ago)
- Last Synced: 2024-10-13T14:17:04.277Z (about 1 month ago)
- Topics: attention-augmented-conv, keras-tensorflow, tensorflow
- Language: Python
- Size: 250 KB
- Stars: 120
- Watchers: 6
- Forks: 38
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Keras Attention Augmented Convolutions
A Keras (TensorFlow only) wrapper over the Attention Augmentation module from the paper [Attention Augmented Convolutional Networks](https://arxiv.org/abs/1904.09925).
Provides a Layer for Attention Augmentation as well as a callable function to build an augmented convolution block.
# Usage
It is advisable to use the `augmented_conv2d(...)` function directly to build an attention augmented convolution block.
```python
from tensorflow.keras.layers import Input

from attn_augconv import augmented_conv2d

ip = Input(...)
x = augmented_conv2d(ip, ...)
...
```
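For a more concrete picture, the sketch below builds a tiny classifier around a single attention augmented block. The keyword names and values (`filters`, `kernel_size`, `depth_k`, `depth_v`, `num_heads`) follow the paper's notation and the snippets in this README, but they are illustrative assumptions; check the `augmented_conv2d` signature in the `attn_augconv` module for the exact arguments and defaults.

```python
from tensorflow.keras.layers import Input, GlobalAveragePooling2D, Dense
from tensorflow.keras.models import Model

from attn_augconv import augmented_conv2d

ip = Input(shape=(32, 32, 3))

# Hypothetical hyperparameters: depth_k / depth_v are the key / value channels of the
# attention branch, split across num_heads heads (so both should be divisible by num_heads).
x = augmented_conv2d(ip, filters=40, kernel_size=(3, 3),
                     depth_k=8, depth_v=8, num_heads=4)

x = GlobalAveragePooling2D()(x)
x = Dense(10, activation='softmax')(x)

model = Model(ip, x)
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
model.summary()
```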
If you wish to add the attention module separately, you can do so using the `AttentionAugmentation2D` layer as well.
```python
from tensorflow.keras.layers import Input, Conv2D

from attn_augconv import AttentionAugmentation2D

ip = Input(...)
# depth_k, depth_v and num_heads are the attention hyper-parameters you choose.
# Make sure the input to the AttentionAugmentation2D layer has (2 * depth_k + depth_v) filters.
x = Conv2D(2 * depth_k + depth_v, ...)(ip)
x = AttentionAugmentation2D(depth_k, depth_v, num_heads)(x)
...
```
# Requirements
- TensorFlow 2.0+
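If you only want to sanity-check the layer-level route shown above, a minimal shape check might look like the following. The sizes are arbitrary and the constructor argument order simply mirrors the snippet above; both are assumptions to verify against the source.

```python
import numpy as np
from tensorflow.keras.layers import Input, Conv2D
from tensorflow.keras.models import Model

from attn_augconv import AttentionAugmentation2D

depth_k, depth_v, num_heads = 16, 16, 8  # arbitrary example sizes

ip = Input(shape=(32, 32, 3))
# The projection feeding the attention layer must have exactly
# 2 * depth_k + depth_v channels (queries + keys + values), i.e. 48 here.
qkv = Conv2D(2 * depth_k + depth_v, (1, 1), padding='same')(ip)
attn = AttentionAugmentation2D(depth_k, depth_v, num_heads)(qkv)

model = Model(ip, attn)
model.summary()

x = np.random.rand(4, 32, 32, 3).astype('float32')
print(model.predict(x).shape)  # inspect the attention output shape
```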