Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

https://github.com/robflynnyh/hydra-linear-attention

Implementation of: Hydra Attention: Efficient Attention with Many Heads (https://arxiv.org/abs/2209.07484)

attention efficient-attention linear-attention machine-learning transformers

Last synced: 9 days ago

Awesome Lists containing this project

README

# hydra-linear-attention
Implementation of Hydra Attention, as described in this paper: https://arxiv.org/pdf/2209.07484.pdf

- The code is mostly taken from the appendix of the paper; it's pretty simple.
- Basically it's linear attention with the number of heads equal to the feature dimension; L2 normalization is used as the kernel function instead of softmax, which is what lets the "head" count scale up that far and makes it faster (see the sketch below).
- I'm not sure it's helpful to describe this as similar to regular attention; I see it as closer to something like squeeze-and-excite layers.
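
For reference, here's a minimal sketch of the operation based on the paper's description (the tensor shapes and the `hydra_attention` name are just illustrative; the actual repo code may differ):

```python
import torch

def hydra_attention(q, k, v):
    """Hydra attention sketch: q, k, v have shape (batch, tokens, dim).

    Each feature dimension acts as its own head; L2 normalization
    (a cosine-similarity kernel) replaces softmax, so the token
    dimension can be summed out first and the cost is linear in
    sequence length instead of quadratic.
    """
    q = q / q.norm(dim=-1, keepdim=True)      # normalize queries along the feature dim
    k = k / k.norm(dim=-1, keepdim=True)      # normalize keys along the feature dim
    kv = (k * v).sum(dim=-2, keepdim=True)    # mix over tokens -> (batch, 1, dim)
    return q * kv                             # gate each token by the global summary

# usage sketch
x = torch.randn(2, 128, 64)                   # (batch, tokens, dim)
out = hydra_attention(x, x, x)
print(out.shape)                              # torch.Size([2, 128, 64])
```

The key point is that `(k * v)` is summed over tokens before touching `q`, which is why the "attention" here looks more like a global gating/squeeze-and-excite style operation than a pairwise token-to-token interaction.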