Projects in Awesome Lists tagged with dot-product-attention
A curated list of projects in awesome lists tagged with dot-product-attention.
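For context, the tag refers to the scaled dot-product attention of "Attention Is All You Need": softmax(QK^T / sqrt(d_k)) V. A minimal pure-Python sketch (illustrative only; the listed repos provide full PyTorch/TensorFlow implementations):

```python
import math

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V on plain lists of row vectors."""
    d_k = len(K[0])
    # Q K^T / sqrt(d_k): one row of similarity scores per query
    scores = [[sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
               for k in K] for q in Q]
    # Row-wise softmax, stabilized by subtracting the row max
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights.append([e / z for e in exps])
    # Output = attention-weighted sum of the value vectors
    return [[sum(w * v[j] for w, v in zip(row, V))
             for j in range(len(V[0]))] for row in weights]

# Tiny example: 2 queries, 3 keys/values, dimension 2
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out = scaled_dot_product_attention(Q, K, V)
```

Because each output row is a convex combination of the value vectors, every component stays within the range of the corresponding value column.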
https://github.com/sooftware/attentions
PyTorch implementations of several attention mechanisms for deep-learning researchers.
additive-attention attention dot-product-attention location-aware-attention location-sensitive-attension multi-head-attention pytorch relative-multi-head-attention relative-positional-encoding
Last synced: 05 Apr 2025
https://github.com/mtanghu/leap
LEAP: Linear Explainable Attention in Parallel for causal language modeling with O(1) path length and O(1) inference.
additive-attention attention-mechanism deep-learning dot-product-attention linear-attention local-attention parallel pytorch rnn softmax transformer transformers
Last synced: 20 Mar 2025
https://github.com/andreimoraru123/neural-machine-translation
Modern Eager TensorFlow implementation of Attention Is All You Need
attention beam-search bleu-score byte-pair-encoding deep-learning dot-product-attention einops embedding-projector embeddings encoder-decoder keras label-smoothing language language-model nlp self-attention tensorflow tokenization transformers translation
Last synced: 18 Jan 2025
https://github.com/dcarpintero/transformer101
Annotated vanilla implementation in PyTorch of the Transformer model introduced in 'Attention Is All You Need'.
attention-is-all-you-need dot-product-attention dropout-layers encoder-decoder-architecture feedforward-neural-network gelu linear-layers multihead-attention normalization-layers positional-encoding pytorch self-attention softmax transfomer
Last synced: 14 Mar 2025