https://github.com/monk1337/various-attention-mechanisms
This repository contains various attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
- Host: GitHub
- URL: https://github.com/monk1337/various-attention-mechanisms
- Owner: monk1337
- Created: 2018-07-04T20:16:17.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2021-09-23T07:49:21.000Z (over 3 years ago)
- Last Synced: 2025-03-24T02:13:15.877Z (3 months ago)
- Topics: attention, attention-lstm, attention-mechanism, attention-mechanisms, attention-model, attention-network, bahdanau-attention, hierarchical-attention, keras, luong-attention, multi-head-attention, pytorch, scaled-dot-product-attention, self-attention, sentence-attention
- Language: Python
- Homepage:
- Size: 643 KB
- Stars: 125
- Watchers: 5
- Forks: 25
- Open Issues: 0
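Among the topics listed above is scaled-dot-product-attention. As a minimal illustrative sketch (not code from this repository, which uses PyTorch/TensorFlow/Keras; this is a framework-agnostic NumPy version), the mechanism computes softmax(QK^T / sqrt(d_k)) V:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    # similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)  # shape (..., L_q, L_k)
    # numerically stable softmax over the key axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # weighted sum of values
    return weights @ v, weights

# toy example: 1 query, 3 keys/values, model dimension 4 (all values illustrative)
q = np.ones((1, 4))
k = np.eye(3, 4)
v = np.arange(12.0).reshape(3, 4)
out, w = scaled_dot_product_attention(q, k, v)
print(w.shape, out.shape)  # (1, 3) (1, 4)
```

The attention weights sum to 1 over the keys, so the output is a convex combination of the value rows.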