https://github.com/muntahashams/attention-basics
In this notebook, we look at how attention is implemented. We will focus on implementing attention in isolation from a larger model.
- Host: GitHub
- URL: https://github.com/muntahashams/attention-basics
- Owner: MuntahaShams
- Created: 2020-04-19T17:40:39.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-04-19T17:42:09.000Z (over 5 years ago)
- Last Synced: 2025-01-13T11:30:09.393Z (9 months ago)
- Topics: attention, machine-learning
- Language: Jupyter Notebook
- Homepage:
- Size: 115 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
README
# Attention-Basics
In this notebook, we look at how attention is implemented. We will focus on implementing attention in isolation from a larger model, because when implementing attention in a real-world model, much of the effort goes into piping the data and juggling the various vectors rather than into the concepts of attention themselves. We will implement attention scoring as well as calculating an attention context vector.
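As a rough sketch of those two steps, here is a minimal NumPy implementation assuming simple dot-product scoring followed by a softmax; the function name `attention` and the toy shapes are illustrative, not taken from the repository's notebook:

```python
import numpy as np

def attention(query, annotations):
    """Dot-product attention over a set of annotations.

    query:       shape (d,)   - e.g. a decoder hidden state
    annotations: shape (n, d) - e.g. encoder hidden states
    Returns the attention weights (n,) and the context vector (d,).
    """
    # 1. Attention scoring: dot product between the query and each annotation.
    scores = annotations @ query                # shape (n,)

    # 2. Softmax the scores into a probability distribution
    #    (subtracting the max for numerical stability).
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # shape (n,)

    # 3. Context vector: the attention-weighted sum of the annotations.
    context = weights @ annotations             # shape (d,)
    return weights, context

# Toy usage with random vectors.
rng = np.random.default_rng(0)
query = rng.normal(size=4)
annotations = rng.normal(size=(3, 4))
weights, context = attention(query, annotations)
print(weights, context)
```

The weights sum to 1, so the context vector is a convex combination of the annotations: annotations that score higher against the query contribute more to it.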