https://github.com/hazyresearch/flash-attention
Fast and memory-efficient exact attention
- Host: GitHub
- URL: https://github.com/hazyresearch/flash-attention
- Owner: Dao-AILab
- License: BSD-3-Clause
- Created: 2022-05-19T21:22:06.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2024-07-01T05:40:59.000Z (10 months ago)
- Last Synced: 2024-07-01T17:24:57.693Z (10 months ago)
- Language: Python
- Homepage:
- Size: 7.08 MB
- Stars: 11,791
- Watchers: 103
- Forks: 1,044
- Open Issues: 441
Metadata Files:
- Readme: README.md
- License: LICENSE
- Authors: AUTHORS
Awesome Lists containing this project
- awesome-ChatGPT-repositories - flash-attention - Fast and memory-efficient exact attention (Others)
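The library computes exact attention blockwise, keeping a running max and normalizer so the full score matrix is never materialized. As a rough illustration of that online-softmax idea (a NumPy sketch of the algorithm, not the library's CUDA implementation or its API):

```python
import numpy as np

def attention_naive(q, k, v):
    # Reference: softmax(Q K^T / sqrt(d)) V, materializing the full score matrix.
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def attention_online(q, k, v, block=4):
    # Blockwise exact attention: visit K/V in tiles, tracking a running
    # row max `m` and normalizer `l`, rescaling past accumulators as the
    # max grows. The n x n score matrix never exists in full.
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros((n, v.shape[-1]))
    m = np.full((n, 1), -np.inf)   # running row max
    l = np.zeros((n, 1))           # running softmax normalizer
    for start in range(0, k.shape[0], block):
        kb, vb = k[start:start + block], v[start:start + block]
        s = q @ kb.T * scale
        m_new = np.maximum(m, s.max(axis=-1, keepdims=True))
        alpha = np.exp(m - m_new)  # rescale factor for old accumulators
        p = np.exp(s - m_new)
        l = alpha * l + p.sum(axis=-1, keepdims=True)
        out = alpha * out + p @ vb
        m = m_new
    return out / l
```

Both functions return the same result; the blockwise version just trades one large intermediate for per-tile work, which is what makes the kernel memory-efficient.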