Attention kernels
https://github.com/danieldk/attention-kernels
- Host: GitHub
- URL: https://github.com/danieldk/attention-kernels
- Owner: danieldk
- License: apache-2.0
- Created: 2024-11-15T10:31:40.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-29T13:30:50.000Z (11 months ago)
- Last Synced: 2025-01-29T14:30:24.521Z (11 months ago)
- Language: Cuda
- Size: 82 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## attention-kernels
`attention-kernels` is a standalone package containing the paged attention and
cache-reshape kernels from [vLLM](https://github.com/vllm-project/vllm), with modifications for TGI (Text Generation Inference).
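To illustrate what these kernels do at a high level, here is a minimal NumPy sketch of the paged KV-cache idea behind them: the cache is divided into fixed-size blocks, a per-sequence block table maps logical token positions to physical blocks, and a reshape-and-cache step scatters new keys into their slots. All names, shapes, and sizes below are illustrative assumptions, not the actual kernel API.

```python
import numpy as np

BLOCK_SIZE = 4   # tokens per cache block (assumed toy value)
NUM_BLOCKS = 8   # total physical blocks in the cache
HEAD_DIM = 2     # per-head embedding size (toy value)

# Physical key cache: [num_blocks, block_size, head_dim]
key_cache = np.zeros((NUM_BLOCKS, BLOCK_SIZE, HEAD_DIM))

def reshape_and_cache(key, slot_mapping, key_cache):
    """Scatter new keys into the paged cache (the cache-reshape step).

    key:          [num_tokens, head_dim] new keys to store
    slot_mapping: [num_tokens] flat physical slot index per token
    """
    for token_idx, slot in enumerate(slot_mapping):
        block = slot // BLOCK_SIZE
        offset = slot % BLOCK_SIZE
        key_cache[block, offset] = key[token_idx]

def gather_keys(block_table, seq_len, key_cache):
    """Read a sequence's keys back in logical order via its block table,
    as a paged attention kernel would when computing attention scores."""
    keys = []
    for pos in range(seq_len):
        block = block_table[pos // BLOCK_SIZE]
        offset = pos % BLOCK_SIZE
        keys.append(key_cache[block, offset])
    return np.stack(keys)

# A 6-token sequence stored in physical blocks 3 and 5 (non-contiguous).
block_table = [3, 5]
slot_mapping = [3 * BLOCK_SIZE + i for i in range(4)] + \
               [5 * BLOCK_SIZE + i for i in range(2)]
new_keys = np.arange(12, dtype=float).reshape(6, HEAD_DIM)

reshape_and_cache(new_keys, slot_mapping, key_cache)
gathered = gather_keys(block_table, 6, key_cache)
assert np.array_equal(gathered, new_keys)
```

Because sequences address the cache through block tables, their KV blocks need not be contiguous in memory; the real CUDA kernels apply the same indexing scheme per attention head, fused into the attention computation.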