"flash-attention-3" Awesome Lists
awesome-llm-inference
📚A curated list of Awesome LLM/VLM Inference Papers with codes: WINT8/4, FlashAttention, PagedAttention, MLA, Parallelism etc.
awesome-llm deepseek deepseek-r1 deepseek-v3 flash-attention flash-attention-3 flash-mla llm-inference minimax-01 mla
4,123 stars
287 forks
346 projects
Last updated: 16 Jun 2025