Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/turboderp/exllamav2
A fast inference library for running LLMs locally on modern consumer-class GPUs
- Host: GitHub
- URL: https://github.com/turboderp/exllamav2
- Owner: turboderp
- License: MIT
- Created: 2023-08-30T08:54:22.000Z (about 1 year ago)
- Default Branch: master
- Last Pushed: 2024-05-19T10:39:58.000Z (4 months ago)
- Last Synced: 2024-05-19T14:53:09.295Z (4 months ago)
- Language: Python
- Homepage:
- Size: 3.2 MB
- Stars: 3,022
- Watchers: 35
- Forks: 222
- Open Issues: 103
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
Awesome Lists containing this project
- awesome-local-llms - exllamav2 - A fast inference library for running LLMs locally on modern consumer-class GPUs | 3,112 | 228 | 110 | 39 | 26 | MIT License | 1 day, 10 hrs, 49 mins | (Open-Source Local LLM Projects)
- awesome-local-ai - ExLlamaV2 - A fast inference library for running LLMs locally on modern consumer-class GPUs | GPTQ/EXL2 | GPU | ❌ | Python/C++ | Text-Gen | (Inference Engine)
- StarryDivineSky - turboderp/exllamav2