Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/InternLM/lmdeploy
LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
- Host: GitHub
- URL: https://github.com/InternLM/lmdeploy
- Owner: InternLM
- License: apache-2.0
- Created: 2023-06-15T12:38:06.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-07-29T13:04:58.000Z (about 1 month ago)
- Last Synced: 2024-07-30T02:04:01.335Z (about 1 month ago)
- Topics: codellama, cuda-kernels, deepspeed, fastertransformer, internlm, llama, llama2, llama3, llm, llm-inference, turbomind
- Language: Python
- Homepage: https://lmdeploy.readthedocs.io/en/latest/
- Size: 4.44 MB
- Stars: 3,571
- Watchers: 32
- Forks: 319
- Open Issues: 277
Metadata Files:
- Readme: README.md
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE
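Since the entry describes LMDeploy as a serving toolkit, note that its `lmdeploy serve api_server` command exposes an OpenAI-compatible REST endpoint. A minimal client sketch using only the standard library (the port `23333` and model name are assumptions for illustration; adjust to your deployment):

```python
import json
import urllib.request

def build_chat_request(model, messages, base_url="http://localhost:23333"):
    """Build an OpenAI-style chat-completions request for an LMDeploy api_server.

    The base_url default matches LMDeploy's default serving port, but this is
    an assumption -- check your own `lmdeploy serve api_server` invocation.
    """
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example (assumes a server is already running with a chat model loaded):
req = build_chat_request(
    "internlm2-chat-7b",
    [{"role": "user", "content": "Hello"}],
)
# Sending would be: urllib.request.urlopen(req) -- skipped here since it
# requires a live server.
```

The request is plain OpenAI-compatible JSON, so existing OpenAI client libraries can also point at the same endpoint by overriding their base URL.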
Awesome Lists containing this project:
- awesome-local-llms - lmdeploy
- awesome-local-ai - LmDeploy - LMDeploy is a toolkit for compressing, deploying, and serving LLMs. | Pytorch / Turbomind | Both | ❌ | Python/C++ | Text-Gen | (Inference Engine)
- StarryDivineSky - InternLM/lmdeploy
- Awesome-LLM - LMDeploy - A high-throughput and low-latency inference and serving framework for LLMs and VLs (LLM Deployment)
- awesome-LLM-resourses - LMDeploy