Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/atanikan/llm-inference-service
This repo hosts different ways to run vLLM on ANL HPC systems
- Host: GitHub
- URL: https://github.com/atanikan/llm-inference-service
- Owner: atanikan
- License: gpl-3.0
- Created: 2023-09-20T23:57:29.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-05-22T19:08:11.000Z (6 months ago)
- Last Synced: 2024-05-22T20:00:10.386Z (6 months ago)
- Language: Jupyter Notebook
- Size: 1.26 MB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# LLM Inference Service
This repository covers the installation and usage of various LLM inference frameworks on Polaris. The frameworks currently tested are listed below; a minimal vLLM usage sketch follows the list.
* [llama.cpp](llama-cpp/polaris/README.md)
* [vLLM](vllm/polaris/README.md)
* [ollama](ollama/polaris/README.md)
* [deepspeed-inference](deepspeed-inference-mii)
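
To illustrate the kind of workflow the linked READMEs walk through, here is a minimal offline-inference sketch using vLLM's Python API. The model name, sampling settings, and prompt are placeholders, not the repo's exact configuration, and the Polaris-specific setup (modules, environments, GPU allocation) is described in the vLLM README linked above.

```python
# Minimal vLLM offline-inference sketch (illustrative; model and sampling
# settings are placeholders, not this repo's exact configuration).
from vllm import LLM, SamplingParams

# Load a model onto the available GPU(s); tensor_parallel_size can be raised
# to shard the model across multiple GPUs on a single node.
llm = LLM(model="meta-llama/Llama-2-7b-chat-hf", tensor_parallel_size=1)

sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=128)

prompts = ["Explain what an HPC job scheduler does in one sentence."]
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(output.outputs[0].text)
```

In practice this script would be launched from a batch job on a Polaris compute node with GPUs allocated; the per-framework READMEs in this repo give the scheduler and environment details.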