Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/perpendicularai/apiinashell
An API built with FastAPI that runs LlamaCpp as the backend for LLM inference.
fastapi generative-ai gguf llamacpp
Last synced: 7 days ago
JSON representation
- Host: GitHub
- URL: https://github.com/perpendicularai/apiinashell
- Owner: perpendicularai
- License: mit
- Created: 2024-05-17T15:21:24.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2024-07-19T07:30:45.000Z (3 months ago)
- Last Synced: 2024-09-27T04:20:10.896Z (7 days ago)
- Topics: fastapi, generative-ai, gguf, llamacpp
- Language: Python
- Homepage:
- Size: 45.9 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
README
# APIinaShell :octocat:
An API built with FastAPI that runs LlamaCpp as the backend for LLM inference. The script abstracts away the complexity of deploying a LlamaCpp API instance at scale.
## Dependencies
- llama_cpp_python
- fastapi
- requests
- uvicorn

Install with `pip`.
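The list above can be installed in one step (assuming the usual PyPI names; `llama_cpp_python` is published as `llama-cpp-python`):

```
# Install the runtime dependencies; llama-cpp-python builds the llama.cpp bindings on install.
pip install llama-cpp-python fastapi uvicorn requests
```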
## How to
To start using the API, first ensure that LlamaCpp is installed.
- Download a fresh GGUF converted model from [Huggingface](https://huggingface.co/models?sort=trending&search=gguf) and provide the path to it in the script.
- In a command prompt, run:
```
uvicorn apiinashell:app --reload --host 0.0.0.0 --port 8000 # Example host/port; this should start the API server at the configured address.
```
- Open a browser and go to the address shown in the command prompt, appending `/docs`. You should see the :dependabot: API interface (FastAPI's auto-generated Swagger UI).
- You can then enter a string to test it out. See the video below.
## Short Films
https://github.com/perpendicularai/APIinaShell/assets/146530480/87491a67-4691-4574-90ae-ed55d4126b58
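Besides the `/docs` page, a running server can also be exercised from a short script. Below is a minimal sketch using only the Python standard library; the host, port, `/generate` route, and `prompt` field are assumptions for illustration, not the project's actual schema:

```python
import json
from urllib import request

API_URL = "http://127.0.0.1:8000/generate"  # assumed host/port/route


def build_payload(prompt: str) -> bytes:
    # Encode the prompt as a JSON request body.
    return json.dumps({"prompt": prompt}).encode("utf-8")


def ask(prompt: str):
    # POST the prompt to the running APIinaShell server and return the parsed reply.
    req = request.Request(
        API_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())  # response shape depends on the server's schema
```

Call `ask("Hello")` while the uvicorn server is running to send a test prompt without opening the browser UI.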