{"id":15667894,"url":"https://github.com/rupurt/llm-http-api","last_synced_at":"2025-05-06T19:47:27.939Z","repository":{"id":212743471,"uuid":"732166280","full_name":"rupurt/llm-http-api","owner":"rupurt","description":"HTTP API for LLM with OpenAI compatibility","archived":false,"fork":false,"pushed_at":"2023-12-20T00:28:21.000Z","size":22,"stargazers_count":5,"open_issues_count":0,"forks_count":1,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-04-11T16:55:22.078Z","etag":null,"topics":["ai","llm","local-ai","openai","openai-api","python"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/rupurt.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-12-15T20:14:50.000Z","updated_at":"2024-10-02T09:54:19.000Z","dependencies_parsed_at":"2023-12-19T02:42:32.438Z","dependency_job_id":"8052c288-df5e-4cf3-a545-c2fd32bf093a","html_url":"https://github.com/rupurt/llm-http-api","commit_stats":null,"previous_names":["rupurt/llm-http-api"],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rupurt%2Fllm-http-api","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rupurt%2Fllm-http-api/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rupurt%2Fllm-http-api/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rupurt%2Fllm-http-api/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owner
s/rupurt","download_url":"https://codeload.github.com/rupurt/llm-http-api/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":252757774,"owners_count":21799729,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","llm","local-ai","openai","openai-api","python"],"created_at":"2024-10-03T14:05:46.017Z","updated_at":"2025-05-06T19:47:27.914Z","avatar_url":"https://github.com/rupurt.png","language":"Python","readme":"# llm-http-api\n\nHTTP API for [LLM](https://github.com/simonw/llm) with OpenAI compatibility\n\n## Install\n\n```shell\n\u003e pip install llm\n\u003e pip install llm-http-api\n```\n\n## Getting Started\n\n1. Follow the [directions](https://github.com/simonw/llm?tab=readme-ov-file#getting-started) from LLM\n2. Run the plugin `llm http-api`\n3. 
Visit the OpenAPI documentation [localhost:8080/docs](http://localhost:8080/docs)\n\n## Usage\n\n```shell\n\u003e llm http-api --help\nUsage: llm http-api [OPTIONS]\n\n  Run a FastAPI HTTP server with OpenAI compatibility\n\nOptions:\n  -h, --host TEXT         [default: 0.0.0.0]\n  -p, --port INTEGER      [default: 8080]\n  -l, --log-level TEXT    [default: info]\n  -r, --reload\n  -d, --reload-dirs LIST  [default: src]\n  --help                  Show this message and exit.\n```\n\n```shell\n\u003e curl http://localhost:8080/v1/embeddings -X POST -H \"Content-Type: application/json\" -d '{\n  \"input\": \"Hello world\",\n  \"model\": \"jina-embeddings-v2-small-en\"\n}'\n{\"object\":\"embedding\",\"embedding\":[-0.47561466693878174,-0.4471365511417389,...],\"index\":0}\n```\n\n## Supported OpenAI Endpoints\n\n### Models\n\n- [x] [`GET /v1/models`](./docs/endpoints/MODELS.md)\n- [x] [`GET /v1/models/{model}`](./docs/endpoints/MODELS.md)\n- [ ] [`DELETE /v1/models/{model}`](./docs/endpoints/MODELS.md)\n\n### Embeddings\n\n- [x] [`POST /v1/embeddings`](./docs/endpoints/EMBEDDINGS.md)\n\n### Chat\n\n- [ ] [`POST /v1/chat/completions`](./docs/endpoints/CHAT.md)\n\n## Unsupported OpenAI Endpoints\n\nA detailed list of unimplemented OpenAI endpoints can be found [here](./docs/endpoints/UNIMPLEMENTED.md)\n\n## Development\n\nThis repository manages the dev environment as a Nix flake and requires [Nix to be installed](https://github.com/DeterminateSystems/nix-installer)\n\n```shell\nnix develop -c $SHELL\n```\n\n```shell\nmake deps.install\nmake deps.install/test\n```\n\n```shell\nmake run/dev\n```\n\n```shell\nmake test\n```\n\n```shell\nmake coverage\n```\n\n```shell\nmake lint\n```\n\n```shell\nmake format\n```\n\n## Publish Package to PyPI\n\n```shell\nmake publish/pypi\n```\n\n## Local LLMs\n\n```shell\nmake llm.install/mlc\nmake llm.setup/mlc\nmake llm.mlc.download\nmake llm.mlc.download/code_llama-34b-python-q4f16\nmake 
llm.mlc.download/code_llama-34b-instruct-q0f16\nmake llm.mlc.download/code_llama-13b-q4f16\nmake llm.mlc.download/code_llama-7b-q4f16\nmake llm.mlc.download/wizard-coder-15b-q4f32\nmake llm.mlc.download/open_hermes-2.5-mistral-7b-q4f16\nmake llm.mlc.download/mistral-7b-instruct-q4f16\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frupurt%2Fllm-http-api","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Frupurt%2Fllm-http-api","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Frupurt%2Fllm-http-api/lists"}