{"id":14964493,"url":"https://github.com/av/harbor","last_synced_at":"2026-03-13T19:10:57.742Z","repository":{"id":250497903,"uuid":"834595923","full_name":"av/harbor","owner":"av","description":"One command brings a complete pre-wired LLM stack with hundreds of services to explore.","archived":false,"fork":false,"pushed_at":"2026-03-08T14:30:46.000Z","size":44610,"stargazers_count":2484,"open_issues_count":59,"forks_count":166,"subscribers_count":19,"default_branch":"main","last_synced_at":"2026-03-08T14:44:11.994Z","etag":null,"topics":["ai","automation","bash","cli","container","docker","docker-compose","homelab","llm","local","mcp","npm","package","pypi","safetensors","self-hosted","server","tool","tools"],"latest_commit_sha":null,"homepage":"https://github.com/av/harbor","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/av.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":"AGENTS.md","dco":null,"cla":null},"funding":{"github":null,"patreon":null,"open_collective":null,"ko_fi":"harbor","tidelift":null,"community_bridge":null,"liberapay":null,"issuehunt":null,"lfx_crowdfunding":null,"polar":null,"buy_me_a_coffee":null,"thanks_dev":null,"custom":null}},"created_at":"2024-07-27T18:46:59.000Z","updated_at":"2026-03-08T14:30:50.000Z","dependencies_parsed_at":"2024-09-12T06:11:30.414Z","dependency_job_id":"211c9d00-6830-4e65-a66d-f432bfbe05d2","html_url":"https://github.com/av/harbor","commit_stats":{"total_commits":355,"total_committ
ers":4,"mean_commits":88.75,"dds":"0.028169014084507005","last_synced_commit":"c43080d24ab5791b3642da18b478f66cca2c6838"},"previous_names":["av/harbor"],"tags_count":127,"template":false,"template_full_name":null,"purl":"pkg:github/av/harbor","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/av%2Fharbor","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/av%2Fharbor/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/av%2Fharbor/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/av%2Fharbor/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/av","download_url":"https://codeload.github.com/av/harbor/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/av%2Fharbor/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":30472992,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-13T17:15:31.527Z","status":"ssl_error","status_checked_at":"2026-03-13T17:15:22.394Z","response_time":60,"last_error":"SSL_read: unexpected eof while 
reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","automation","bash","cli","container","docker","docker-compose","homelab","llm","local","mcp","npm","package","pypi","safetensors","self-hosted","server","tool","tools"],"created_at":"2024-09-24T13:33:15.884Z","updated_at":"2026-03-13T19:10:57.732Z","avatar_url":"https://github.com/av.png","language":"TypeScript","readme":"\n\nhttps://github.com/user-attachments/assets/8a7705e1-6f0e-4374-8784-62b95816aebc\n\n\n\n[![GitHub Tag](https://img.shields.io/github/v/tag/av/harbor)](https://github.com/av/harbor/releases)\n[![NPM Version](https://img.shields.io/npm/v/%40avcodes%2Fharbor?labelColor=red\u0026color=white)](https://www.npmjs.com/package/@avcodes/harbor)\n[![PyPI - Version](https://img.shields.io/pypi/v/llm-harbor?labelColor=blue)](https://pypi.org/project/llm-harbor/)\n![GitHub repo size](https://img.shields.io/github/repo-size/av/harbor)\n![GitHub repo file or directory count](https://img.shields.io/github/directory-file-count/av/harbor?type=file\u0026extension=yml\u0026label=compose%20files\u0026color=orange)\n![GitHub language count](https://img.shields.io/github/languages/count/av/harbor)\n[![Visitors](https://api.visitorbadge.io/api/visitors?path=av%2Fharbor\u0026countColor=%23263759\u0026style=flat)](https://visitorbadge.io/status?path=av%2Fharbor)\n[![Discord](https://img.shields.io/badge/Discord-Harbor-blue?logo=discord\u0026logoColor=white)](https://discord.gg/8nDRphrhSF)\n![Harbor 
Ko-fi](https://img.shields.io/badge/Ko--fi-white?style=social\u0026logo=kofi)\n\n[![ask](https://img.shields.io/badge/ask-claude-252421?style=flat\u0026labelColor=555)](https://textclip.sh?ask=claude#c=TZTNjiM1EIDveQofga2N4LJCkQZpZ2AAaVZCzBNU29XdtSm7PLY7P2QjjYQUshyjleC6J8S8Bnd4h34S5E4GcbH6p7r8lb-q_g5To2l8__Hm7vvx-PQ6xsXXapeUbtRHzTQeTqIW5e7ujRni-OvPa2oGfqEi6PEFpm41m_39YfETJX1pNbTcjcfHNScOnfnCejceTu0gkgvapcGhTFGBbBmPj7truIX7vYlJWxa6X2M09La-O5xyweBQNJApQwgk4-HEoVAKVMxDGg8nrw0LmfH4p0Xbk_nn8ZOK86npORfz12_2XIB5dz0en27H49P9u69efT6bXV_tzvQwrTZGWIl4KB1DwabZYmTA2Cd1XAg855JQUobcCYYOliVhyK0mTymDeEdRdAvIqWYJtEFYaqPiwKsbBJPHDeRIFTJDxCSU9rdXu-kgQbhJZHssUJf6QJvzfX-hrF9MFw1rYNvFAhi2pefQ1Q1rbGAFz8shogMtlDxY9e124P391S4Tpk3oIFKKQhu2CI7bLUTB4GgDwoVqolpcO2QCjRwK2NaB9a6HFpvEFpAdJUCe2KrICiKeVijwdojbQgk0UugxuFyTyrawzZAoakS7hPBlgEalQORIwoEyPLiEocDyFcSkPpZWFdbUFI2gPnDElClBK7rmTGdA0TVoLFyJvabYs4X8IBNVF4snKAmp5SUIFgolW40ECbta5tl4XmM8n_tkssaBp4LeRvA2KohLMDU9MmBHodTuroLXhCsCp1Y4dNAkXWdK9ciGkEVLD2sOzrMINBRsD41qLvvZbFI9_vLh2lxkjIfTxf_F-nMp_5MkLu0XP77-1jz3zmKlbMlc1C7Yd-aCPR7_qOCLNzc_jIfTlHg22w0RnK4DxCohF0wFRLsMtCELuaeKObC4SRsMSeAhwXnY9tmch3m-E-gg75er5ymd7zIMIHuT5zuvjgQwdXlv-na-azk4cALZYtibc__OP_tvprdezsN5_i-Mx99ns282UZCDKT3nukUNNUWNp_m_)\n[![ask](https://img.shields.io/badge/ask-chatgpt-252421?style=flat\u0026labelColor=555)](https://textclip.sh?ask=chatgpt#c=TZTNjiM1EIDveQofga2N4LJCkQZpZ2AAaVZCzBNU29XdtSm7PLY7P2QjjYQUshyjleC6J8S8Bnd4h34S5E4GcbH6p7r8lb-q_g5To2l8__Hm7vvx-PQ6xsXXapeUbtRHzTQeTqIW5e7ujRni-OvPa2oGfqEi6PEFpm41m_39YfETJX1pNbTcjcfHNScOnfnCejceTu0gkgvapcGhTFGBbBmPj7truIX7vYlJWxa6X2M09La-O5xyweBQNJApQwgk4-HEoVAKVMxDGg8nrw0LmfH4p0Xbk_nn8ZOK86npORfz12_2XIB5dz0en27H49P9u69efT6bXV_tzvQwrTZGWIl4KB1DwabZYmTA2Cd1XAg855JQUobcCYYOliVhyK0mTymDeEdRdAvIqWYJtEFYaqPiwKsbBJPHDeRIFTJDxCSU9rdXu-kgQbhJZHssUJf6QJvzfX-hrF9MFw1rYNvFAhi2pefQ1Q1rbGAFz8shogMtlDxY9e124P391S4Tpk3oIFKKQhu2CI7bLUTB4GgDwoVqolpcO2QCjRwK2NaB9a6HFpvEFpAdJUCe2KrICiKeVijwdojbQgk0UugxuFyTyrawzZAoakS7hPBlgEalQORIwoEyPLiEocDyFcSkPpZWFdbUFI2gPnDE
lClBK7rmTGdA0TVoLFyJvabYs4X8IBNVF4snKAmp5SUIFgolW40ECbta5tl4XmM8n_tkssaBp4LeRvA2KohLMDU9MmBHodTuroLXhCsCp1Y4dNAkXWdK9ciGkEVLD2sOzrMINBRsD41qLvvZbFI9_vLh2lxkjIfTxf_F-nMp_5MkLu0XP77-1jz3zmKlbMlc1C7Yd-aCPR7_qOCLNzc_jIfTlHg22w0RnK4DxCohF0wFRLsMtCELuaeKObC4SRsMSeAhwXnY9tmch3m-E-gg75er5ymd7zIMIHuT5zuvjgQwdXlv-na-azk4cALZYtibc__OP_tvprdezsN5_i-Mx99ns282UZCDKT3nukUNNUWNp_m_)\n[![ask](https://img.shields.io/badge/ask-perplexity-252421?style=flat\u0026labelColor=555)](https://textclip.sh?ask=perplexity#c=TZTNjiM1EIDveQofga2N4LJCkQZpZ2AAaVZCzBNU29XdtSm7PLY7P2QjjYQUshyjleC6J8S8Bnd4h34S5E4GcbH6p7r8lb-q_g5To2l8__Hm7vvx-PQ6xsXXapeUbtRHzTQeTqIW5e7ujRni-OvPa2oGfqEi6PEFpm41m_39YfETJX1pNbTcjcfHNScOnfnCejceTu0gkgvapcGhTFGBbBmPj7truIX7vYlJWxa6X2M09La-O5xyweBQNJApQwgk4-HEoVAKVMxDGg8nrw0LmfH4p0Xbk_nn8ZOK86npORfz12_2XIB5dz0en27H49P9u69efT6bXV_tzvQwrTZGWIl4KB1DwabZYmTA2Cd1XAg855JQUobcCYYOliVhyK0mTymDeEdRdAvIqWYJtEFYaqPiwKsbBJPHDeRIFTJDxCSU9rdXu-kgQbhJZHssUJf6QJvzfX-hrF9MFw1rYNvFAhi2pefQ1Q1rbGAFz8shogMtlDxY9e124P391S4Tpk3oIFKKQhu2CI7bLUTB4GgDwoVqolpcO2QCjRwK2NaB9a6HFpvEFpAdJUCe2KrICiKeVijwdojbQgk0UugxuFyTyrawzZAoakS7hPBlgEalQORIwoEyPLiEocDyFcSkPpZWFdbUFI2gPnDElClBK7rmTGdA0TVoLFyJvabYs4X8IBNVF4snKAmp5SUIFgolW40ECbta5tl4XmM8n_tkssaBp4LeRvA2KohLMDU9MmBHodTuroLXhCsCp1Y4dNAkXWdK9ciGkEVLD2sOzrMINBRsD41qLvvZbFI9_vLh2lxkjIfTxf_F-nMp_5MkLu0XP77-1jz3zmKlbMlc1C7Yd-aCPR7_qOCLNzc_jIfTlHg22w0RnK4DxCohF0wFRLsMtCELuaeKObC4SRsMSeAhwXnY9tmch3m-E-gg75er5ymd7zIMIHuT5zuvjgQwdXlv-na-azk4cALZYtibc__OP_tvprdezsN5_i-Mx99ns282UZCDKT3nukUNNUWNp_m_)\n[![ask](https://img.shields.io/badge/ask-Harbor-252421?style=flat\u0026labelColor=555)](https://textclip.sh?failmsg=It%20appears%20that%20you%20do%20not%20have%20Harbor%20installed%2C%20or%20Open%20WebUI%20is%20not%20running%20on%20default%20port\u0026redirect=http%3A%2F%2Flocalhost%3A33801%3Fq%3D__TEXT__#c=TZTNjiM1EIDveQofga2N4LJCkQZpZ2AAaVZCzBNU29XdtSm7PLY7P2QjjYQUshyjleC6J8S8Bnd4h34S5E4GcbH6p7r8lb-q_g5To2l8__Hm7vvx-PQ6xsXXapeUbtRHzTQeTqIW5e7ujRni-OvPa2oGfqEi6PEFpm41m_39
YfETJX1pNbTcjcfHNScOnfnCejceTu0gkgvapcGhTFGBbBmPj7truIX7vYlJWxa6X2M09La-O5xyweBQNJApQwgk4-HEoVAKVMxDGg8nrw0LmfH4p0Xbk_nn8ZOK86npORfz12_2XIB5dz0en27H49P9u69efT6bXV_tzvQwrTZGWIl4KB1DwabZYmTA2Cd1XAg855JQUobcCYYOliVhyK0mTymDeEdRdAvIqWYJtEFYaqPiwKsbBJPHDeRIFTJDxCSU9rdXu-kgQbhJZHssUJf6QJvzfX-hrF9MFw1rYNvFAhi2pefQ1Q1rbGAFz8shogMtlDxY9e124P391S4Tpk3oIFKKQhu2CI7bLUTB4GgDwoVqolpcO2QCjRwK2NaB9a6HFpvEFpAdJUCe2KrICiKeVijwdojbQgk0UugxuFyTyrawzZAoakS7hPBlgEalQORIwoEyPLiEocDyFcSkPpZWFdbUFI2gPnDElClBK7rmTGdA0TVoLFyJvabYs4X8IBNVF4snKAmp5SUIFgolW40ECbta5tl4XmM8n_tkssaBp4LeRvA2KohLMDU9MmBHodTuroLXhCsCp1Y4dNAkXWdK9ciGkEVLD2sOzrMINBRsD41qLvvZbFI9_vLh2lxkjIfTxf_F-nMp_5MkLu0XP77-1jz3zmKlbMlc1C7Yd-aCPR7_qOCLNzc_jIfTlHg22w0RnK4DxCohF0wFRLsMtCELuaeKObC4SRsMSeAhwXnY9tmch3m-E-gg75er5ymd7zIMIHuT5zuvjgQwdXlv-na-azk4cALZYtibc__OP_tvprdezsN5_i-Mx99ns282UZCDKT3nukUNNUWNp_m_)\n\n\nSetup your local LLM stack effortlessly.\n\n```bash\n# Starts fully configured Open WebUI and Ollama\nharbor up\n\n# Now, Open WebUI can do Web RAG and TTS/STT\nharbor up searxng speaches\n```\n\nHarbor is a CLI and companion app that lets you spin up a complete local LLM stack—backends like Ollama, llama.cpp, or vLLM, frontends like Open WebUI, plus supporting services like SearXNG for web search, Speaches for voice chat, and ComfyUI for image generation—all pre-wired to work together with a single `harbor up` command. 
No manual setup: just pick the services you want and Harbor handles the Docker Compose orchestration, configuration, and cross-service connectivity so you can focus on actually using your models.\n\n![Screenshot of Harbor CLI and App together](https://github.com/av/harbor/wiki/harbor-app-3.png)\n\n\u003e **🔄 Migrating to Harbor 0.4.0?**\n\u003e\n\u003e Harbor 0.4.0 introduces a new directory structure where all service files are organized in a `services/` directory for better maintainability and a cleaner root directory.\n\u003e\n\u003e **Existing users:** Run `harbor migrate --dry-run` to preview changes, then `harbor migrate` to upgrade. See the [Migration Guide](docs/0.4.0-Migration-Guide.md) for details.\n\u003e\n\u003e **New users:** No action needed—everything is already organized!\n\n## Documentation\n\n- [Installing Harbor](https://github.com/av/harbor/wiki/1.0.-Installing-Harbor)\u003cbr/\u003e\n  Guides to install Harbor CLI and App\n- [**Migration Guide (v0.4.0)**](docs/0.4.0-Migration-Guide.md)\u003cbr/\u003e\n  **Important:** Guide for upgrading to Harbor 0.4.0 with the new directory structure\n- [Harbor User Guide](https://github.com/av/harbor/wiki/1.-Harbor-User-Guide)\u003cbr/\u003e\n  High-level overview of working with Harbor\n- [Harbor App](https://github.com/av/harbor/wiki/1.1-Harbor-App)\u003cbr/\u003e\n  Overview and manual for the Harbor companion application\n- [Harbor Services](https://github.com/av/harbor/wiki/2.-Services)\u003cbr/\u003e\n  Catalog of services available in Harbor\n- [Harbor CLI Reference](https://github.com/av/harbor/wiki/3.-Harbor-CLI-Reference)\u003cbr/\u003e\n  Read more about Harbor CLI commands and options, supported services, and the ways to configure them.\n- [Join our Discord](https://discord.gg/8nDRphrhSF)\u003cbr/\u003e\n  Get help, share your experience, and contribute to the project.\n\n## What can Harbor do?\n\n![Diagram outlining Harbor's service 
structure](https://raw.githubusercontent.com/wiki/av/harbor/harbor-arch-diag.png)\n\n\n#### ✦ Local LLMs\n\nRun LLMs and related services locally, with no or minimal configuration, typically in a single command or click.\n\n```bash\n# All backends are pre-connected to Open WebUI\nharbor up ollama\nharbor up llamacpp\nharbor up vllm\n\n# Set and remember args for llama.cpp\nharbor llamacpp args -ngl 32\n```\n\n#### Cutting Edge Inference\n\nHarbor supports most of the major inference engines as well as a few of the lesser-known ones.\n\n```bash\n# We sincerely hope you'll never try to run all of them at once\nharbor up vllm llamacpp tgi litellm tabbyapi aphrodite sglang ktransformers mistralrs airllm\n```\n\n#### Tool Use\n\nEnjoy the benefits of the MCP ecosystem and extend it to your use-cases.\n\n```bash\n# Manage MCPs with a convenient Web UI\nharbor up metamcp\n\n# Connect MCPs to Open WebUI\nharbor up metamcp mcpo\n```\n\n#### Generate Images\n\nHarbor includes ComfyUI + Flux + Open WebUI integration.\n\n```bash\n# Use FLUX in Open WebUI in one command\nharbor up comfyui\n```\n\n#### Local Web RAG / Deep Research\n\nHarbor includes [SearXNG](./docs/2.3.1-Satellite\u0026colon-SearXNG.md), which is pre-connected to many services out of the box: [Perplexica](./docs/2.3.2-Satellite\u0026colon-Perplexica.md), [ChatUI](./docs/2.1.4-Frontend\u0026colon-ChatUI.md), [Morphic](./docs/2.3.34-Satellite-Morphic.md), [Local Deep Research](./docs/2.3.45-Satellite-Local-Deep-Research.md) and more.\n\n```bash\n# SearXNG is pre-connected to Open WebUI\nharbor up searxng\n\n# And to many other services\nharbor up searxng chatui\nharbor up searxng morphic\nharbor up searxng perplexica\nharbor up searxng ldr\n```\n\n#### LLM Workflows\n\nHarbor includes multiple services for building LLM-based data and chat workflows: [Dify](./docs/2.3.3-Satellite\u0026colon-Dify.md), [LitLytics](./docs/2.3.21-Satellite\u0026colon-LitLytics.md), [n8n](./docs/2.3.23-Satellite\u0026colon-n8n.md), [Open 
WebUI Pipelines](./docs/2.3.25-Satellite\u0026colon-Open-WebUI-Pipelines.md), [FloWise](./docs/2.3.31-Satellite\u0026colon-Flowise.md), [LangFlow](./docs/2.3.32-Satellite\u0026colon-LangFlow.md)\n\n```bash\n# Use Dify in Open WebUI\nharbor up dify\n```\n\n#### Talk to your LLM\n\nSet up voice chats with your LLM in a single command: Open WebUI + Speaches.\n\n```bash\n# Speaches includes OpenAI-compatible STT and TTS\n# and is connected to Open WebUI out of the box\nharbor up speaches\n```\n\n#### Chat from the phone\n\nYou can access Harbor services from your phone with a QR code. Easily get links for local, LAN or Docker access.\n\n```bash\n# Print a QR code to open the service on your phone\nharbor qr\n# Print a link to open the service on your phone\nharbor url webui\n```\n\n#### Chat from anywhere\n\nHarbor includes a [built-in tunneling service](./docs/3.-Harbor-CLI-Reference.md#harbor-tunnel-service) to expose your Harbor to the internet.\n\n\u003e [!WARNING]\n\u003e Be careful when exposing your computer to the Internet; it's not safe.\n\n```bash\n# Expose default UI to the internet\nharbor tunnel\n\n# Expose a specific service to the internet\n# ⚠️ Make sure to configure authentication for the service\nharbor tunnel vllm\n\n# Harbor comes with traefik built-in and pre-configured\n# for all included services\nharbor up traefik\n```\n\n#### LLM Scripting\n\n[Harbor Boost](./docs/5.2.-Harbor-Boost.md) allows you to [easily script workflows](./docs/5.2.1.-Harbor-Boost-Custom-Modules.md) and interactions with downstream LLMs.\n\n```bash\n# Use Harbor Boost to script LLM workflows\nharbor up boost\n```\n\n#### Config Profiles\n\nSave and manage configuration profiles for different scenarios. 
For example - save [llama.cpp](./docs/2.2.2-Backend\u0026colon-llama.cpp.md) args for different models and contexts and switch between them easily.\n\n```bash\n# Save and use config profiles\nharbor profile save llama4\nharbor profile use default\n\n# Import profiles from a URL\nharbor profile use https://example.com/path/to/harbor-profile.env\n```\n\n#### Command History\n\nHarbor keeps a [local-only history of recent commands](./docs/3.-Harbor-CLI-Reference.md#harbor-history). Look up and re-run easily, standalone from the system shell history.\n\n```bash\n# Lookup recently used harbor commands\nharbor history\n```\n\n#### Eject\n\nReady to move to your own setup? Harbor [will give you](./docs/3.-Harbor-CLI-Reference.md#harbor-eject) a docker-compose file replicating your setup.\n\n```bash\n# Eject from Harbor into a standalone Docker Compose setup\n# Will export related services and variables into a standalone file.\nharbor eject searxng llamacpp \u003e docker-compose.harbor.yml\n```\n\n---\n\n## Services\n\n##### UIs\n[Open WebUI](https://github.com/av/harbor/wiki/2.1.1-Frontend:-Open-WebUI) ⦁︎\n[ComfyUI](https://github.com/av/harbor/wiki/2.1.2-Frontend:-ComfyUI) ⦁︎\n[LibreChat](https://github.com/av/harbor/wiki/2.1.3-Frontend:-LibreChat) ⦁︎\n[HuggingFace ChatUI](https://github.com/av/harbor/wiki/2.1.4-Frontend:-ChatUI) ⦁︎\n[Lobe Chat](https://github.com/av/harbor/wiki/2.1.5-Frontend:-Lobe-Chat) ⦁︎\n[Hollama](https://github.com/av/harbor/wiki/2.1.6-Frontend:-hollama) ⦁︎\n[parllama](https://github.com/av/harbor/wiki/2.1.7-Frontend:-parllama) ⦁︎\n[BionicGPT](https://github.com/av/harbor/wiki/2.1.8-Frontend:-BionicGPT) ⦁︎\n[AnythingLLM](https://github.com/av/harbor/wiki/2.1.9-Frontend:-AnythingLLM) ⦁︎\n[Chat Nio](https://github.com/av/harbor/wiki/2.1.10-Frontend:-Chat-Nio) ⦁︎\n[mikupad](https://github.com/av/harbor/wiki/2.1.11-Frontend:-Mikupad) ⦁︎\n[oterm](https://github.com/av/harbor/wiki/2.1.12-Frontend-oterm) 
⦁︎\n[omnichain](https://github.com/av/harbor/wiki/2.3.16-Satellite:-omnichain) ⦁︎\n[ol1](https://github.com/av/harbor/wiki/2.3.19-Satellite:-ol1)\n\n##### Backends\n[Ollama](https://github.com/av/harbor/wiki/2.2.1-Backend:-Ollama) ⦁︎\n[llama.cpp](https://github.com/av/harbor/wiki/2.2.2-Backend:-llama.cpp) ⦁︎\n[vLLM](https://github.com/av/harbor/wiki/2.2.3-Backend:-vLLM) ⦁︎\n[TabbyAPI](https://github.com/av/harbor/wiki/2.2.4-Backend:-TabbyAPI) ⦁︎\n[Aphrodite Engine](https://github.com/av/harbor/wiki/2.2.5-Backend:-Aphrodite-Engine) ⦁︎\n[mistral.rs](https://github.com/av/harbor/wiki/2.2.6-Backend:-mistral.rs) ⦁︎\n[openedai-speech](https://github.com/av/harbor/wiki/2.2.7-Backend:-openedai-speech) ⦁︎\n[Speaches](https://github.com/av/harbor/wiki/2.2.14-Backend:-Speaches) ⦁︎\n[Parler](https://github.com/av/harbor/wiki/2.2.8-Backend:-Parler) ⦁︎\n[text-generation-inference](https://github.com/av/harbor/wiki/2.2.9-Backend:-text-generation-inference) ⦁︎\n[LMDeploy](https://github.com/av/harbor/wiki/2.2.10-Backend:-lmdeploy) ⦁︎\n[AirLLM](https://github.com/av/harbor/wiki/2.2.11-Backend:-AirLLM) ⦁︎\n[SGLang](https://github.com/av/harbor/wiki/2.2.12-Backend:-SGLang) ⦁︎\n[KTransformers](https://github.com/av/harbor/wiki/2.2.13-Backend:-KTransformers) ⦁︎\n[Nexa SDK](https://github.com/av/harbor/wiki/2.2.15-Backend:-Nexa-SDK) ⦁︎\n[KoboldCpp](https://github.com/av/harbor/wiki/2.2.16-Backend:-KoboldCpp) ⦁︎\n[Modular MAX](https://github.com/av/harbor/wiki/2.2.17-Backend-Modular-MAX)\n\n##### Satellites\n[Harbor Bench](https://github.com/av/harbor/wiki/5.1.-Harbor-Bench) ⦁︎\n[Harbor Boost](https://github.com/av/harbor/wiki/5.2.-Harbor-Boost) ⦁︎\n[SearXNG](https://github.com/av/harbor/wiki/2.3.1-Satellite:-SearXNG) ⦁︎\n[Perplexica](https://github.com/av/harbor/wiki/2.3.2-Satellite:-Perplexica) ⦁︎\n[Dify](https://github.com/av/harbor/wiki/2.3.3-Satellite:-Dify) ⦁︎\n[Plandex](https://github.com/av/harbor/wiki/2.3.4-Satellite:-Plandex) 
⦁︎\n[LiteLLM](https://github.com/av/harbor/wiki/2.3.5-Satellite:-LiteLLM) ⦁︎\n[LangFuse](https://github.com/av/harbor/wiki/2.3.6-Satellite:-langfuse) ⦁︎\n[Open Interpreter](https://github.com/av/harbor/wiki/2.3.7-Satellite:-Open-Interpreter) ⦁\n︎[cloudflared](https://github.com/av/harbor/wiki/2.3.8-Satellite:-cloudflared) ⦁︎\n[cmdh](https://github.com/av/harbor/wiki/2.3.9-Satellite:-cmdh) ⦁︎\n[fabric](https://github.com/av/harbor/wiki/2.3.10-Satellite:-fabric) ⦁︎\n[txtai RAG](https://github.com/av/harbor/wiki/2.3.11-Satellite:-txtai-RAG) ⦁︎\n[TextGrad](https://github.com/av/harbor/wiki/2.3.12-Satellite:-TextGrad) ⦁︎\n[Aider](https://github.com/av/harbor/wiki/2.3.13-Satellite:-aider) ⦁︎\n[aichat](https://github.com/av/harbor/wiki/2.3.14-Satellite:-aichat) ⦁︎\n[autogpt](https://github.com/av/harbor/wiki/2.3.15-Satellite:-AutoGPT) ⦁︎\n[lm-evaluation-harness](https://github.com/av/harbor/wiki/2.3.17-Satellite:-lm-evaluation-harness) ⦁︎\n[JupyterLab](https://github.com/av/harbor/wiki/2.3.18-Satellite:-JupyterLab) ⦁︎\n[ol1](https://github.com/av/harbor/wiki/2.3.19-Satellite:-ol1) ⦁︎\n[OpenHands](https://github.com/av/harbor/wiki/2.3.20-Satellite:-OpenHands) ⦁︎\n[LitLytics](https://github.com/av/harbor/wiki/2.3.21-Satellite:-LitLytics) ⦁︎\n[Repopack](https://github.com/av/harbor/wiki/2.3.22-Satellite:-Repopack) ⦁︎\n[n8n](https://github.com/av/harbor/wiki/2.3.23-Satellite:-n8n) ⦁︎\n[Bolt.new](https://github.com/av/harbor/wiki/2.3.24-Satellite:-Bolt.new) ⦁︎\n[Open WebUI Pipelines](https://github.com/av/harbor/wiki/2.3.25-Satellite:-Open-WebUI-Pipelines) ⦁︎\n[Qdrant](https://github.com/av/harbor/wiki/2.3.26-Satellite:-Qdrant) ⦁︎\n[K6](https://github.com/av/harbor/wiki/2.3.27-Satellite:-K6) ⦁︎\n[Promptfoo](https://github.com/av/harbor/wiki/2.3.28-Satellite:-Promptfoo) ⦁︎\n[Webtop](https://github.com/av/harbor/wiki/2.3.29-Satellite:-Webtop) ⦁︎\n[OmniParser](https://github.com/av/harbor/wiki/2.3.30-Satellite:-OmniParser) 
⦁︎\n[Flowise](https://github.com/av/harbor/wiki/2.3.31-Satellite:-Flowise) ⦁︎\n[Langflow](https://github.com/av/harbor/wiki/2.3.32-Satellite:-LangFlow) ⦁︎\n[OptiLLM](https://github.com/av/harbor/wiki/2.3.33-Satellite:-OptiLLM) ⦁︎\n[Morphic](https://github.com/av/harbor/wiki/2.3.34-Satellite-Morphic) ⦁︎\n[SQL Chat](https://github.com/av/harbor/wiki/2.3.35-Satellite-SQL-Chat) ⦁︎\n[gptme](https://github.com/av/harbor/wiki/2.3.36-Satellite-gptme) ⦁︎\n[traefik](https://github.com/av/harbor/wiki/2.3.37-Satellite-traefik) ⦁︎\n[Latent Scope](https://github.com/av/harbor/wiki/2.3.38-Satellite-Latent-Scope) ⦁︎\n[RAGLite](https://github.com/av/harbor/wiki/2.3.39-Satellite-RAGLite) ⦁︎\n[llama-swap](https://github.com/av/harbor/wiki/2.3.40-Satellite-llamaswap) ⦁︎\n[LibreTranslate](https://github.com/av/harbor/wiki/2.3.41-Satellite-LibreTranslate) ⦁︎\n[MetaMCP](https://github.com/av/harbor/wiki/2.3.42-Satellite-MetaMCP) ⦁︎\n[mcpo](https://github.com/av/harbor/wiki/2.3.43-Satellite-mcpo) ⦁︎\n[SuperGateway](https://github.com/av/harbor/wiki/2.3.44-Satellite-supergateway) ⦁︎\n[Local Deep Research](https://github.com/av/harbor/wiki/2.3.45-Satellite-Local-Deep-Research) ⦁︎\n[LocalAI](https://github.com/av/harbor/wiki/2.3.46-Satellite-LocalAI) ⦁︎\n[AgentZero](https://github.com/av/harbor/wiki/2.3.47-Satellite-Agent-Zero) ⦁︎\n[Airweave](https://github.com/av/harbor/wiki/2.3.48-Satellite-Airweave) ⦁︎\n[Docling](https://github.com/av/harbor/wiki/2.3.49-Satellite-Docling) ⦁︎\n[Browser Use](https://github.com/av/harbor/wiki/2.3.50-Satellite-Browser-Use) ⦁︎\n[Unsloth](https://github.com/av/harbor/wiki/2.3.51-Satellite-Unsloth) ⦁︎\n[Windmill](https://github.com/av/harbor/wiki/2.3.52-Satellite-Windmill)\n\n\nSee [services documentation](https://github.com/av/harbor/wiki/2.-Services) for a brief overview of each.\n\n## CLI Tour\n\n```bash\n# Run Harbor with default services:\n# Open WebUI and Ollama\nharbor up\n\n# Run Harbor with additional services\n# Running SearXNG automatically enables 
Web RAG in Open WebUI\nharbor up searxng\n\n# Speaches includes OpenAI-compatible STT and TTS\n# and is connected to Open WebUI out of the box\nharbor up speaches\n\n# Run additional/alternative LLM Inference backends\n# Open WebUI is automatically connected to them.\nharbor up llamacpp tgi litellm vllm tabbyapi aphrodite sglang ktransformers\n\n# Run different Frontends\nharbor up librechat chatui bionicgpt hollama\n\n# Get a free quality boost with\n# built-in optimizing proxy\nharbor up boost\n\n# Use FLUX in Open WebUI in one command\nharbor up comfyui\n\n# Use custom models for supported backends\nharbor llamacpp model https://huggingface.co/user/repo/model.gguf\n\n# Access service CLIs without installing them\n# Caches are shared between services where possible\nharbor hf scan-cache\nharbor hf download google/gemma-2-2b-it\nharbor ollama list\n\n# Shortcut to HF Hub to find the models\nharbor hf find gguf gemma-2\n# Use HFDownloader and official HF CLI to download models\nharbor hf dl -m google/gemma-2-2b-it -c 10 -s ./hf\nharbor hf download google/gemma-2-2b-it\n\n# Where possible, cache is shared between the services\nharbor tgi model google/gemma-2-2b-it\nharbor vllm model google/gemma-2-2b-it\nharbor aphrodite model google/gemma-2-2b-it\nharbor tabbyapi model google/gemma-2-2b-it-exl2\nharbor mistralrs model google/gemma-2-2b-it\nharbor opint model google/gemma-2-2b-it\nharbor sglang model google/gemma-2-2b-it\n\n# Convenience tools for docker setup\nharbor logs llamacpp\nharbor exec llamacpp ./scripts/llama-bench --help\nharbor shell vllm\n\n# Tell your shell exactly what you think about it\nharbor opint\nharbor aider\nharbor aichat\nharbor cmdh\n\n# Use fabric to LLM-ify your Linux pipes\ncat ./file.md | harbor fabric --pattern extract_extraordinary_claims | grep \"LK99\"\n\n# Open services from the CLI\nharbor open webui\nharbor open llamacpp\n# Print yourself a QR to quickly open the\n# service on your phone\nharbor qr\n# Feeling adventurous? 
Expose your Harbor\n# to the internet\nharbor tunnel\n\n# Config management\nharbor config list\nharbor config set webui.host.port 8080\n\n# Create and manage config profiles\nharbor profile save l370b\nharbor profile use default\n# Import profile from a URL\nharbor profile use https://example.com/path/to/harbor-profile.env\n\n# Lookup recently used harbor commands\nharbor history\n\n# Eject from Harbor into a standalone Docker Compose setup\n# Will export related services and variables into a standalone file.\nharbor eject searxng llamacpp \u003e docker-compose.harbor.yml\n\n# Run a built-in LLM benchmark with\n# your own tasks\nharbor bench run\n\n# Gimmick/Fun Area\n\n# Argument scrambling, below commands are all the same as above\n# Harbor doesn't care if it's \"vllm model\" or \"model vllm\", it'll\n# figure it out.\nharbor model vllm\nharbor vllm model\n\nharbor config get webui.name\nharbor get config webui_name\n\nharbor tabbyapi shell\nharbor shell tabbyapi\n\n# 50% gimmick, 50% useful\n# Ask harbor about itself\nharbor how to ping ollama container from the webui?\n```\n\n## Harbor App Demo\n\nhttps://github.com/user-attachments/assets/a5cd2ef1-3208-400a-8866-7abd85808503\n\nIn the demo, Harbor App is used to launch a default stack with [Ollama](./2.2.1-Backend:-Ollama) and [Open WebUI](./2.1.1-Frontend:-Open-WebUI) services. Later, [SearXNG](./2.3.1-Satellite:-SearXNG) is also started, and WebUI can connect to it for the Web RAG right out of the box. After that, [Harbor Boost](./5.2.-Harbor-Boost) is also started and connected to the WebUI automatically to induce more creative outputs. 
As a final step, Harbor config is adjusted in the App for the [`klmbr`](./5.2.-Harbor-Boost#klmbr---boost-llm-creativity) module in the [Harbor Boost](./5.2.-Harbor-Boost), which makes the output unparsable for the LLM (yet still understandable for humans).\n\n## Why?\n\n- If you're comfortable with Docker and Linux administration - you likely don't need Harbor to manage your local LLM environment. However, while growing it - you're also likely to eventually arrive at a similar solution. I know this for a fact, since that's exactly how Harbor came to be.\n- Harbor is not designed as a deployment solution, but rather as a helper for the local LLM development environment. It's a good starting point for experimenting with LLMs and related services.\n- Workflow/setup centralisation - you can be sure where to find a specific config or service, logs, data and configuration files.\n- Convenience factor - single CLI with a lot of services and features, accessible from anywhere on your host.\n\n## Supporters\n\n![@av's wife](https://ui-avatars.com/api/?size=32\u0026name=KN\u0026rounded=true\u0026background=ffaaaa\u0026color=ff4444)\n![@burnth3heretic](https://ui-avatars.com/api/?size=32\u0026name=BTH\u0026rounded=true)\n![@vood](https://ui-avatars.com/api/?size=32\u0026name=VD\u0026rounded=true)\n![@anonymous](https://ui-avatars.com/api/?size=32\u0026name=🥷\u0026rounded=true\u0026background=bada55)\n","funding_links":["https://ko-fi.com/harbor"],"categories":["Container Operations","Contributing","TypeScript","Software","📚 Projects (1974 total)","Other","Recently Updated","Building","MCP Clients","SDKs \u0026 Libraries"],"sub_categories":["Container Composition","Generative Artificial Intelligence (GenAI)","Tools \u0026 Libraries","Music","[Oct 28, 2024](/content/2024/10/28/README.md)","Tools","CLI 
Tools","Emacs"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fav%2Fharbor","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fav%2Fharbor","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fav%2Fharbor/lists"}