{"id":13422972,"url":"https://github.com/serge-chat/serge","last_synced_at":"2025-04-23T22:59:20.759Z","repository":{"id":144399214,"uuid":"615973283","full_name":"serge-chat/serge","owner":"serge-chat","description":"A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.","archived":false,"fork":false,"pushed_at":"2025-04-23T09:03:59.000Z","size":3573,"stargazers_count":5719,"open_issues_count":29,"forks_count":402,"subscribers_count":48,"default_branch":"main","last_synced_at":"2025-04-23T22:59:11.043Z","etag":null,"topics":["alpaca","docker","fastapi","llama","llamacpp","nginx","python","svelte","sveltekit","tailwindcss","web"],"latest_commit_sha":null,"homepage":"https://serge.chat","language":"Svelte","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/serge-chat.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE-APACHE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-03-19T08:33:29.000Z","updated_at":"2025-04-23T09:36:42.000Z","dependencies_parsed_at":"2023-09-27T15:23:37.725Z","dependency_job_id":"a2411299-50cb-43a4-b2f5-bc96c97f6561","html_url":"https://github.com/serge-chat/serge","commit_stats":{"total_commits":1137,"total_committers":36,"mean_commits":"31.583333333333332","dds":"0.31398416886543534","last_synced_commit":"ac798505ee7ea29dc9533fdde8f3d63738573d4f"},"previous_names":["serge-chat/serge","nsarrazin/serge"],"tags_count":28,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/serge-chat%2Fserge","tags_url":"https://repos.ecos
yste.ms/api/v1/hosts/GitHub/repositories/serge-chat%2Fserge/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/serge-chat%2Fserge/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/serge-chat%2Fserge/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/serge-chat","download_url":"https://codeload.github.com/serge-chat/serge/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250528728,"owners_count":21445514,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["alpaca","docker","fastapi","llama","llamacpp","nginx","python","svelte","sveltekit","tailwindcss","web"],"created_at":"2024-07-30T23:01:01.147Z","updated_at":"2025-04-23T22:59:20.724Z","avatar_url":"https://github.com/serge-chat.png","language":"Svelte","readme":"# Serge - LLaMA made easy 🦙\n\n![License](https://img.shields.io/github/license/serge-chat/serge)\n[![Discord](https://img.shields.io/discord/1088427963801948201?label=Discord)](https://discord.gg/62Hc6FEYQH)\n\nSerge is a chat interface crafted with [llama.cpp](https://github.com/ggerganov/llama.cpp) for running LLM models. 
No API keys, entirely self-hosted!\n\n- 🌐 **SvelteKit** frontend\n- 💾 **[Redis](https://github.com/redis/redis)** for storing chat history \u0026 parameters\n- ⚙️ **FastAPI + LangChain** for the API, wrapping calls to [llama.cpp](https://github.com/ggerganov/llama.cpp) using the [Python bindings](https://github.com/abetlen/llama-cpp-python)\n\n🎥 Demo:\n\n[demo.webm](https://user-images.githubusercontent.com/25119303/226897188-914a6662-8c26-472c-96bd-f51fc020abf6.webm)\n\n## ⚡️ Quick start\n\n🐳 Docker:\n\n```bash\ndocker run -d \\\n    --name serge \\\n    -v weights:/usr/src/app/weights \\\n    -v datadb:/data/db/ \\\n    -p 8008:8008 \\\n    ghcr.io/serge-chat/serge:latest\n```\n\n🐙 Docker Compose:\n\n```yaml\nservices:\n  serge:\n    image: ghcr.io/serge-chat/serge:latest\n    container_name: serge\n    restart: unless-stopped\n    ports:\n      - 8008:8008\n    volumes:\n      - weights:/usr/src/app/weights\n      - datadb:/data/db/\n\nvolumes:\n  weights:\n  datadb:\n```\n\nThen, just visit http://localhost:8008. You can find the API documentation at http://localhost:8008/api/docs\n\n### 🌍 Environment Variables\n\nThe following environment variables are available:\n\n| Variable Name         | Description                                             | Default Value                        |\n|-----------------------|---------------------------------------------------------|--------------------------------------|\n| `SERGE_DATABASE_URL`  | Database connection string                              | `sqlite:////data/db/sql_app.db`      |\n| `SERGE_JWT_SECRET`    | Key for auth token encryption. 
Use a random string      | `uF7FGN5uzfGdFiPzR`                  |\n| `SERGE_SESSION_EXPIRY`| Duration in minutes before a user must reauthenticate   | `60`                                 |\n| `NODE_ENV`            | Node.js running environment                             | `production`                         |\n\n## 🖥️ Windows\n\nEnsure you have Docker Desktop installed, WSL2 configured, and enough free RAM to run models.\n\n## ⚠️ Memory Usage\n\nLLaMA will crash if you don't have enough available memory for the model.\n\n## 💬 Support\n\nNeed help? Join our [Discord](https://discord.gg/62Hc6FEYQH).\n\n## 🧾 License\n\nCopyright [Nathan Sarrazin](https://github.com/nsarrazin) and [Contributors](https://github.com/serge-chat/serge/graphs/contributors). `Serge` is free and open-source software licensed under the [MIT License](https://github.com/serge-chat/serge/blob/main/LICENSE-MIT) and [Apache-2.0](https://github.com/serge-chat/serge/blob/main/LICENSE-APACHE).\n\n## 🤝 Contributing\n\nIf you discover a bug or have a feature idea, feel free to open an issue or PR.\n\nTo run Serge in development mode:\n\n```bash\ngit clone https://github.com/serge-chat/serge.git\ncd serge/\ndocker compose -f docker-compose.dev.yml up --build\n```\n\nThe development environment will accept a Python debugger session on port 5678. 
Example launch.json for VSCode:\n\n```json\n{\n    \"version\": \"0.2.0\",\n    \"configurations\": [\n        {\n            \"name\": \"Remote Debug\",\n            \"type\": \"python\",\n            \"request\": \"attach\",\n            \"connect\": {\n                \"host\": \"localhost\",\n                \"port\": 5678\n            },\n            \"pathMappings\": [\n                {\n                    \"localRoot\": \"${workspaceFolder}/api\",\n                    \"remoteRoot\": \"/usr/src/app/api/\"\n                }\n            ],\n            \"justMyCode\": false\n        }\n    ]\n}\n```\n","funding_links":[],"categories":["[TavernAI/TavernAI](https://github.com/TavernAI/TavernAI)","Svelte","Python","A01_文本生成_文本对话","LLM Deployment","📚 Contents","HarmonyOS","Apps","Tools for Self-Hosting","LLM Applications","web","LLMs ChatUI"],"sub_categories":["数据","大语言对话模型及数据","Windows Manager","AI","Running Locally on Windows, MacOS, and Linux:"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fserge-chat%2Fserge","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fserge-chat%2Fserge","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fserge-chat%2Fserge/lists"}