{"id":13563760,"url":"https://github.com/bionic-gpt/bionic-gpt","last_synced_at":"2026-01-30T12:02:05.581Z","repository":{"id":179396676,"uuid":"663396489","full_name":"bionic-gpt/bionic-gpt","owner":"bionic-gpt","description":"Bionic is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality","archived":false,"fork":false,"pushed_at":"2026-01-24T16:04:35.000Z","size":168397,"stargazers_count":2289,"open_issues_count":18,"forks_count":237,"subscribers_count":20,"default_branch":"main","last_synced_at":"2026-01-25T05:41:06.850Z","etag":null,"topics":["architecture","full-stack","llmops","llms","rust"],"latest_commit_sha":null,"homepage":"https://bionic-gpt.com","language":"Rust","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/bionic-gpt.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":"AGENTS.md","dco":null,"cla":null}},"created_at":"2023-07-07T07:45:24.000Z","updated_at":"2026-01-24T16:04:38.000Z","dependencies_parsed_at":null,"dependency_job_id":"4ef226a9-5ea2-437c-9ad2-72b69208a982","html_url":"https://github.com/bionic-gpt/bionic-gpt","commit_stats":null,"previous_names":["purton-tech/fine-tuna","purton-tech/bionicgpt","bionic-gpt/bionic-gpt"],"tags_count":415,"template":false,"template_full_name":null,"purl":"pkg:github/bionic-gpt/bionic-gpt","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bionic-gpt%2Fbionic-gpt","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/reposit
ories/bionic-gpt%2Fbionic-gpt/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bionic-gpt%2Fbionic-gpt/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bionic-gpt%2Fbionic-gpt/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/bionic-gpt","download_url":"https://codeload.github.com/bionic-gpt/bionic-gpt/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bionic-gpt%2Fbionic-gpt/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28912222,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-30T11:55:24.701Z","status":"ssl_error","status_checked_at":"2026-01-30T11:54:13.194Z","response_time":66,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["architecture","full-stack","llmops","llms","rust"],"created_at":"2024-08-01T13:01:23.014Z","updated_at":"2026-01-30T12:02:05.575Z","avatar_url":"https://github.com/bionic-gpt.png","language":"Rust","readme":"\u003ch1 align=\"center\"\u003eBionic\u003c/h1\u003e\n\u003cdiv align=\"center\"\u003e\n \u003cstrong\u003e\n   Bionic is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality\n \u003c/strong\u003e\n Bionic can run on your laptop or scale into the data center. 
\n\u003c/div\u003e\n\n\u003cbr /\u003e\n\n\u003cdiv align=\"center\"\u003e\n  \u003c!-- License --\u003e\n  \u003ca href=\"https://github.com/bionic-gpt/bionic-gpt#License\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/License-MIT-green.svg\" alt=\"License\"\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://github.com/bionic-gpt/bionic-gpt#License\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/License-Apache-green.svg\" alt=\"License\"\u003e\n  \u003c/a\u003e\n\u003c/div\u003e\n\n\u003cdiv align=\"center\"\u003e\n  \u003ch4\u003e\n    \u003ca href=\"https://bionic-gpt.com\"\u003e\n      Homepage\n    \u003c/a\u003e\n    |\n    \u003ca href=\"https://github.com/bionic-gpt/bionic-gpt/blob/main/CONTRIBUTING.md\"\u003e\n      Contributing\n    \u003c/a\u003e\n    |\n    \u003ca href=\"https://bionic-gpt.com/docs/\"\u003e\n      Documentation\n    \u003c/a\u003e\n  \u003c/h4\u003e\n\u003c/div\u003e\n\n\u003cbr /\u003e\n\n![Bionic Screenshot](crates/static-website/assets/landing-page/bionic-console.png \"Bionic Screenshot\")\n\n\u003c!-- Features --\u003e\n\n### Run Gen AI Locally in Seconds\n\nTry our [Docker Compose](https://bionic-gpt.com/docs/running-locally/docker-compose/) installation. 
Ideal for running AI locally and for small pilots.\n\n### Familiar Chat Experience\n\n- 🖥️ Intuitive Interface: Our chat interface is inspired by ChatGPT to ensure a user-friendly experience.\n- 🌈 Theme Customization: The theme for Bionic is completely customizable, allowing you to brand Bionic as you like.\n- ⚡ Ultra Fast UI: Enjoy fast and responsive performance from our Rust-based UI.\n- 📜 Chat History: Effortlessly access and manage your conversation history.\n\n### AI Assistants (Retrieval Augmented Generation)\n\n- 🤖 AI Assistants: Users can create assistants that work with their own data to enhance the AI.\n- 🗨️ Share Assistants with Team Members: Generate and share assistants seamlessly between users, enhancing collaboration and communication.\n- 📋 Agentic RAG Pipelines: Assistants are full-scale, enterprise-ready Agentic RAG pipelines that can be launched in minutes.\n- 📑 Any Documents: 80% of enterprise data exists in difficult-to-use formats like HTML, PDF, CSV, PNG, PPTX, and more. 
We support all of them.\n- 💾 No Code: Configure embeddings engine and chunking algorithms all through our UI.\n- 🗨️ System Prompts: Configure system prompts to get the LLM to reply in the way you want.\n\n### Teams\n\n- 👫 Teams: Your company is made up of Teams of people and Bionic utilises this setup for maximum effect.\n- 👫 Invite Team Members: Teams can self-manage in a controlled environment.\n- 🙋 Manage Teams: Manage who has access to Bionic with your SSO system.\n- 👬 Virtual Teams: Create teams within teams.\n- 🚠 Switch Teams: Switch between teams whilst still keeping data isolated.\n- 🚓 RBAC: Use your SSO system to configure which features users have access to.\n\n### Defence in Depth Security\n\n- 👮 SAST: Static Application Security Testing - Our CI/CD pipeline runs SAST so we can identify risks before the code is built.\n- 📢 Authorization RLS - We use Row Level Security in Postgres as another check to ensure data is not leaked between unauthorized users.\n- 🚔 CSP: Our Content Security Policy is at the highest level and stops all manner of security threats.\n- 🐳 Minimal containers: We build containers from `scratch` whenever possible to limit supply chain attacks.\n- ⏳ Non-root containers: We run containers as non-root to limit horizontal movement during an attack.\n- 👮 Audit Trail: See who did what and when.\n- ⏰ Postgres Roles: We run the minimum level of permissions for our Postgres connections.\n- 📣 SIEM integration: Integrate with your SIEM system for threat detection and investigation.\n- ⌛ Resistant to timing attacks (API keys): Coming soon.\n- 📭 SSO: We didn't build our own authentication but use industry-leading and secure open source IAM systems.\n\n### Observability and Reporting\n\n- 📈 Observability API: Compatible with Prometheus for measuring load and usage.\n- 🤖 Dashboards: Create dashboards with Grafana for an overview of your whole system.\n- 📚 Monitor Chats: All questions and responses are recorded and available in the Postgres 
database.\n\n### Token Usage Limits and Controls\n\n- 📈 Fairly share resources: Without token limits it's easy for your models to become overloaded.\n- 🔒 Reverse Proxy: All models are protected with our reverse proxy that allows you to set limits and ensure fair usage across your users.\n- 👮 Role Based: Apply token usage limits based on a user's role from your IAM system.\n\n### Turn AI Assistants into APIs\n\n- 🔐 Assistants API: Any assistant you create can easily be turned into an OpenAI-compatible API.\n- 🔑 Key Management: Users can create API keys for assistants they have access to.\n- 🔏 Throttling limits: All API keys follow the user's throttling limits, ensuring fair access to the models.\n\n\n### Manage Data Governance with GuardRails\n\n- 📁 Batch Guardrails: Apply rules to documents uploaded by our batch data pipeline.\n- 🏅 Streaming Guardrails: LLMs deliver results in streams; we can apply rules in real time as the stream flies by.\n- 👾 Prompt injection: We can guard against prompt injection attacks as well as many others.\n\n\n### Local or Remote Large Language Models\n\n- 🤖 Full support for open source models running locally or in your data center.\n- 🌟 Multiple Model Support: Install and manage as many models as you want.\n- 👾 Easy Switch: Seamlessly switch between different chat models for diverse interactions.\n- ⚙️ Multi-Model Conversations: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. 
Enhance your experience by leveraging a diverse set of models in parallel.\n\n### Role Based Access Control\n\n- ⚠️ Configurable UI: Grant or deny users access to certain features based on the roles you give them in your IAM system.\n- 🚦 With limits: Apply token usage limits based on a user's role.\n- 🎫 Fully secured: Rules are applied in our server and, for defence in depth, enforced one more time with Postgres RLS.\n\n### Data Integrations\n\n- 📤 100s of Sources: With our Airbyte integration you can batch upload data from sources such as SharePoint, NFS, FTP, Kafka and more.\n- 📥 Batching: Run uploads once a day or every hour. Set it up the way you want.\n- 📈 Real time: Capture data in real time to ensure your models are always using the latest data.\n- 🚆 Manual Upload: Users have the ability to manually upload data so Agentic RAG pipelines can be set up in minutes.\n- 🍟 Datasets: Data is stored in datasets and our security ensures data can't leak between users or teams.\n- 📚 OCR: We can process documents using OCR to unlock even more data.\n\n### Deploy to Bare Metal or The Cloud\n\n- 🚀 Effortless Setup: Install seamlessly using Kubernetes (k3s, Docker Desktop or the cloud) for a hassle-free experience.\n- 🌟 Continuous Updates: We are committed to improving Bionic with regular updates and new features.\n\n\u003c!-- Try it out --\u003e\n## Try it out\n\nFollow [our guide](https://bionic-gpt.com/docs/) to run Bionic in production.\n\n### Configuration\n\nSet the `APP_BASE_URL` environment variable to the public URL of your Bionic server. 
This value is used when constructing OAuth2 callback URLs.\n\n## Architecture\n\n\n```mermaid\nflowchart LR\n    %% External actors\n    User((User))\n    IdP[\"External\u003cbr/\u003eIdentity Provider\"]\n    LLM[\"LLM Provider\u003cbr/\u003eOllama / OpenAI / Anthropic\"]\n    S3[\"S3-Compatible\u003cbr/\u003eObject Storage\"]\n\n    %% Kubernetes boundary\n    subgraph K8s[\"Kubernetes\u003cbr/\u003eEKS · AKS · GKE · k3s\"]\n        direction LR\n\n        %% Ingress \u0026 Auth\n        Nginx[\"Nginx Router\"]\n        OAuth2[\"OAuth2 Proxy\"]\n\n        %% Core API\n        RustServer[\"High Performance\u003cbr/\u003eRust Web Server\"]\n\n        %% Engines\n        RAG[\"High Performance\u003cbr/\u003eRAG Engine\"]\n        DocEngine[\"Rust Document\u003cbr/\u003eEngine\"]\n\n        %% Database\n        Postgres[\"Postgres\u003cbr/\u003eRelational DB \u0026 PgVector\"]\n    end\n\n    %% Request flow\n    User --\u003e Nginx --\u003e OAuth2 --\u003e RustServer\n\n    %% Authentication flow\n    OAuth2 --\u003e IdP\n    IdP --\u003e OAuth2\n\n    %% Control \u0026 inference\n    RustServer --\u003e RAG\n    RustServer --\u003e LLM\n    RustServer --\u003e Postgres\n\n    %% RAG orchestration\n    RAG --\u003e Postgres\n    RAG --\u003e DocEngine\n\n    %% Object storage access\n    RustServer --\u003e S3\n```\n\n## Enterprise\n\nFor companies that need better security, user management and professional support.\n\n[Talk to the founders](https://calendly.com/bionicgpt)\n\nThis covers: \n- ✅ **Help with integrations**\n- ✅ **Feature Prioritization**\n- ✅ **Custom Integrations**\n- ✅ **LTS (Long Term Support) Versions**\n- ✅ **Professional Support**\n- ✅ **Custom SLAs**\n- ✅ **Secure access with Single Sign-On**\n- ✅ **Continuous Batching**\n- ✅ **Data Pipelines**\n\n## Support / talk with founders\n\n- [Schedule a Chat 👋](https://calendly.com/bionicgpt)\n- [Connect on LinkedIn 💭](https://www.linkedin.com/in/kulbinderdio/)\n\n## Scales to thousands of users\n\nBionic 
is optimized to run on Kubernetes and provide Generative AI services to potentially thousands of users.\n\n![Bionic in Kubernetes](crates/static-website/assets/readme/k9s.png \"Bionic in Kubernetes\")\n","funding_links":[],"categories":["HarmonyOS","Rust","Others","\u003cimg src=\"./assets/message-square.svg\" width=\"16\" height=\"16\" style=\"vertical-align: middle;\"\u003e Frontends","Other","Web \u0026 Desktop UIs","A01_文本生成_文本对话","architecture","\u003ca name=\"Rust\"\u003e\u003c/a\u003eRust","LLMs ChatUI"],"sub_categories":["Windows Manager","Other sdk/libraries","大语言对话模型及数据"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbionic-gpt%2Fbionic-gpt","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbionic-gpt%2Fbionic-gpt","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbionic-gpt%2Fbionic-gpt/lists"}