{"id":22889295,"url":"https://github.com/dmayboroda/minima","last_synced_at":"2025-05-15T09:06:31.083Z","repository":{"id":260852257,"uuid":"882486617","full_name":"dmayboroda/minima","owner":"dmayboroda","description":"On-premises conversational RAG with configurable containers","archived":false,"fork":false,"pushed_at":"2025-04-09T10:14:07.000Z","size":657,"stargazers_count":693,"open_issues_count":8,"forks_count":65,"subscribers_count":12,"default_branch":"main","last_synced_at":"2025-04-10T06:39:13.514Z","etag":null,"topics":["ai","claude","custom-gpts","docker","docker-compose","huggingface","langchain","mcp","model-context-protocol","ollama","qdrant","sentence-transformers"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mpl-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dmayboroda.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-11-02T22:44:56.000Z","updated_at":"2025-04-10T02:53:01.000Z","dependencies_parsed_at":"2024-11-03T04:16:42.071Z","dependency_job_id":"711f8e40-3034-481f-99b2-b943f0f01825","html_url":"https://github.com/dmayboroda/minima","commit_stats":null,"previous_names":["dmayboroda/minima"],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmayboroda%2Fminima","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmayboroda%2Fminima/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmayboroda%2Fminima/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hos
ts/GitHub/repositories/dmayboroda%2Fminima/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dmayboroda","download_url":"https://codeload.github.com/dmayboroda/minima/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254310515,"owners_count":22049469,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","claude","custom-gpts","docker","docker-compose","huggingface","langchain","mcp","model-context-protocol","ollama","qdrant","sentence-transformers"],"created_at":"2024-12-13T21:28:08.481Z","updated_at":"2025-05-15T09:06:31.054Z","avatar_url":"https://github.com/dmayboroda.png","language":"Python","readme":"\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://www.mnma.ai/\" target=\"blank\"\u003e\u003cimg src=\"assets/logo-full.svg\" width=\"300\" alt=\"MNMA Logo\" /\u003e\u003c/a\u003e\n\u003c/p\u003e\n\n**Minima** is an open-source, on-premises RAG system packaged as containers, with the ability to integrate with ChatGPT and MCP. \nMinima can also be used as a fully local RAG.\n\nMinima currently supports three modes:\n1. Isolated installation – Operate fully on-premises with containers, free from external dependencies such as ChatGPT or Claude. All neural networks (LLM, reranker, embedding) run on your cloud or PC, ensuring your data remains secure.\n\n2. Custom GPT – Query your local documents from the ChatGPT app or web via custom GPTs. The indexer runs on your cloud or local PC, while the primary LLM remains ChatGPT.\n\n3. Anthropic Claude – Use the Anthropic Claude app to query your local documents. 
The indexer operates on your local PC, while Anthropic Claude serves as the primary LLM.\n\n### Running as containers\n\n1. Create a .env file in the project’s root directory (where you’ll find env.sample) and copy all environment variables from env.sample into .env.\n\n2. Ensure your .env file includes the following variables:\n\u003cul\u003e\n   \u003cli\u003e LOCAL_FILES_PATH \u003c/li\u003e\n   \u003cli\u003e EMBEDDING_MODEL_ID \u003c/li\u003e\n   \u003cli\u003e EMBEDDING_SIZE \u003c/li\u003e\n   \u003cli\u003e OLLAMA_MODEL \u003c/li\u003e\n   \u003cli\u003e RERANKER_MODEL \u003c/li\u003e\n   \u003cli\u003e USER_ID – required for ChatGPT integration; just use your email \u003c/li\u003e\n   \u003cli\u003e PASSWORD – required for ChatGPT integration; just use any password \u003c/li\u003e\n\u003c/ul\u003e\n\n3. For a fully local installation, use: **docker compose -f docker-compose-ollama.yml --env-file .env up --build**.\n\n4. For a ChatGPT-enabled installation, use: **docker compose -f docker-compose-chatgpt.yml --env-file .env up --build**.\n\n5. For MCP integration (Anthropic Desktop app usage), use: **docker compose -f docker-compose-mcp.yml --env-file .env up --build**.\n\n6. For a ChatGPT-enabled installation, copy the OTP from the terminal where you launched Docker and use [Minima GPT](https://chatgpt.com/g/g-r1MNTSb0Q-minima-local-computer-search)  \n\n7. If you use Anthropic Claude, add the following to **/Library/Application\\ Support/Claude/claude_desktop_config.json**\n\n```json\n{\n    \"mcpServers\": {\n      \"minima\": {\n        \"command\": \"uv\",\n        \"args\": [\n          \"--directory\",\n          \"/path_to_cloned_minima_project/mcp-server\",\n          \"run\",\n          \"minima\"\n        ]\n      }\n    }\n  }\n```\n   \n8. To use the fully local installation, go to the electron directory (`cd electron`), then run `npm install` and `npm start` to launch the Minima Electron app.\n\n9. 
Ask anything, and you'll get answers based on the local files in the {LOCAL_FILES_PATH} folder.\n\nExplanation of Variables:\n\n**LOCAL_FILES_PATH**: Specify the root folder for indexing (on your cloud or local PC). Indexing is a recursive process, meaning all documents within subfolders of this root folder will also be indexed. Supported file types: .pdf, .xls, .docx, .txt, .md, .csv.\n\n**EMBEDDING_MODEL_ID**: Specify the embedding model to use. Currently, only Sentence Transformer models are supported. Testing has been done with sentence-transformers/all-mpnet-base-v2, but other Sentence Transformer models can be used.\n\n**EMBEDDING_SIZE**: Define the embedding dimension provided by the model, which is needed to configure Qdrant vector storage. Ensure this value matches the actual embedding size of the specified EMBEDDING_MODEL_ID.\n\n**OLLAMA_MODEL**: Set the Ollama model; use an ID available on the Ollama [site](https://ollama.com/search). Please use an LLM model here, not an embedding model.\n\n**RERANKER_MODEL**: Specify the reranker model. Currently, we have tested with BAAI rerankers. 
You can explore all available rerankers using this [link](https://huggingface.co/collections/BAAI/).\n\n**USER_ID**: Just use your email here; this is needed to authenticate the custom GPT to search your data.\n\n**PASSWORD**: Put any password here; this is used to create a Firebase account for the email specified above.\n\n\nExample of a .env file for on-premises/local usage:\n```\nLOCAL_FILES_PATH=/Users/davidmayboroda/Downloads/PDFs/\nEMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2\nEMBEDDING_SIZE=768\nOLLAMA_MODEL=qwen2:0.5b # must be an LLM model ID from the Ollama models page\nRERANKER_MODEL=BAAI/bge-reranker-base # choose any BAAI reranker model\n```\n\nTo use the chat UI, navigate to **http://localhost:3000**\n\nExample of a .env file for the Claude app:\n```\nLOCAL_FILES_PATH=/Users/davidmayboroda/Downloads/PDFs/\nEMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2\nEMBEDDING_SIZE=768\n```\nFor the Claude app, apply the changes to the claude_desktop_config.json file as outlined above.\n\nExample of a .env file for ChatGPT custom GPT usage:\n```\nLOCAL_FILES_PATH=/Users/davidmayboroda/Downloads/PDFs/\nEMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2\nEMBEDDING_SIZE=768\nUSER_ID=user@gmail.com # your real email\nPASSWORD=password # any password you want\n```\n\nYou can also run Minima using **run.sh**.\n\n### Installing via Smithery (MCP usage)\n\nTo install Minima for Claude Desktop automatically via [Smithery](https://smithery.ai/protocol/minima):\n\n```bash\nnpx -y @smithery/cli install minima --client claude\n```\n\n**For MCP usage, please make sure that your local machine's Python is \u003e=3.10 and 'uv' is installed.**\n\nMinima (https://github.com/dmayboroda/minima) is licensed under the Mozilla Public License v2.0 (MPLv2).\n","funding_links":[],"categories":["👥 Community Contributions","AI Memory \u0026 RAG","MCP 服务器精选列表","📚 Projects (1974 total)","Community Servers","HarmonyOS","Open Source 
Projects","Other Tools and Integrations","📦 Other","Tools for Self-Hosting","Table of Contents","Document Processing","MCP Servers","Containerised MCP Servers","Python","🗂️ Extensions by Category","MCP Servers \u0026 Protocol"],"sub_categories":["AI \u0026 Search","RAG","🧠 知识、记忆与 RAG","MCP Servers","Windows Manager","Knowledge Management","LLMs","AI Services","Knowledge \u0026 Memory","DevOps \u0026 Infrastructure","🧠 Knowledge Base"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdmayboroda%2Fminima","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdmayboroda%2Fminima","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdmayboroda%2Fminima/lists"}