{"id":31747747,"url":"https://github.com/indiecodermm/loc-gist","last_synced_at":"2025-10-09T13:48:13.178Z","repository":{"id":309949453,"uuid":"999133598","full_name":"IndieCoderMM/loc-gist","owner":"IndieCoderMM","description":"Lightweight AI agent to chat with documents using a local RAG pipeline","archived":false,"fork":false,"pushed_at":"2025-08-14T17:54:12.000Z","size":126,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2025-08-14T19:34:46.642Z","etag":null,"topics":["chromadb","langchain","ollama","rag-chatbot","ttkbootstrap"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/IndieCoderMM.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-06-09T19:40:47.000Z","updated_at":"2025-08-14T17:54:15.000Z","dependencies_parsed_at":"2025-08-14T19:34:52.409Z","dependency_job_id":"960f8fd6-b6ab-4fc7-93ba-f6436cced362","html_url":"https://github.com/IndieCoderMM/loc-gist","commit_stats":null,"previous_names":["indiecodermm/loc-gist"],"tags_count":null,"template":false,"template_full_name":null,"purl":"pkg:github/IndieCoderMM/loc-gist","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IndieCoderMM%2Floc-gist","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IndieCoderMM%2Floc-gist/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IndieCoderMM%2Floc-gist/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/G
itHub/repositories/IndieCoderMM%2Floc-gist/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/IndieCoderMM","download_url":"https://codeload.github.com/IndieCoderMM/loc-gist/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IndieCoderMM%2Floc-gist/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279001461,"owners_count":26083102,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-09T02:00:07.460Z","response_time":59,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["chromadb","langchain","ollama","rag-chatbot","ttkbootstrap"],"created_at":"2025-10-09T13:48:12.079Z","updated_at":"2025-10-09T13:48:13.162Z","avatar_url":"https://github.com/IndieCoderMM.png","language":"Python","readme":"# LocGist\n\n**LocGist** is an offline AI tool designed to help you quickly summarize and extract key information from your documents. 
It uses the Qwen 3 model to provide fast and efficient document processing without relying on external APIs.\n\n- [Tutorial](https://www.freecodecamp.org/news/build-a-local-ai/)\n\n![Screenshot](./screenshot.png)\n\n## Features\n\n- 🛡️ Privacy: No data leaves your system.\n\n- 💵 No Cost: Run locally without API fees.\n\n- 🖥️ Offline Capability: Process documents without internet access.\n\n- ⚙️ Customization: Swap in any LLM or embedding model and adapt the pipeline to your needs.\n\n## Getting Started\n\n### Ollama Setup 🦙\n\n**1. Install Ollama**\n- Windows: Download the installer from the Ollama website: https://ollama.com/download\n- Linux/Mac: Open a terminal and run `curl -fsSL https://ollama.com/install.sh | sh`\n\n**2. Verify Ollama Installation**\n- Open a new terminal window and run: `ollama --version`\n\n**3. Choose Your Qwen 3 Model**\n- Select a Qwen 3 model (e.g., qwen3:8b, qwen3:4b, qwen3:30b-a3b) based on your intended task and available hardware resources.\n- Consider the model's size, performance, and reasoning capabilities.\n\n**4. Pull and Run Qwen 3**\n- Pull the chosen Qwen 3 model with: `ollama pull \u003cmodel_tag\u003e`\n- *Interactive Mode*: `ollama run \u003cmodel_tag\u003e`\n- *Server Mode*:\n  - Start the Ollama server with: `ollama serve` (the model is loaded on demand when first requested)\n  - Access the model via API at `http://localhost:11434`\n\n### Python Setup 🐍\n\n1. Create a virtual environment to manage dependencies\n```bash\npython -m venv .venv\n```\n\n2. Activate the environment\n```bash\nsource .venv/bin/activate  # Linux/Mac\n.venv\\Scripts\\activate  # Windows\n```\n\n3. Install the necessary libraries using pip\n```bash\npip install langchain langchain-community langchain-core langchain-ollama chromadb pypdf ttkbootstrap\n```\n\n4. 
Run the app\n```bash\npython -m loc_gist\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Findiecodermm%2Floc-gist","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Findiecodermm%2Floc-gist","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Findiecodermm%2Floc-gist/lists"}