{"id":23960581,"url":"https://github.com/mahmoudmabrok/chatwithcode","last_synced_at":"2025-04-23T04:22:42.917Z","repository":{"id":271091186,"uuid":"912374257","full_name":"MahmoudMabrok/ChatWithCode","owner":"MahmoudMabrok","description":"Local chat with your local git based repository, built using Ollama, HuggingFaceEmbeddings, LangChain","archived":false,"fork":false,"pushed_at":"2025-01-14T21:01:02.000Z","size":1544,"stargazers_count":7,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-03-29T22:04:59.387Z","etag":null,"topics":["deepseek-coder","huggingfaceembeddings","langchain","llm","ollama"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"agpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/MahmoudMabrok.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2025-01-05T11:59:04.000Z","updated_at":"2025-01-14T21:01:06.000Z","dependencies_parsed_at":null,"dependency_job_id":"ebc6e0a4-e960-4338-aee3-cab043aee4a1","html_url":"https://github.com/MahmoudMabrok/ChatWithCode","commit_stats":null,"previous_names":["mahmoudmabrok/chatwithcode"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MahmoudMabrok%2FChatWithCode","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MahmoudMabrok%2FChatWithCode/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MahmoudMabrok%2FChatWithCode/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/h
osts/GitHub/repositories/MahmoudMabrok%2FChatWithCode/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/MahmoudMabrok","download_url":"https://codeload.github.com/MahmoudMabrok/ChatWithCode/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":250367497,"owners_count":21418908,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deepseek-coder","huggingfaceembeddings","langchain","llm","ollama"],"created_at":"2025-01-06T19:30:07.301Z","updated_at":"2025-04-23T04:22:42.898Z","avatar_url":"https://github.com/MahmoudMabrok.png","language":"Python","readme":"# Chat with Code \nChat locally with your local git-based repository, built using Ollama, HuggingFaceEmbeddings, and LangChain. \n\n\n## Introduction \nThis project provides a ready-to-use chat over a codebase that is already on your machine, so it is ready to run out of the box. To use it, update the knowledge base with your repo, then run it again. \n\n## Why this project? \nI needed the ability to chat with my private repo (suitable for companies that value privacy), so I developed this app. 
\n\n\n## Demo \nAs an example, I used this [repo](https://github.com/MahmoudMabrok/QuranyApp) as the knowledge base; the result: \n![Screenshot 2025-01-05 at 3 31 19 PM](https://github.com/user-attachments/assets/5341074a-a535-4ba7-a8ee-852744bfac22)\n\n\n## How to use \n- install **[Ollama](https://ollama.com)**\n- pull the `deepseek-coder-v2` model using `ollama pull deepseek-coder-v2:16b`\n- install Python 3.11\n- run `pip3 install -r requirements.txt`\n- run `python3.11 -m streamlit run app.py`\n\n## Update Knowledge base \nTo update the KB we need to chunk the codebase, create embeddings, and save them to **FAISS**.\n\nYou need to add the following snippet (assuming the codebase is a JS-based project): \n\n```python\n# imports for the snippet (module paths may differ across LangChain versions)\nfrom git import Repo\nfrom langchain_text_splitters import Language, RecursiveCharacterTextSplitter\n\n# path to local git repo\nrepo_path = \"./project/\"\nrepo = Repo(repo_path)\n\n# Extract text from the repository\nfile_texts = []\nfor file in repo.tree().traverse():\n    if file.type == \"blob\":\n        if file.name.endswith((\".js\", \".tsx\", \".jsx\")):  # Handle JavaScript files\n            file_content = file.data_stream.read().decode(\"utf-8\")\n\n            js_splitter = RecursiveCharacterTextSplitter.from_language(\n                language=Language.JS, chunk_size=200, chunk_overlap=0\n            )\n            chunks = js_splitter.create_documents([file_content])\n            file_texts.extend([chunk.page_content for chunk in chunks])\n\n\n# place this code after embeddings\nvectorstore = FAISS.from_texts(file_texts, embeddings)\n\nvectorstore.save_local(\"vector_store_index\")\n```\nThen, once the `vectorstore` is saved, you can comment out the lines above and use: \n```python\nvectorstore = FAISS.load_local(\"vector_store_index\", embeddings, allow_dangerous_deserialization=True)\n```\n\n## Inspiration\nI was inspired by many projects, but none did exactly what I wanted. 
\n- [chat_with_github](https://github.com/Shubhamsaboo/awesome-llm-apps/tree/main/chat_with_X_tutorials/chat_with_github)\n- [QA-Pilot](https://github.com/reid41/QA-Pilot)\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmahmoudmabrok%2Fchatwithcode","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmahmoudmabrok%2Fchatwithcode","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmahmoudmabrok%2Fchatwithcode/lists"}