https://github.com/mahmoudmabrok/chatwithcode
Local chat with your local git based repository, built using Ollama, HuggingFaceEmbeddings, LangChain
- Host: GitHub
- URL: https://github.com/mahmoudmabrok/chatwithcode
- Owner: MahmoudMabrok
- License: agpl-3.0
- Created: 2025-01-05T11:59:04.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-01-14T21:01:02.000Z (5 months ago)
- Last Synced: 2025-03-29T22:04:59.387Z (2 months ago)
- Topics: deepseek-coder, huggingfaceembeddings, langchain, llm, ollama
- Language: Python
- Homepage:
- Size: 1.47 MB
- Stars: 7
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Chat with Code
Local chat with your local Git-based repository, built using Ollama, HuggingFaceEmbeddings, and LangChain.

## Introduction
This project provides a ready-to-use chat over a codebase that is already on your machine, so it can be run and used immediately. To use it, update the knowledge base with your repo, then run the app again.

## Why this project?
I needed the ability to chat with my private repositories (useful for companies that value privacy), so I developed this app.

## Demo
As an example, I used the following [repo](https://github.com/MahmoudMabrok/QuranyApp) as the knowledge base.
## How to use
- install **[Ollama](https://ollama.com)**
- pull model `deepseek-coder-v2` using `ollama pull deepseek-coder-v2:16b`
- install python 3.11
- run `pip3 install -r requirements.txt`
- run `python3.11 -m streamlit run app.py`

## Update Knowledge base
To update the knowledge base, we need to chunk the codebase, create embeddings, and save them to **FAISS**. Add the following snippet (assuming the codebase is a JS-based project):
```python
from git import Repo
from langchain_community.vectorstores import FAISS
from langchain_text_splitters import Language, RecursiveCharacterTextSplitter

# path to local git repo
repo_path = "./project/"
repo = Repo(repo_path)

# Splitter for JavaScript/TypeScript source files
js_splitter = RecursiveCharacterTextSplitter.from_language(
    language=Language.JS, chunk_size=200, chunk_overlap=0
)

# Extract and chunk text from the repository
file_texts = []
for file in repo.tree().traverse():
    if file.type == "blob":
        if file.name.endswith((".js", ".tsx", ".jsx")):  # handle JavaScript files
            file_content = file.data_stream.read().decode("utf-8")
            chunks = js_splitter.create_documents([file_content])
            file_texts.extend([chunk.page_content for chunk in chunks])

# place this code after the `embeddings` object is created
vectorstore = FAISS.from_texts(file_texts, embeddings)
vectorstore.save_local("vector_store_index")
```
Once `vectorstore` has been saved, you can comment out the lines above and load the saved index instead:
```python
vectorstore = FAISS.load_local(
    "vector_store_index", embeddings, allow_dangerous_deserialization=True
)
```

## Inspiration
I was inspired by several projects, but none of them did exactly what I wanted:
- [chat_with_github](https://github.com/Shubhamsaboo/awesome-llm-apps/tree/main/chat_with_X_tutorials/chat_with_github)
- [QA-Pilot](https://github.com/reid41/QA-Pilot)