https://github.com/tonykipkemboi/ollama_pdf_rag
A demo Jupyter Notebook showcasing a simple local RAG (Retrieval Augmented Generation) pipeline to chat with your PDFs.
- Host: GitHub
- URL: https://github.com/tonykipkemboi/ollama_pdf_rag
- Owner: tonykipkemboi
- License: mit
- Created: 2024-04-08T17:12:47.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-08T14:13:35.000Z (9 months ago)
- Last Synced: 2025-05-13T07:56:45.578Z (5 months ago)
- Topics: langchain, ollama, pdf, rag
- Language: Jupyter Notebook
- Homepage: https://tonykipkemboi.github.io/ollama_pdf_rag/
- Size: 12 MB
- Stars: 395
- Watchers: 11
- Forks: 160
- Open Issues: 12
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
# Chat with PDF locally using Ollama + LangChain
A powerful local RAG (Retrieval Augmented Generation) application that lets you chat with your PDF documents using Ollama and LangChain. This project includes both a Jupyter notebook for experimentation and a Streamlit web interface for easy interaction.
[![Tests](https://github.com/tonykipkemboi/ollama_pdf_rag/actions/workflows/tests.yml/badge.svg)](https://github.com/tonykipkemboi/ollama_pdf_rag/actions/workflows/tests.yml)
## Project Structure
```
ollama_pdf_rag/
├── src/                      # Source code
│   ├── app/                  # Streamlit application
│   │   ├── components/       # UI components
│   │   │   ├── chat.py       # Chat interface
│   │   │   ├── pdf_viewer.py # PDF display
│   │   │   └── sidebar.py    # Sidebar controls
│   │   └── main.py           # Main app
│   └── core/                 # Core functionality
│       ├── document.py       # Document processing
│       ├── embeddings.py     # Vector embeddings
│       ├── llm.py            # LLM setup
│       └── rag.py            # RAG pipeline
├── data/                     # Data storage
│   ├── pdfs/                 # PDF storage
│   │   └── sample/           # Sample PDFs
│   └── vectors/              # Vector DB storage
├── notebooks/                # Jupyter notebooks
│   └── experiments/          # Experimental notebooks
├── tests/                    # Unit tests
├── docs/                     # Documentation
└── run.py                    # Application runner
```

## Features
- Fully local processing - no data leaves your machine
- PDF processing with intelligent chunking
- Multi-query retrieval for better context understanding
- Advanced RAG implementation using LangChain (see the sketch after this list)
- Clean Streamlit interface
- Jupyter notebook for experimentation
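Conceptually, the multi-query retrieval and RAG pieces fit together roughly as shown below. This is a minimal sketch under stated assumptions, not the project's actual `src/core` code: the loader (`PDFPlumberLoader`), file path, prompt text, and chunking parameters are illustrative, and the imports follow the LangChain 0.1.x packages pinned in the requirements further down.

```python
# Minimal local RAG sketch (not the project's exact code; see assumptions above).
from langchain_community.document_loaders import PDFPlumberLoader
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.retrievers.multi_query import MultiQueryRetriever
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

# 1. Load the PDF and split it into overlapping chunks
docs = PDFPlumberLoader("data/pdfs/sample/example.pdf").load()  # hypothetical file name
chunks = RecursiveCharacterTextSplitter(chunk_size=1500, chunk_overlap=200).split_documents(docs)

# 2. Embed the chunks into a local Chroma vector store
vectordb = Chroma.from_documents(
    chunks,
    embedding=OllamaEmbeddings(model="nomic-embed-text"),
    persist_directory="data/vectors",
)

# 3. Multi-query retrieval: the LLM rewrites the question several ways
#    and the union of the retrieved chunks becomes the context
llm = ChatOllama(model="llama3.2")
retriever = MultiQueryRetriever.from_llm(retriever=vectordb.as_retriever(), llm=llm)

# 4. RAG chain: retrieved context + question -> grounded answer
def format_docs(retrieved):
    return "\n\n".join(d.page_content for d in retrieved)

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
print(chain.invoke("What is this document about?"))
```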
## Getting Started

### Prerequisites
1. **Install Ollama**
   - Visit [Ollama's website](https://ollama.ai) to download and install
   - Pull required models:
```bash
ollama pull llama3.2 # or your preferred model
ollama pull nomic-embed-text
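# optional sanity check (assumption: not part of the original steps): confirm both models are listed
ollama list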
```

2. **Clone Repository**
```bash
git clone https://github.com/tonykipkemboi/ollama_pdf_rag.git
cd ollama_pdf_rag
```

3. **Set Up Environment**
```bash
python -m venv venv
source venv/bin/activate # On Windows: .\venv\Scripts\activate
pip install -r requirements.txt
```

Key dependencies and their versions:
```txt
ollama==0.4.4
streamlit==1.40.0
pdfplumber==0.11.4
langchain==0.1.20
langchain-core==0.1.53
langchain-ollama==0.0.2
chromadb==0.4.22
```

### Running the Application
#### Option 1: Streamlit Interface
```bash
python run.py
```
Then open your browser to `http://localhost:8501`
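Alternatively, if you want to call Streamlit yourself: the app's entry point appears to live at `src/app/main.py` (per the project structure above), so assuming `run.py` is simply a thin wrapper around it, the equivalent command would be:

```bash
# hypothetical direct launch; adjust the path if the entry point differs
streamlit run src/app/main.py
```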
*Streamlit interface showing PDF viewer and chat functionality*

#### Option 2: Jupyter Notebook
```bash
jupyter notebook
```
Open `updated_rag_notebook.ipynb` to experiment with the code.

## Usage Tips
1. **Upload PDF**: Use the file uploader in the Streamlit interface or try the sample PDF
2. **Select Model**: Choose from your locally available Ollama models
3. **Ask Questions**: Start chatting with your PDF through the chat interface
4. **Adjust Display**: Use the zoom slider to adjust PDF visibility
5. **Clean Up**: Use the "Delete Collection" button when switching documents

## Contributing
Feel free to:
- Open issues for bugs or suggestions
- Submit pull requests
- Comment on the YouTube video for questions
- Star the repository if you find it useful!

## Troubleshooting
- Ensure Ollama is running in the background
- Check that required models are downloaded
- Verify Python environment is activated
- For Windows users, ensure WSL2 is properly configured if using Ollama

### Common Errors
#### ONNX DLL Error
If you encounter this error:
```
DLL load failed while importing onnx_copy2py_export: a dynamic link Library (DLL) initialization routine failed.
```

Try these solutions:
1. Install Microsoft Visual C++ Redistributable:
   - Download and install both x64 and x86 versions from [Microsoft's official website](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist)
   - Restart your computer after installation

2. If the error persists, try installing ONNX Runtime manually:
```bash
pip uninstall onnxruntime onnxruntime-gpu
pip install onnxruntime
```

#### CPU-Only Systems
If you're running on a CPU-only system:

1. Ensure you have the CPU version of ONNX Runtime:
```bash
pip uninstall onnxruntime-gpu # Remove GPU version if installed
pip install onnxruntime # Install CPU-only version
```

2. You may need to modify the chunk size in the code to prevent memory issues (see the sketch below this list):
   - Reduce `chunk_size` to 500-1000 if you experience memory problems
   - Increase `chunk_overlap` for better context preservation

Note: The application will run slower on CPU-only systems, but it will still work effectively.
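As a rough illustration of that tuning, here is how a LangChain text splitter could be configured. This is a sketch only; the actual splitter class and parameter values used in this project's `document.py` may differ.

```python
from langchain.text_splitter import RecursiveCharacterTextSplitter

# Smaller chunks reduce memory pressure on CPU-only machines;
# a larger overlap keeps more shared context between neighbouring chunks.
splitter = RecursiveCharacterTextSplitter(
    chunk_size=750,     # reduced per the tip above (500-1000 range)
    chunk_overlap=150,  # hypothetical value; tune for your documents
)
chunks = splitter.split_documents(docs)  # `docs` comes from your PDF loader
```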
## Testing
### Running Tests
```bash
# Run all tests
python -m unittest discover tests

# Run tests verbosely
python -m unittest discover tests -v
```
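For orientation, a test discovered by the commands above might look like the following. This is a hypothetical example rather than one of the repository's actual tests; it only illustrates the `unittest` layout that `discover` expects under `tests/`.

```python
# tests/test_chunking.py (hypothetical example)
import unittest
from langchain.text_splitter import RecursiveCharacterTextSplitter

class TestChunking(unittest.TestCase):
    def test_chunks_respect_size_limit(self):
        splitter = RecursiveCharacterTextSplitter(chunk_size=100, chunk_overlap=20)
        chunks = splitter.split_text("lorem ipsum " * 200)
        self.assertTrue(chunks)
        self.assertTrue(all(len(chunk) <= 100 for chunk in chunks))

if __name__ == "__main__":
    unittest.main()
```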
### Pre-commit Hooks

The project uses pre-commit hooks to ensure code quality. To set up:
```bash
pip install pre-commit
pre-commit install
```

This will:
- Run tests before each commit
- Run linting checks
- Ensure code quality standards are met

### Continuous Integration
The project uses GitHub Actions for CI. On every push and pull request:
- Tests are run on multiple Python versions (3.9, 3.10, 3.11)
- Dependencies are installed
- Ollama models are pulled
- Test results are uploaded as artifacts

## License
This project is open source and available under the MIT License.
---
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=tonykipkemboi/ollama_pdf_rag&type=Date)](https://star-history.com/#tonykipkemboi/ollama_pdf_rag&Date)
Built with ❤️ by [Tony Kipkemboi](https://tonykipkemboi.com)!
Follow me on [X](https://x.com/tonykipkemboi) | [LinkedIn](https://www.linkedin.com/in/tonykipkemboi/) | [YouTube](https://www.youtube.com/@tonykipkemboi) | [GitHub](https://github.com/tonykipkemboi)