Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/amithkoujalgi/ollama-pdf-bot
A bot that accepts PDF docs and lets you ask questions about them.
- Host: GitHub
- URL: https://github.com/amithkoujalgi/ollama-pdf-bot
- Owner: amithkoujalgi
- Created: 2023-11-10T16:48:03.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-10-03T17:48:02.000Z (3 months ago)
- Last Synced: 2025-01-04T09:44:57.518Z (7 days ago)
- Topics: bot, chat-bot, llama, llama2, llm, ollama, pdf, pdf-bot
- Language: Python
- Homepage: https://hub.docker.com/r/amithkoujalgi/pdf-bot
- Size: 8.35 MB
- Stars: 173
- Watchers: 8
- Forks: 44
- Open Issues: 8
Metadata Files:
- Readme: README.md
README
### PDF Bot with Ollama
A bot that accepts PDF docs and lets you ask questions about them.
The LLMs are downloaded and served via [Ollama](https://github.com/jmorganca/ollama).
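For orientation, here is a minimal, generic sketch of the retrieval-augmented QA flow that this kind of bot typically builds with Langchain on top of an Ollama-served model. It is not the repository's actual code; the file name, model, chunk sizes, and vector store are illustrative assumptions.

```python
# Generic sketch of PDF question answering against an Ollama-served model.
# Not this repository's implementation; names and parameters are illustrative.
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.llms import Ollama
from langchain_community.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.text_splitter import RecursiveCharacterTextSplitter

# Load and chunk the uploaded PDF.
docs = PyPDFLoader("manual.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Index the chunks with embeddings produced by the Ollama server.
store = Chroma.from_documents(
    chunks,
    OllamaEmbeddings(model="llama2", base_url="http://localhost:11434"),
)

# Answer questions by retrieving relevant chunks and handing them to the LLM.
qa = RetrievalQA.from_chain_type(
    llm=Ollama(model="llama2", base_url="http://localhost:11434"),
    retriever=store.as_retriever(),
)
print(qa.invoke({"query": "What does the quick-setup section say?"})["result"])
```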
## Table of Contents
- [Requirements](#requirements)
- [How to run](#how-to-run)
- [Demo](#demo)
- [Improvements](#improvements)
- [Contributing](#contributing)
- [Credits](#credits)

### Requirements
[![][shield]][site]
[![][maketool-shield]][maketool-site]
[site]: https://docs.docker.com/compose/
[shield]: https://img.shields.io/badge/Docker_Compose-Installation-blue.svg?style=for-the-badge&labelColor=gray
[maketool-site]: https://www.gnu.org/software/make/
[maketool-shield]: https://img.shields.io/badge/Make-Tool-blue.svg?style=for-the-badge&labelColor=gray
### How to run
#### CPU version
```shell
make start
```

#### GPU version
```shell
make start-gpu
```

When the server is up and running, access the app at: http://localhost:8501
Switch to a different model by changing the `MODEL` env variable in the [docker-compose.yaml](https://github.com/amithkoujalgi/ollama-pdf-bot/blob/main/docker-compose.yml#L18). Check out the available models from [here](https://ollama.ai/library).
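For reference, the relevant fragment of the compose file looks roughly like the following (the service name `web` and the model `mistral` are placeholders, not necessarily what the repository's file uses):

```yaml
# Illustrative fragment; see the linked docker-compose.yml for the actual layout.
services:
  web:                      # placeholder service name
    environment:
      - MODEL=mistral       # any model tag from https://ollama.ai/library
```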
**Note:**
- It takes a while to start up the first time, since the specified model has to be downloaded.
- If your hardware does not have a GPU and you run only on CPU, expect slow responses from the bot.
- Only Nvidia GPUs are supported, as mentioned in Ollama's documentation; others, such as AMD, aren't supported yet. Read how to
  use a GPU with the [Ollama container](https://hub.docker.com/r/ollama/ollama)
  and [docker-compose](https://docs.docker.com/compose/gpu-support/#:~:text=GPUs%20are%20referenced%20in%20a,capabilities%20.) (a minimal compose sketch follows below).
- Make sure to have Nvidia drivers set up on your execution environment for the best results.

Image on DockerHub: https://hub.docker.com/r/amithkoujalgi/pdf-bot
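As described in the linked Docker Compose GPU-support docs, a service is granted GPU access through a device reservation. A minimal sketch (the service name and image here are assumptions, not a copy of this project's compose file):

```yaml
# Minimal GPU reservation sketch using Docker Compose's documented syntax.
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1              # or "all" to expose every GPU
              capabilities: [gpu]
```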
### [Demo](https://www.youtube.com/watch?v=jJyFslR-oNQ)
https://github.com/amithkoujalgi/ollama-pdf-bot/assets/1876165/40dc70e6-9d35-4171-9ae6-d82247dbaa17
#### Sample PDFs
[Hl-L2351DW v0522.pdf](https://github.com/amithkoujalgi/ollama-pdf-bot/files/13323209/Hl-L2351DW.v0522.pdf)
[HL-B2080DW v0522.pdf](https://github.com/amithkoujalgi/ollama-pdf-bot/files/13323208/HL-B2080DW.v0522.pdf)
### Improvements
- [ ] Expose model params such as `temperature`, `top_k`, `top_p` as configurable env vars
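One possible shape for that improvement, sketched with the Langchain `Ollama` wrapper (the env var names and defaults are assumptions, not values from the repository):

```python
import os

from langchain_community.llms import Ollama

# Hypothetical: read sampling parameters from the environment, with fallback defaults.
llm = Ollama(
    model=os.getenv("MODEL", "llama2"),
    temperature=float(os.getenv("TEMPERATURE", "0.8")),
    top_k=int(os.getenv("TOP_K", "40")),
    top_p=float(os.getenv("TOP_P", "0.9")),
)
```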
### Contributing
Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping
with code - any sort of contribution is much appreciated.

#### Requirements
![Python](https://img.shields.io/badge/python-3.8_+-green.svg)
#### Setup Ollama server for development
```shell
docker run -it -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
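# Optional: pre-pull a model into the running server so the first request doesn't wait on a download.
# (The container name/ID and model below are assumptions; adjust to your setup.)
# docker exec -it <ollama-container-id> ollama pull llama2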
```

#### Install the libs
```shell
pip install -r requirements.txt
```

#### Start the app
```shell
streamlit run pdf_bot/app.py
```

### Credits
Thanks to the incredible [Ollama](https://github.com/jmorganca/ollama), [Langchain](https://www.langchain.com/)
and [Streamlit](https://streamlit.io/) projects.

### Appreciate my work?