https://github.com/ersinaksar/langchain-streamlit-rag
- Host: GitHub
- URL: https://github.com/ersinaksar/langchain-streamlit-rag
- Owner: ersinaksar
- License: mit
- Created: 2024-07-09T20:59:35.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2025-02-22T05:45:39.000Z (4 months ago)
- Last Synced: 2025-02-22T06:26:53.163Z (4 months ago)
- Topics: fastapi, langchain, langchain-python, language-model, llm, ollama, ollama-python, python, python3, rag, streamlit
- Language: Python
- Homepage:
- Size: 5.86 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# LangChain-Streamlit-RAG
This project provides a FastAPI interface to serve Ollama models. It allows other applications to send queries to the model and receive responses.
[The Future of AI Knowledge Retrieval: Inside LangChain-Streamlit-RAG](https://app.readytensor.ai/publications/the-future-of-ai-knowledge-retrieval-inside-langchain-streamlit-rag-IrZGq54S2ryw)
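As a rough illustration of that interface (not the repository's actual code), the sketch below shows a FastAPI endpoint that forwards an incoming question to a local Ollama model via the `ollama` Python client. The `/query` route, the `llama3` model name, and a running Ollama daemon on the default port are all assumptions.

```python
# Hypothetical sketch of a FastAPI endpoint that forwards queries to a local
# Ollama server; the route, model name, and response shape are illustrative.
import ollama
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Query(BaseModel):
    question: str


@app.post("/query")
def query_model(query: Query):
    # ollama.chat talks to the local Ollama daemon (http://localhost:11434 by default)
    response = ollama.chat(
        model="llama3",
        messages=[{"role": "user", "content": query.question}],
    )
    return {"answer": response["message"]["content"]}
```

Run it with an ASGI server such as `uvicorn`, then POST JSON like `{"question": "..."}` to `/query` to receive the model's response.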
## Features

- **LangChain**: Manages and processes language models.
- **Ollama**: The core language model used for generating responses.
- **RAG**: Retrieval-Augmented Generation over PDF documents (a rough sketch follows this list).
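To make the RAG-over-PDF idea concrete, here is a minimal sketch under assumed choices: `PyPDFLoader` for loading, Chroma as the vector store, and Ollama models for both embeddings and generation via the `langchain-ollama` package. None of these specifics are taken from this repository.

```python
# Hypothetical RAG-over-PDF sketch; the loader, splitter settings, vector store,
# and model names are assumptions, not this repo's actual implementation.
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.vectorstores import Chroma
from langchain_ollama import ChatOllama, OllamaEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

# 1. Load the PDF and split it into overlapping chunks
docs = PyPDFLoader("example.pdf").load()
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(docs)

# 2. Embed the chunks and index them in a vector store
store = Chroma.from_documents(chunks, OllamaEmbeddings(model="nomic-embed-text"))

# 3. Retrieve the most relevant chunks and let the Ollama model answer from them
question = "What is this document about?"
context = "\n\n".join(d.page_content for d in store.similarity_search(question, k=4))
answer = ChatOllama(model="llama3").invoke(
    f"Answer using only this context:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```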
## Installation

1. **Clone the repository:**
```sh
git clone https://github.com/ersinaksar/LangChain-Streamlit-RAG.git
cd LangChain-Streamlit-RAG
```

2. **Create a virtual environment and activate it:**
```sh
python3 -m venv .venv
source .venv/bin/activate  # On Windows use `.venv\Scripts\activate`
```

3. **Install the dependencies:**
```sh
pip install -r requirements.txt
```

## Usage

1. **Start the Streamlit application** (a sketch of a possible front end follows this step):
```sh
streamlit run main.py
```
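For orientation, here is a sketch of what a Streamlit front end talking to the FastAPI endpoint shown earlier could look like; the backend URL, the `/query` route, and port 8000 are assumptions rather than the repository's actual layout.

```python
# Hypothetical Streamlit front end; the backend URL and route are illustrative.
import requests
import streamlit as st

st.title("LangChain-Streamlit-RAG")

question = st.text_input("Ask a question about your PDF")
if question:
    # Send the question to the FastAPI/Ollama backend and display its answer
    resp = requests.post("http://localhost:8000/query", json={"question": question})
    st.write(resp.json().get("answer", "No answer returned."))
```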
## License

This project is licensed under the MIT License.