https://github.com/stoyan-stoyanov/llmflows-ws-template
- Host: GitHub
- URL: https://github.com/stoyan-stoyanov/llmflows-ws-template
- Owner: stoyan-stoyanov
- Created: 2023-11-26T18:30:51.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-11-27T04:21:07.000Z (almost 2 years ago)
- Last Synced: 2025-01-28T15:42:13.844Z (8 months ago)
- Language: HTML
- Homepage: https://llmflows-ws-template.vercel.app
- Size: 46.9 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# 🤖 llmflows-ws-template
Live Demo: https://llmflows-ws-template-96085e33adde.herokuapp.com
## Introduction
This is a template repository for building RAG chat apps with LLMFlows. The app is built with FastAPI and uses websockets to communicate with the frontend.
The frontend is built with HTML, CSS, and JavaScript, with Bootstrap for styling.
The question-answering flow is defined in qa_flow.py. The flow is created inside the answer_question function, which is used as the websocket endpoint that receives questions and returns answers.
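For illustration, here is a minimal sketch of that pattern: a FastAPI websocket endpoint that receives a question and sends back an answer. The route and the body of answer_question below are placeholders, not the template's actual code; in the template, answer_question runs the LLMFlows RAG flow against Pinecone.
```
# Minimal sketch of the websocket question/answer loop (illustrative only).
from fastapi import FastAPI, WebSocket

app = FastAPI()


async def answer_question(question: str) -> str:
    # Placeholder: the template builds an LLMFlows flow here that retrieves
    # context from Pinecone and asks the LLM for an answer.
    return f"Answer to: {question}"


@app.websocket("/ws")
async def chat(websocket: WebSocket):
    await websocket.accept()
    while True:
        question = await websocket.receive_text()
        answer = await answer_question(question)
        await websocket.send_text(answer)
```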
## Prerequisites
The app requires a Pinecone index to be created and populated.
To create a Pinecone index, follow the instructions in the official LLMFlows [user guide](https://llmflows.readthedocs.io/en/latest/user_guide/Vector%20Stores/).
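The user guide is the authoritative reference; as a rough sketch only, creating and populating an index could look like the following, assuming the pinecone-client 2.x and openai <1.0 Python APIs that were current when this template was published. The index name, environment, and documents are placeholders.
```
# Illustrative sketch: create a Pinecone index and upsert embedded documents.
import os

import openai
import pinecone

pinecone.init(api_key=os.environ["PINECONE_API_KEY"], environment="us-west1-gcp")

if "llmflows-ws" not in pinecone.list_indexes():
    # 1536 is the dimension of OpenAI's text-embedding-ada-002 embeddings.
    pinecone.create_index("llmflows-ws", dimension=1536, metric="cosine")

index = pinecone.Index("llmflows-ws")

docs = [
    "LLMFlows is a framework for building LLM-powered apps.",
    "This template serves a RAG chat app over websockets.",
]

openai.api_key = os.environ["OPENAI_API_KEY"]
response = openai.Embedding.create(model="text-embedding-ada-002", input=docs)

# Upsert (id, vector, metadata) tuples so the flow can retrieve the text later.
index.upsert(
    vectors=[
        (str(i), item["embedding"], {"text": docs[i]})
        for i, item in enumerate(response["data"])
    ]
)
```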
## Running locally
To run the app locally, create a virtual environment and install the dependencies from requirements.txt:
```
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Start the app by running the following command in the terminal:
```
uvicorn app:app --reload
```

## Deploying to Heroku
To deploy the app to Heroku, you will have to create a Heroku account, fork this repository, and connect it to your Heroku account. Make sure you include the following environment variables in your Heroku app:
```
PINECONE_API_KEY:
OPENAI_API_KEY:
```
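Heroku exposes these config vars to the running app as environment variables. A minimal sketch of how the app would read them at startup (the template's actual code may differ):
```
import os

# Both keys are expected to be present in the environment (locally or on Heroku).
PINECONE_API_KEY = os.environ["PINECONE_API_KEY"]
OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]
```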
## 📃 License
This repository is covered by the MIT license. For more information, see [`LICENSE`](https://github.com/stoyan-stoyanov/llmflows-ws-template/blob/main/LICENSE).