Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pors/langchain-chat-websockets
LangChain LLM chat with streaming response over websockets
- Host: GitHub
- URL: https://github.com/pors/langchain-chat-websockets
- Owner: pors
- License: apache-2.0
- Created: 2023-05-02T12:32:18.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-11-30T07:25:04.000Z (11 months ago)
- Last Synced: 2024-10-23T10:34:40.143Z (16 days ago)
- Topics: async, fastapi, langchain, langchain-python, llm, openai, openai-api, openai-chatgpt, websockets
- Language: HTML
- Homepage:
- Size: 18.6 KB
- Stars: 88
- Watchers: 2
- Forks: 8
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-langchain-zh - Langchain Chat Websocket: LangChain LLM chat with streaming responses over websockets (Open Source Projects / Other Chatbots)
- awesome-langchain - Langchain Chat Websocket (Open Source Projects / Other / Chatbots)
README
### LangChain chat with streaming response over FastAPI websockets
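The idea is that the server pushes LLM tokens to the browser over a websocket as they are generated, instead of returning the full completion at once. The snippet below is a minimal, self-contained sketch of that pattern only, not the repository's actual code: the `/chat` route and the `[END]` marker are assumptions, and a stub generator stands in for the streaming LangChain call.

```python
# Sketch: streaming tokens to a client over a FastAPI websocket.
# In the real project the tokens come from a streaming LangChain LLM call;
# here a stub async generator keeps the example self-contained.
from typing import AsyncIterator

from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()


async def fake_llm_stream(prompt: str) -> AsyncIterator[str]:
    """Stand-in for a streaming LLM call (e.g. a chat model with streaming enabled)."""
    for token in f"You said: {prompt}".split():
        yield token + " "


@app.websocket("/chat")  # assumed route; the repository's route may differ
async def chat(websocket: WebSocket) -> None:
    await websocket.accept()
    try:
        while True:
            prompt = await websocket.receive_text()
            # Forward each token to the client as soon as it is produced.
            async for token in fake_llm_stream(prompt):
                await websocket.send_text(token)
            await websocket.send_text("[END]")  # arbitrary end-of-answer marker
    except WebSocketDisconnect:
        pass
```

In the repository itself, the stub generator is replaced by a streaming LangChain chat call whose new tokens are forwarded to the websocket in the same way.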
Install and run it like this:
```
pip install -r requirements.txt # use a virtual env
cp dotenv-example .env # add your secrets to the .env file
uvicorn main:app --reload
```

Or use Docker Compose:
### Run with Docker Compose
To run the LangChain chat application using Docker Compose, follow these steps:
1. Make sure you have [Docker](https://www.docker.com/) installed on your machine.
2. Create a file named `.env` in the project root (you can copy `dotenv-example`).
3. Open the newly created `.env` file in a text editor and add your OpenAI API key:
```dotenv
OPENAI_API_KEY=your_openai_api_key_here
```

   Replace `your_openai_api_key_here` with your actual OpenAI API key.
4. Run the following command to build the Docker image and start the FastAPI application inside a Docker container:
```bash
docker-compose up --build
```

5. Access the application at [http://localhost:8000](http://localhost:8000). A quick websocket smoke test is sketched below.
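
Once the container (or the local `uvicorn` process) is up, you can also smoke-test the websocket from Python. This is a hypothetical client: the `/chat` path and the plain-text message format are assumptions, so check `main.py` in the repository for the real route and protocol.

```python
# Hypothetical smoke test for the running server, using the `websockets` package
# (pip install websockets). The "/chat" path and message format are assumptions.
import asyncio

import websockets


async def main() -> None:
    async with websockets.connect("ws://localhost:8000/chat") as ws:
        await ws.send("Hello, what can you do?")
        # Print streamed tokens until the server goes quiet for a while.
        try:
            while True:
                token = await asyncio.wait_for(ws.recv(), timeout=10)
                print(token, end="", flush=True)
        except asyncio.TimeoutError:
            print()  # newline after the streamed answer


if __name__ == "__main__":
    asyncio.run(main())
```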
Thanks to [@hwchase17](https://github.com/hwchase17) for showing the way in [chat-langchain](https://github.com/hwchase17/chat-langchain/tree/master).