https://github.com/joshuasundance-swca/langchain-streamlit-demo
langchain-streamlit demo with streaming llm, memory, and langsmith feedback
- Host: GitHub
- URL: https://github.com/joshuasundance-swca/langchain-streamlit-demo
- Owner: joshuasundance-swca
- License: MIT
- Created: 2023-09-18T04:32:58.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2024-11-23T06:02:39.000Z (11 months ago)
- Last Synced: 2024-11-23T07:17:26.147Z (11 months ago)
- Topics: anthropic, anyscale, docker, langchain, openai, streamlit
- Language: Python
- Homepage: https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo
- Size: 337 KB
- Stars: 18
- Watchers: 6
- Forks: 10
- Open Issues: 10
Metadata Files:
- Readme: README.md
- License: LICENSE
README
---
title: langchain-streamlit-demo
emoji: 🦜
colorFrom: green
colorTo: red
sdk: docker
app_port: 7860
pinned: true
tags: [langchain, streamlit, docker]
---

# langchain-streamlit-demo
[License: MIT](https://opensource.org/licenses/MIT)
[Python](https://www.python.org)
[Docker Hub workflow](https://github.com/joshuasundance-swca/langchain-streamlit-demo/actions/workflows/docker-hub.yml)
[Docker Hub](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo)
[Hugging Face Space workflow](https://github.com/joshuasundance-swca/langchain-streamlit-demo/actions/workflows/hf-space.yml)
[Hugging Face Space](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)

[pre-commit](https://github.com/pre-commit/pre-commit)
[Ruff](https://github.com/charliermarsh/ruff)
[mypy](http://mypy-lang.org/)
[Black](https://github.com/psf/black)
[Bandit](https://github.com/PyCQA/bandit)
[AI Changelog workflow](https://github.com/joshuasundance-swca/langchain-streamlit-demo/actions/workflows/ai_changelog.yml)
This project shows how to build a simple chatbot UI with [Streamlit](https://streamlit.io) and [LangChain](https://langchain.com).
This `README` was originally written by [Claude 2](https://www.anthropic.com/index/claude-2), an LLM from [Anthropic](https://www.anthropic.com/).
# Features
- Chat interface for talking to an AI assistant
- Supports models from
- [OpenAI](https://openai.com/)
- `gpt-3.5-turbo`
- `gpt-4`
- [Anthropic](https://www.anthropic.com/)
- `claude-instant-v1`
- `claude-2`
- [Anyscale Endpoints](https://endpoints.anyscale.com/)
- `meta-llama/Llama-2-7b-chat-hf`
- `meta-llama/Llama-2-13b-chat-hf`
- `meta-llama/Llama-2-70b-chat-hf`
- `codellama/CodeLlama-34b-Instruct-hf`
- `mistralai/Mistral-7B-Instruct-v0.1`
- [Azure OpenAI Service](https://azure.microsoft.com/en-us/products/ai-services/openai-service/)
- `[configurable]`
- Streaming output of assistant responses
- Leverages LangChain for dialogue and memory management
- Integrates with [LangSmith](https://smith.langchain.com) for tracing conversations
- Allows giving feedback on the assistant's responses
- Reads API keys and default values from environment variables when available
- Parameters in sidebar can be customized
- Includes various forms of document chat
- Question/Answer Pair Generation
- Summarization
- Standard retrieval chains

# Deployment
`langchain-streamlit-demo` is deployed as a [Docker image](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo) based on the [`python:3.11-slim-bookworm`](https://github.com/docker-library/python/blob/81b6e5f0643965618d633cd6b811bf0879dee360/3.11/slim-bookworm/Dockerfile) image.
CI/CD workflows in `.github/workflows` handle building and publishing the image as well as pushing it to Hugging Face.

## Run on HuggingFace Spaces
[Open in HuggingFace Spaces](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)

## With Docker (pull from Docker Hub)
1. _Optional_: Create a `.env` file based on `.env-example`
2. Run in terminal: `docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest`
   or, to pass environment variables from the `.env` file:
   `docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest`
3. Open http://localhost:7860 in your browser
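A `.env` file is a plain list of `KEY=value` lines. The variable names below are illustrative placeholders; consult `.env-example` in the repo for the actual list it expects:

```
# .env — one KEY=value per line; values here are placeholders
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
LANGCHAIN_API_KEY=...
```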
## Docker Compose (build locally)
1. Clone the repo and navigate to the cloned directory
2. _Optional_: Create a `.env` file based on `.env-example`
3. Run in terminal: `docker compose up`
4. Open http://localhost:7860 in your browser
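A compose file equivalent to the steps above might look like this; this is a sketch using the image name and port from this README, and the repo's actual compose file may differ:

```yaml
services:
  langchain-streamlit-demo:
    build: .            # or: image: joshuasundance/langchain-streamlit-demo:latest
    ports:
      - "7860:7860"
    env_file:
      - .env            # optional; see .env-example
```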
## Kubernetes
1. Clone the repo and navigate to the cloned directory
2. Create a `.env` file based on `.env-example`
3. Run bash script: `/bin/bash ./kubernetes/deploy.sh`
4. Get the IP address for your new service: `kubectl get service langchain-streamlit-demo`

# Links
- [Streamlit](https://streamlit.io)
- [LangChain](https://langchain.com)
- [LangSmith](https://smith.langchain.com)
- [OpenAI](https://openai.com/)
- [Anthropic](https://www.anthropic.com/)
- [Anyscale Endpoints](https://endpoints.anyscale.com/)
- [Azure OpenAI Service](https://azure.microsoft.com/en-us/products/ai-services/openai-service/)