https://github.com/joshuasundance-swca/langchain-research-assistant-docker
docker setup to run the LangChain research-assistant template using langserve
- Host: GitHub
- URL: https://github.com/joshuasundance-swca/langchain-research-assistant-docker
- Owner: joshuasundance-swca
- License: MIT
- Created: 2023-11-27T19:00:00.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2024-06-26T03:48:54.000Z (over 1 year ago)
- Last Synced: 2024-06-26T04:47:42.897Z (over 1 year ago)
- Topics: docker, langchain, research-assistant
- Language: Dockerfile
- Homepage: https://hub.docker.com/repository/docker/joshuasundance/langchain-research-assistant-docker
- Size: 43.9 KB
- Stars: 43
- Watchers: 3
- Forks: 7
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# langchain-research-assistant-docker
[MIT License](https://opensource.org/licenses/MIT)
[Python](https://www.python.org)
[Docker Hub workflow](https://github.com/joshuasundance-swca/langchain-research-assistant-docker/actions/workflows/docker-hub.yml)
[Docker Hub](https://hub.docker.com/r/joshuasundance/langchain-research-assistant-docker)
[pre-commit](https://github.com/pre-commit/pre-commit)
[Ruff](https://github.com/charliermarsh/ruff)
[mypy](http://mypy-lang.org/)
[Black](https://github.com/psf/black)
[Bandit](https://github.com/PyCQA/bandit)
This repo provides a Docker setup to run the LangChain research-assistant template using LangServe.
- [Relevant LangChain documentation](https://python.langchain.com/docs/templates/research-assistant)
- Example LangSmith traces
- [_What is the average territory size of the Florida Scrub-Jay in Central Florida?_](https://smith.langchain.com/public/cf52fc9f-5800-4279-b61b-e15221d3a5e3/r)
- [_Does SWCA Environmental Consultants use AI and data science?_](https://smith.langchain.com/public/fcae93da-b87e-49a6-992c-d5034bcf82e8/r)
- [_Write a comprehensive report on the history of Chuluota, Florida._](https://smith.langchain.com/public/16d1b9e8-8e01-458e-98d6-64e93c916941/r)

## Quickstart
### Using Docker
```bash
docker run -d --name langchain-research-assistant-docker \
  -e OPENAI_API_KEY=sk-... \
  -e TAVILY_API_KEY=tvly-... \
  -e LANGCHAIN_API_KEY=ls__... \
  -e LANGCHAIN_TRACING_V2=true \
  -e LANGCHAIN_PROJECT=langchain-research-assistant-docker \
  -p 8000:8000 \
  joshuasundance/langchain-research-assistant-docker:latest
```
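Once the container is running, you can query the LangServe invoke endpoint over HTTP. The sketch below uses the `requests` package and assumes the chain accepts a `question` string as input (as suggested by the example traces above); confirm the exact input schema at `http://localhost:8000/docs` before relying on it.

```python
# Minimal sketch: query the running container's LangServe endpoint.
# The "question" key is an assumption about the template's input schema;
# verify it against http://localhost:8000/docs.
import requests

response = requests.post(
    "http://localhost:8000/research-assistant/invoke",
    json={"input": {"question": "What is the average territory size of the Florida Scrub-Jay?"}},
    timeout=300,
)
response.raise_for_status()
print(response.json()["output"])
```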
### Using Docker Compose

```yaml
# docker-compose.yml
version: '3.8'

services:
  langchain-research-assistant-docker:
    image: joshuasundance/langchain-research-assistant-docker:latest
    container_name: langchain-research-assistant-docker
    environment:  # use values from .env
      - "OPENAI_API_KEY=${OPENAI_API_KEY:?OPENAI_API_KEY is not set}"  # required
      - "TAVILY_API_KEY=${TAVILY_API_KEY}"  # optional
      - "LANGCHAIN_API_KEY=${LANGCHAIN_API_KEY}"  # optional
      - "LANGCHAIN_TRACING_V2=${LANGCHAIN_TRACING_V2:-false}"  # false by default
      - "LANGCHAIN_PROJECT=${LANGCHAIN_PROJECT:-langchain-research-assistant-docker}"
    ports:
      - "${APP_PORT:-8000}:8000"
```
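Save this as `docker-compose.yml` next to a `.env` file containing the variables above, then start the service with `docker compose up -d`.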
### Kubernetes

The following assumes you have a `.env` file, a running Kubernetes cluster, and `kubectl` configured to access it.

The commands below create a secret called `research-assistant-secret`. To use a different name, edit `./kubernetes/resources.yaml` as well. You can also edit that file and uncomment certain lines to deploy on private endpoints, with a predefined IP, etc.

```bash
kubectl create secret generic research-assistant-secret --from-env-file=.env
kubectl apply -f ./kubernetes/resources.yaml
```

All deployment options are flexible and configurable.
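Once the manifest is applied, `kubectl get pods` and `kubectl get svc` will show the created resources and whether the pod is running.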
## Usage
- The service will be available at `http://localhost:8000`.
- You can access the OpenAPI documentation at `http://localhost:8000/docs` and `http://localhost:8000/openapi.json`.
- Access the Research Playground at `http://127.0.0.1:8000/research-assistant/playground/`.
- You can also use the `RemoteRunnable` class to interact with the service:
```python
from langserve.client import RemoteRunnable

runnable = RemoteRunnable("http://localhost:8000/research-assistant")
```

See the [LangChain docs](https://python.langchain.com/docs/templates/research-assistant) for more information.
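For example, a query through the `RemoteRunnable` might look like the sketch below, again assuming the template takes a dict with a `question` key; check `http://localhost:8000/docs` for the authoritative schema.

```python
from langserve.client import RemoteRunnable

runnable = RemoteRunnable("http://localhost:8000/research-assistant")

# Hypothetical query; the "question" key is an assumption about the
# template's input schema, not confirmed by this README.
report = runnable.invoke({"question": "Write a short report on the Florida Scrub-Jay."})
print(report)
```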