https://github.com/imclumsypanda/chatchat-lite
Building RAG and Agent applications on local models from scratch with LangGraph and Streamlit
- Host: GitHub
- URL: https://github.com/imclumsypanda/chatchat-lite
- Owner: imClumsyPanda
- License: apache-2.0
- Created: 2024-11-23T07:41:42.000Z (8 months ago)
- Default Branch: master
- Last Pushed: 2024-12-25T12:43:17.000Z (7 months ago)
- Last Synced: 2025-04-23T16:06:58.847Z (3 months ago)
- Topics: agent, chatchat, chatglm, glm, langchain, langchain-chatchat, langgraph, qwen, rag, streamlit
- Language: Python
- Homepage:
- Size: 279 KB
- Stars: 51
- Watchers: 2
- Forks: 13
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
# Chatchat-lite
## Environment
Python >= 3.9
Python 3.10 is recommended. You can create the environment with the following commands:
```commandline
conda create -n chatchat-lite python=3.10 -y
conda activate chatchat-lite
```

## Installing dependencies
```commandline
pip install -r requirements.txt
```

## Starting a local model
The project currently supports only models served through Ollama.
Please download the latest version of Ollama from the [Ollama website](https://ollama.com/download). After installation, run the following commands in a terminal:
```commandline
ollama run qwen2.5
ollama pull quentinz/bge-large-zh-v1.5
```

## Running the project
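Before running the UI, it may help to see what the embedding model pulled above is for. The following is a minimal, stdlib-only sketch of the retrieval step at the core of a RAG app; it is **not** the project's actual code, and `embed()` is a toy stand-in (a bag-of-characters vector) so the snippet runs without Ollama. In the real app, embeddings would come from a model such as `quentinz/bge-large-zh-v1.5`.

```python
# Minimal sketch of RAG retrieval: embed documents and a query,
# then return the top-k documents by cosine similarity.
from collections import Counter
from math import sqrt


def embed(text: str) -> dict[str, float]:
    """Hypothetical embedding: a bag-of-characters vector.
    A real app would call an embedding model instead."""
    return dict(Counter(text))


def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]


docs = [
    "Ollama serves local chat and embedding models.",
    "Streamlit builds simple web UIs in Python.",
    "LangGraph orchestrates multi-step agent workflows.",
]
print(retrieve("How do I run a local model?", docs, k=1))
```

The retrieved documents would then be stuffed into the chat model's prompt (here, `qwen2.5`) to ground its answer.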
Run the web UI with the following command:
```commandline
streamlit run st_main.py --theme.primaryColor "#165dff"
```
Or launch in dark mode:
```commandline
streamlit run st_main.py --theme.base "dark" --theme.primaryColor "#165dff"
```

After launching, the interface looks like this:
- Agent chat page
- Model configuration page
