Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/imclumsypanda/chatchat-lite
Build RAG and Agent applications on local models from scratch, based on LangGraph and Streamlit
- Host: GitHub
- URL: https://github.com/imclumsypanda/chatchat-lite
- Owner: imClumsyPanda
- License: apache-2.0
- Created: 2024-11-23T07:41:42.000Z (3 months ago)
- Default Branch: master
- Last Pushed: 2024-12-19T08:50:27.000Z (about 2 months ago)
- Last Synced: 2024-12-19T09:25:55.472Z (about 2 months ago)
- Topics: agent, chatchat, chatglm, glm, langchain, langchain-chatchat, langgraph, qwen, rag, streamlit
- Language: Python
- Homepage:
- Size: 275 KB
- Stars: 14
- Watchers: 2
- Forks: 7
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Chatchat-lite
## Environment
Python >= 3.9
Python 3.10 is recommended. The environment can be created with the following commands:
```commandline
conda create -n chatchat-lite python=3.10 -y
conda activate chatchat-lite
```

## Install dependencies
```commandline
pip install -r requirements.txt
```

## Start local models
The project currently only supports models served through Ollama.
Download the latest version of Ollama from the [Ollama website](https://ollama.com/download). After installation, run the following commands in a terminal:
```commandline
ollama run qwen2.5
ollama pull quentinz/bge-large-zh-v1.5
```
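
As a quick sanity check that both models are reachable, the sketch below connects to them from Python. It assumes the `langchain-ollama` package is installed (it may not be listed in `requirements.txt`) and is illustrative only, not necessarily how chatchat-lite wires the models up internally:

```python
# Hedged sketch: talk to the two Ollama models pulled above via LangChain.
# Assumes: pip install langchain-ollama (may not be part of requirements.txt).
from langchain_ollama import ChatOllama, OllamaEmbeddings

llm = ChatOllama(model="qwen2.5")  # chat model served by the local Ollama daemon
embeddings = OllamaEmbeddings(model="quentinz/bge-large-zh-v1.5")  # embedding model

reply = llm.invoke("Explain RAG in one sentence.")
print(reply.content)

vector = embeddings.embed_query("local knowledge base retrieval")
print(len(vector))  # dimensionality of the embedding vector
```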
## Run the project

Run the web UI with the following command:
```commandline
streamlit run st_main.py --theme.primaryColor "#165dff"
```
Or start it in dark mode:
```commandline
streamlit run st_main.py --theme.base "dark" --theme.primaryColor "#165dff"
```
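
If you would rather not pass theme flags on the command line each time, Streamlit can also read the same options from a `.streamlit/config.toml` file in the project directory; a minimal sketch equivalent to the dark-mode command above:

```toml
# .streamlit/config.toml — equivalent to the CLI theme flags above (illustrative)
[theme]
base = "dark"
primaryColor = "#165dff"
```

With this file in place, a plain `streamlit run st_main.py` starts with the same theme.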
After startup, the UI looks like this:

- Agent chat page
![webui.png](img/webui.png)

- Model configuration page
![webui2.png](img/webui2.png)