{"id":35887392,"url":"https://github.com/imClumsyPanda/Chatchat-Lite","last_synced_at":"2026-01-15T06:00:44.716Z","repository":{"id":264300211,"uuid":"892977055","full_name":"imClumsyPanda/Chatchat-Lite","owner":"imClumsyPanda","description":"A local-model RAG and Agent application built from scratch with LangGraph and Streamlit","archived":false,"fork":false,"pushed_at":"2024-12-25T12:43:17.000Z","size":286,"stargazers_count":51,"open_issues_count":2,"forks_count":13,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-04-23T16:07:09.126Z","etag":null,"topics":["agent","chatchat","chatglm","glm","langchain","langchain-chatchat","langgraph","qwen","rag","streamlit"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/imClumsyPanda.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-11-23T07:41:42.000Z","updated_at":"2025-04-02T06:15:14.000Z","dependencies_parsed_at":null,"dependency_job_id":"16ef90f7-eb5e-48aa-9998-3db3a9b755e9","html_url":"https://github.com/imClumsyPanda/Chatchat-Lite","commit_stats":null,"previous_names":["imclumsypanda/chatchat-lite"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/imClumsyPanda/Chatchat-Lite","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/imClumsyPanda%2FChatchat-Lite","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/imClumsyPanda%2FChatchat-Lite/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/imClumsyPanda%2FChatchat-Lite/
releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/imClumsyPanda%2FChatchat-Lite/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/imClumsyPanda","download_url":"https://codeload.github.com/imClumsyPanda/Chatchat-Lite/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/imClumsyPanda%2FChatchat-Lite/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28444124,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-15T05:05:00.929Z","status":"ssl_error","status_checked_at":"2026-01-15T05:04:58.515Z","response_time":62,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["agent","chatchat","chatglm","glm","langchain","langchain-chatchat","langgraph","qwen","rag","streamlit"],"created_at":"2026-01-08T22:00:39.536Z","updated_at":"2026-01-15T06:00:44.700Z","avatar_url":"https://github.com/imClumsyPanda.png","language":"Python","funding_links":[],"categories":["Agent Categories"],"sub_categories":["\u003ca name=\"LangGraph\"\u003e\u003c/a\u003eLangGraph"],"readme":"# Chatchat-lite\n\n## Runtime Environment\nPython \u003e= 3.9\nPython 3.10 is recommended.\n\nYou can create the environment with the following commands:\n```commandline\nconda create -n chatchat-lite python=3.10 -y\nconda activate chatchat-lite\n```\n\n## Install Dependencies\n```commandline\npip install -r 
requirements.txt\n```\n\n## Start the Local Models\nThis project currently only supports Ollama models.\nDownload the latest version of Ollama from the [Ollama website](https://ollama.com/download). After installation, run the following commands in the command line:\n```commandline\nollama run qwen2.5\nollama pull quentinz/bge-large-zh-v1.5\n```\n\n## Run the Project\nStart the WebUI with the following command:\n```commandline\nstreamlit run st_main.py --theme.primaryColor \"#165dff\"\n```\nOr start it in dark mode:\n```commandline\nstreamlit run st_main.py --theme.base \"dark\" --theme.primaryColor \"#165dff\"\n```\n\nAfter startup, the interface looks as follows:\n- Agent chat interface\n    ![webui.png](img/webui.png)\n\n- Model configuration interface\n    ![webui2.png](img/webui2.png)","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FimClumsyPanda%2FChatchat-Lite","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FimClumsyPanda%2FChatchat-Lite","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FimClumsyPanda%2FChatchat-Lite/lists"}