{"id":21864171,"url":"https://github.com/apconw/sanic-web","last_synced_at":"2025-05-16T09:04:05.994Z","repository":{"id":263393749,"uuid":"888874620","full_name":"apconw/sanic-web","owner":"apconw","description":"一个轻量级、支持全链路且易于二次开发的大模型应用项目(Large Model Data Assistant) 支持DeepSeek/Qwen2.5等大模型 基于 Dify 、Ollama\u0026Vllm、Sanic 和 Text2SQL 📊 等技术构建的一站式大模型应用开发项目，采用 Vue3、TypeScript 和 Vite 5 打造现代UI。它支持通过 ECharts 📈 实现基于大模型的数据图形化问答，具备处理 CSV 文件 📂 表格问答的能力。同时，能方便对接第三方开源 RAG 系统 检索系统 🌐等，以支持广泛的通用知识问答。","archived":false,"fork":false,"pushed_at":"2025-05-09T11:07:22.000Z","size":151440,"stargazers_count":571,"open_issues_count":5,"forks_count":99,"subscribers_count":10,"default_branch":"master","last_synced_at":"2025-05-09T12:24:02.058Z","etag":null,"topics":["ai","bigdata","chat","chatgpt","deepseek-r1","dify","echarts","large-model-data-assistant","llm","ollama","python","qwen","rag","sanic","text2sql","vllm","vue3"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/apconw.png","metadata":{"files":{"readme":"README-en.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-11-15T07:08:43.000Z","updated_at":"2025-05-09T09:52:21.000Z","dependencies_parsed_at":"2025-01-14T09:23:27.386Z","dependency_job_id":"d7e467cd-8990-452c-9cc8-c9be67769139","html_url":"https://github.com/apconw/sanic-web","commit_stats":null,"previous_names":["apconw/sanic-web"],"tags_count":4,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apconw%2Fsanic-web","tags_url":"https://repos.eco
syste.ms/api/v1/hosts/GitHub/repositories/apconw%2Fsanic-web/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apconw%2Fsanic-web/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apconw%2Fsanic-web/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/apconw","download_url":"https://codeload.github.com/apconw/sanic-web/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254501557,"owners_count":22081528,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","bigdata","chat","chatgpt","deepseek-r1","dify","echarts","large-model-data-assistant","llm","ollama","python","qwen","rag","sanic","text2sql","vllm","vue3"],"created_at":"2024-11-28T04:07:44.394Z","updated_at":"2025-05-16T09:04:04.534Z","avatar_url":"https://github.com/apconw.png","language":"JavaScript","readme":"# Large Model Data Assistant\n\n[![中文文档](https://img.shields.io/badge/中文文档-点击查看-orange)](README.md)\n\n🌟 **Project Introduction**\n\nA lightweight, easily customizable large model application project with full-link support.\n\n**Compatible with large models such as DeepSeek and Qwen2.5**\n\nThis is a one-stop large model application development project built on technologies like Dify, Ollama \u0026 Vllm, Sanic, and Text2SQL 📊. It features a modern UI crafted with Vue3, TypeScript, and Vite 5. It supports graphical data Q\u0026A based on large models through ECharts 📈 and can handle tabular Q\u0026A for CSV files 📂. 
Additionally, it can be easily integrated with third-party open-source RAG and retrieval systems 🌐 to support a wide range of general knowledge Q\u0026A.\n\nAs a lightweight large model application development project, Sanic-Web 🛠️ supports rapid iteration and expansion, facilitating the quick implementation of large model projects. 🚀\n\n## Architecture Diagram\n![image](./images/app-01.png)\n\n## 🎉 **Features**\n- **Core Technology Stack**: Dify + Ollama + RAG + (Qwen2.5/DeepSeek) + Text2SQL\n- **UI Framework**: Vue 3 + TypeScript + Vite 5\n- **Data Q\u0026A**: Integrates ECharts and large models to achieve lightweight graphical data Q\u0026A display via Text2SQL.\n- **Table Q\u0026A**: Supports uploading CSV files and provides table data Q\u0026A based on large model summarization, preprocessing, and Text2SQL.\n- **General Q\u0026A**: Supports general data Q\u0026A through integration with third-party RAG systems and public network retrieval.\n- **Application Architecture**: Serves as a lightweight, full-link, one-stop large model application development framework for easy expansion and implementation.\n- **Flexible Deployment**: Supports one-click deployment of all dependent components for large model application development using docker-compose, with zero configuration required.\n\n## Demo\n![image](./images/chat-04.gif)\n![image](./images/chat-05.png)\n![image](./images/chat-01.png)\n![image](./images/chat-02.png)\n\n## 💡 Environment Requirements\n\nBefore you start, make sure your development environment meets the following minimum requirements:\n\n- **Operating System**: Windows 10/11, macOS M series, CentOS/Ubuntu\n- **GPU**: For local deployment with Ollama, an NVIDIA GPU is recommended. 
Alternatively, you can use CPU mode or purchase an API key from a public cloud provider.\n- **Memory**: 8GB+\n\n## 🔧 **Prerequisites**\n* Python 3.8+\n* Poetry 1.8.3+\n* Dify 0.7.1+\n* MySQL 8.0+\n* Node.js 18.12.x+\n* Pnpm 9.x\n\n## 📚 **Large Model Deployment**\n- [Refer to Ollama Deployment](https://qwen.readthedocs.io/en/latest/run_locally/ollama.html)\n- Models: Qwen2.5 7B model\n- Models: DeepSeek R1 7B model\n- [Alibaba Cloud Public Network API Key](http://aliyun.com/product/bailian)\n\n## ⚙️ **Dify Environment Configuration**\n1. **Install Dify**\n   - [Official Documentation](https://docs.dify.ai/en)\n   - To assist those new to large model applications, this project provides a one-click way to start the Dify service for a quick experience.\n   - Local access address for Dify: http://localhost:18000. You need to register your own account and password.\n   ```bash\n   # Start the built-in Dify service\n   cd docker/dify/docker\n   docker-compose up -d\n   ```\n\n2. **Configure Dify**\n- Add the Ollama large model provider in Dify and configure the Qwen2.5 and DeepSeek R1 models.\n- Import the docker/dify/数据问答_v1.1.2_deepseek.yml canvas from the project root directory.\n- Copy the API key corresponding to the canvas for use in the following steps.\n- After importing the canvas, manually select the locally configured large model and save the settings.\n  \n![image](./images/llm-setting.png)\n![image](./images/llm-setting-deepseek.png)\n![image](./images/import-convas.png)\n![image](./images/convas-api-key.png)\n\n## 🚀 Quick Start\nThe specific steps are as follows:\n\n1. Clone the repository\n```bash\ngit clone https://github.com/apconw/sanic-web.git\n```\n\n2. 
Start the service\n\n- Modify the environment variables starting with DIFY_ in the chat-service section of docker-compose.\n- Set DIFY_DATABASE_QA_API_KEY to the API key obtained from the Dify canvas.\n\n```bash\n# Start the front-end, back-end services, and middleware\ncd docker\ndocker compose up -d\n```\n3. Configure MinIO\n   \n- Access the MinIO service at http://localhost:19001/. The default account is admin, and the password is 12345678.\n- Create a bucket named filedata and configure the Access Key.\n- Modify the environment variables starting with MINIO_ in the chat-service section of docker-compose and restart the service.\n\n```bash\n# Restart the front-end, back-end services, and middleware\ncd docker\ndocker compose up -d\n```\n\n4. Initialize the data\n\n```bash\n# Install the dependency package\npip install pymysql\n\n# For Mac or Linux users\ncd docker\n./init.sh\n\n# For Windows users\ncd common\npython initialize_mysql.py\n```\n\n5. Access the application\n- Front-end service: http://localhost:8081\n\n\n## 🛠️ Local Development\n- Clone the repository\n- Deploy large models: Refer to the Large Model Deployment section above to install Ollama and deploy the Qwen2.5 and DeepSeek R1 models.\n- Configure Dify: Refer to the Dify Environment Configuration section above to obtain the API key from the Dify canvas and modify DIFY_DATABASE_QA_API_KEY in the .env.dev file.\n- Configure MinIO: Modify the MinIO-related keys in the .env.dev file.\n- Install dependencies and start the services\n\n1. **Back-end dependencies**\n- Install Poetry: [Refer to the official Poetry documentation](https://python-poetry.org/docs/)\n\n```bash\n# Install Poetry\npip install poetry\n\n# Add a domestic (China) mirror as the primary package source\npoetry source add --priority=primary mirrors https://pypi.tuna.tsinghua.edu.cn/simple/\n\n# Install dependencies at the project root\npoetry install --no-root\n```\n\n2. 
**Install middleware**\n```bash\ncd docker\ndocker compose up -d mysql minio\n```\n3. **Configure MinIO**\n- Access the MinIO service at http://localhost:19001/. The default account is admin, and the password is 12345678.\n- Create a bucket named filedata and configure the Access Key.\n- Modify the MinIO-related keys in the .env.dev file.\n\n4. **Initialize the data**\n- If you are using a local MySQL environment, you need to modify the database connection information in the initialize_mysql source code.\n```bash\n# For Mac or Linux users\ncd docker\n./init.sh\n\n# For Windows users\ncd common\npython initialize_mysql.py\n```\n\n5. **Front-end dependencies**\n- The front-end is based on the open-source project [chatgpt-vue3-light-mvp](https://github.com/pdsuwwz/chatgpt-vue3-light-mvp).\n\n```bash\n# Install front-end dependencies and start the service\ncd web\n\n# Install pnpm globally\nnpm install -g pnpm\n\n# Install dependencies\npnpm i\n\n# Start the service\npnpm dev\n```\n\n6. **Start the back-end service**\n```bash\n# Start the back-end service\npython serv.py\n```\n\n7. **Access the service**\n- Front-end service: http://localhost:2048\n\n## 🐳 Build Images\n- Execute the following commands to build the images:\n```bash\n# Build the front-end image\nmake web-build\n\n# Build the back-end image\nmake server-build\n```\n\n## 🌹 Support\nIf you find this project useful, please give it a star on GitHub by clicking the [`Star`](https://github.com/apconw/sanic-web) button. Your support is our motivation to keep improving. Thank you! 
^_^\n\n\n## ⭐ Star History\n [![Star History Chart](https://api.star-history.com/svg?repos=apconw/sanic-web\u0026type=Date)](https://star-history.com/#apconw/sanic-web\u0026Date)\n\n\n## QA Community\n- Join our large model application community to discuss and share experiences.\n- Follow the official WeChat account and click the \"WeChat Group\" menu to join.\n\n|            WeChat Group             |\n| :---------------------------------: |\n| ![image](./images/wchat-search.png) |\n\n\n## License\nMIT License | Copyright © 2024-PRESENT AiAdventurer","funding_links":[],"categories":["Chatbots","📚 Projects (1974 total)","JavaScript"],"sub_categories":["MCP Servers"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fapconw%2Fsanic-web","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fapconw%2Fsanic-web","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fapconw%2Fsanic-web/lists"}