{"id":13584813,"url":"https://github.com/modelscope/agentscope","last_synced_at":"2025-05-14T11:08:27.704Z","repository":{"id":216767016,"uuid":"742244656","full_name":"modelscope/agentscope","owner":"modelscope","description":"Start building LLM-empowered multi-agent applications in an easier way.","archived":false,"fork":false,"pushed_at":"2025-05-01T09:37:33.000Z","size":302356,"stargazers_count":7270,"open_issues_count":60,"forks_count":414,"subscribers_count":37,"default_branch":"main","last_synced_at":"2025-05-07T10:52:51.566Z","etag":null,"topics":["agent","chatbot","distributed-agents","drag-and-drop","gpt-4","gpt-4o","large-language-models","llama3","llm","llm-agent","mcp","multi-agent","multi-modal"],"latest_commit_sha":null,"homepage":"https://doc.agentscope.io/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/modelscope.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":"docs/ROADMAP.md","authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-01-12T03:41:59.000Z","updated_at":"2025-05-07T09:34:38.000Z","dependencies_parsed_at":"2024-01-12T17:59:01.515Z","dependency_job_id":"13d25567-8d6b-4a7b-a905-1aadc4bf499d","html_url":"https://github.com/modelscope/agentscope","commit_stats":null,"previous_names":["alibaba/agentscope","modelscope/agentscope"],"tags_count":10,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/modelscope%2Fagentscope","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/modelscope%2Fagentscope/tags","releases_url":"https://repos.ecos
yste.ms/api/v1/hosts/GitHub/repositories/modelscope%2Fagentscope/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/modelscope%2Fagentscope/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/modelscope","download_url":"https://codeload.github.com/modelscope/agentscope/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254129481,"owners_count":22019628,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["agent","chatbot","distributed-agents","drag-and-drop","gpt-4","gpt-4o","large-language-models","llama3","llm","llm-agent","mcp","multi-agent","multi-modal"],"created_at":"2024-08-01T15:04:32.252Z","updated_at":"2025-05-14T11:08:27.679Z","avatar_url":"https://github.com/modelscope.png","language":"Python","readme":"English | [**中文**](https://github.com/modelscope/agentscope/blob/main/README_ZH.md) | [**日本語**](https://github.com/modelscope/agentscope/blob/main/README_JA.md)\n\n\u003ca href=\"https://trendshift.io/repositories/10079\" target=\"_blank\"\u003e\u003cimg src=\"https://trendshift.io/api/badge/repositories/10079\" alt=\"modelscope%2Fagentscope | Trendshift\" style=\"width: 250px; height: 55px;\" width=\"250\" height=\"55\"/\u003e\u003c/a\u003e\n\n# AgentScope\n\n\u003ch1 align=\"left\"\u003e\n\u003cimg src=\"https://img.alicdn.com/imgextra/i2/O1CN01cdjhVE1wwt5Auv7bY_!!6000000006373-0-tps-1792-1024.jpg\" width=\"600\" alt=\"agentscope-logo\"\u003e\n\u003c/h1\u003e\n\nStart building LLM-empowered multi-agent applications in an easier 
way.\n\n[![](https://img.shields.io/badge/cs.MA-2402.14034-B31C1C?logo=arxiv\u0026logoColor=B31C1C)](https://arxiv.org/abs/2402.14034)\n[![](https://img.shields.io/badge/python-3.9+-blue)](https://pypi.org/project/agentscope/)\n[![](https://img.shields.io/badge/pypi-v0.1.3-blue?logo=pypi)](https://pypi.org/project/agentscope/)\n[![](https://img.shields.io/badge/Docs-English%7C%E4%B8%AD%E6%96%87-blue?logo=markdown)](https://modelscope.github.io/agentscope/#welcome-to-agentscope-tutorial-hub)\n[![](https://img.shields.io/badge/Docs-API_Reference-blue?logo=markdown)](https://modelscope.github.io/agentscope/)\n[![](https://img.shields.io/badge/Docs-Roadmap-blue?logo=markdown)](https://github.com/modelscope/agentscope/blob/main/docs/ROADMAP.md)\n\n[![](https://img.shields.io/badge/Drag_and_drop_UI-WorkStation-blue?logo=html5\u0026logoColor=green\u0026color=dark-green)](https://agentscope.io/)\n[![](https://img.shields.io/badge/license-Apache--2.0-black)](./LICENSE)\n[![](https://img.shields.io/badge/Contribute-Welcome-green)](https://modelscope.github.io/agentscope/tutorial/contribute.html)\n\n- If you find our work helpful, please cite [our paper](https://arxiv.org/abs/2402.14034).\n\n- Visit our [workstation](https://agentscope.io/) to build multi-agent applications via drag and drop.\n\n\u003ch5 align=\"left\"\u003e\n  \u003ca href=\"https://agentscope.io\" target=\"_blank\"\u003e\n    \u003cimg src=\"https://img.alicdn.com/imgextra/i1/O1CN01RXAVVn1zUtjXVvuqS_!!6000000006718-1-tps-3116-1852.gif\" width=\"500\" alt=\"agentscope-workstation\" style=\"box-shadow: 5px 10px 18px #888888;\"\u003e\n  \u003c/a\u003e\n\u003c/h5\u003e\n\n- You are welcome to join our community on\n\n| [Discord](https://discord.gg/eYMpfnkG8h) | DingTalk |\n|---|---|\n| \u003cimg src=\"https://gw.alicdn.com/imgextra/i1/O1CN01hhD1mu1Dd3BWVUvxN_!!6000000000238-2-tps-400-400.png\" width=\"100\" height=\"100\"\u003e | \u003cimg src=\"https://img.alicdn.com/imgextra/i1/O1CN01LxzZha1thpIN2cc2E_!!6000000005934-2-tps-497-477.png\" width=\"100\" height=\"100\"\u003e |\n\n----\n\n## News\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-04-27]** A new AgentScope Studio is online now. See [here](https://doc.agentscope.io/build_tutorial/visual.html) for more details.\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-03-21]** AgentScope now supports hook functions. Refer to our [tutorial](https://doc.agentscope.io/build_tutorial/hook.html) for more details.\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-03-20]** AgentScope now supports [MCP Server](https://github.com/modelcontextprotocol/servers)! You can learn how to use it by following this [tutorial](https://doc.agentscope.io/build_tutorial/MCP.html).\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-03-19]** AgentScope now supports the tools API. Refer to our [tutorial](https://doc.agentscope.io/build_tutorial/tool.html).\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-03-05]** Our [multi-source RAG Application](applications/multisource_rag_app/README.md) (the chatbot used in our Q\u0026A DingTalk group) is open-source now!\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-02-24]** The [Chinese version tutorial](https://doc.agentscope.io/zh_CN) is online now!\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-02-13]** We have released the [technical report](https://doc.agentscope.io/tutorial/swe.html) of our solution in [SWE-Bench(Verified)](https://www.swebench.com/)!\n\n- \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e**[2025-02-07]** 🎉 AgentScope has achieved a **63.4% resolve rate** in [SWE-Bench(Verified)](https://www.swebench.com/). More details about our solution are coming soon!\n\n- **[2025-01-04]** AgentScope supports the Anthropic API now.\n\n- **[2024-12-12]** We have updated the [roadmap of AgentScope](https://github.com/modelscope/agentscope/blob/main/docs/ROADMAP.md).\n\n- **[2024-09-06]** AgentScope version 0.1.0 is released.\n\n- **[2024-09-03]** AgentScope supports **Web Browser Control** now! Refer to our [example](https://github.com/modelscope/agentscope/tree/main/examples/conversation_with_web_browser_agent) for more details.\n\n\u003ch5 align=\"left\"\u003e\n\u003cvideo src=\"https://github.com/user-attachments/assets/6d03caab-6193-4ac6-8b1c-36f152ec02ec\" width=\"45%\" alt=\"web browser control\" controls\u003e\u003c/video\u003e\n\u003c/h5\u003e\n\nFor older news and updates, check our \u003ca href=\"https://github.com/modelscope/agentscope/blob/main/docs/news_en.md\"\u003eOld News\u003c/a\u003e.\n\n---\n\n## What's AgentScope?\n\nAgentScope is an innovative multi-agent platform designed to empower developers\nto build multi-agent applications with large-scale models.\nIt features three high-level capabilities:\n\n- 🤝 **Easy-to-Use**: Designed for developers, with [rich components](https://doc.agentscope.io/build_tutorial/tool.html#),\n[comprehensive documentation](https://doc.agentscope.io/), and broad compatibility. In addition, [AgentScope Workstation](https://agentscope.io/) provides a *drag-and-drop programming platform* and a *copilot* for AgentScope beginners!\n\n- ✅ **High Robustness**: Supporting customized fault-tolerance controls and\nretry mechanisms to enhance application stability.\n\n- 🚀 **Actor-Based Distribution**: Building distributed multi-agent\napplications in a centralized programming manner for streamlined development.\n\n**Supported Model Libraries**\n\nAgentScope provides a set of `ModelWrapper` classes to support both local model\nservices and third-party model APIs.\n\n| API | Task | Model Wrapper | Configuration | Some Supported Models
     |\n|------------------------|-----------------|---------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------|\n| OpenAI API             | Chat            | [`OpenAIChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/openai_model.py)                 | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/openai_chat_template.json)                 | gpt-4o, gpt-4, gpt-3.5-turbo, ...                               |\n|                        | Embedding       | [`OpenAIEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/openai_model.py)            | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/openai_embedding_template.json)             | text-embedding-ada-002, ...                                     |\n|                        | DALL·E          | [`OpenAIDALLEWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/openai_model.py)                | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/openai_dall_e_template.json)                | dall-e-2, dall-e-3                                              |\n| DashScope API          | Chat            | [`DashScopeChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py)           | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/dashscope_chat_template.json)            | qwen-plus, qwen-max, ...                                   
     |\n|                        | Image Synthesis | [`DashScopeImageSynthesisWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py) | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/dashscope_image_synthesis_template.json) | wanx-v1                                                         |\n|                        | Text Embedding  | [`DashScopeTextEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py)  | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/dashscope_text_embedding_template.json)  | text-embedding-v1, text-embedding-v2, ...                       |\n|                        | Multimodal      | [`DashScopeMultiModalWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/dashscope_model.py)     | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/dashscope_multimodal_template.json)      | qwen-vl-max, qwen-vl-chat-v1, qwen-audio-chat                   |\n| Gemini API             | Chat            | [`GeminiChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py)                 | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/gemini_chat_template.json)                  | gemini-pro, ...                                                 |\n|                        | Embedding       | [`GeminiEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/gemini_model.py)            | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/gemini_embedding_template.json)             | models/embedding-001, ...                                       
|\n| ZhipuAI API            | Chat            | [`ZhipuAIChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py)                 | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/zhipu_chat_template.json)                   | glm-4, ...                                                      |\n|                        | Embedding       | [`ZhipuAIEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/zhipu_model.py)            | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/zhipu_embedding_template.json)              | embedding-2, ...                                                |\n| ollama                 | Chat            | [`OllamaChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py)                 | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_chat_template.json)                  | llama3, llama2, Mistral, ...                                    |\n|                        | Embedding       | [`OllamaEmbeddingWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py)            | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_embedding_template.json)             | llama2, Mistral, ...                                            |\n|                        | Generation      | [`OllamaGenerationWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/ollama_model.py)           | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/ollama_generate_template.json)              | llama2, Mistral, ...                                            
|\n| LiteLLM API            | Chat            | [`LiteLLMChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/litellm_model.py)               | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/litellm_chat_template.json)                | [models supported by litellm](https://docs.litellm.ai/docs/)... |\n| Yi API                 | Chat            | [`YiChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/yi_model.py)                         | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/yi_chat_template.json)                | yi-large, yi-medium, ...                                        |\n| Post Request based API | -               | [`PostAPIModelWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/post_model.py)                 | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/postapi_model_config_template.json)   | -                                                               |\n| Anthropic API          | Chat            | [`AnthropicChatWrapper`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/models/anthropic_model.py)           | [template](https://github.com/modelscope/agentscope/blob/main/examples/model_configs_template/anthropic_chat_model_config_template.json) | claude-3-5-sonnet-20241022, ... 
|\n\n**Supported Local Model Deployment**\n\nAgentScope enables developers to rapidly deploy local model services using\nthe following libraries.\n\n- [ollama (CPU inference)](https://github.com/modelscope/agentscope/blob/main/scripts/README.md#ollama)\n- [Flask + Transformers](https://github.com/modelscope/agentscope/blob/main/scripts/README.md#with-transformers-library)\n- [Flask + ModelScope](https://github.com/modelscope/agentscope/blob/main/scripts/README.md#with-modelscope-library)\n- [FastChat](https://github.com/modelscope/agentscope/blob/main/scripts/README.md#fastchat)\n- [vllm](https://github.com/modelscope/agentscope/blob/main/scripts/README.md#vllm)\n\n**Supported Services**\n\n- Web Search\n- Data Query\n- Retrieval\n- Code Execution\n- File Operation\n- Text Processing\n- Multi Modality\n- Wikipedia Search and Retrieval\n- TripAdvisor Search\n- Web Browser Control\n\n**Example Applications**\n\n- Model\n  - [Using Llama3 in AgentScope](https://github.com/modelscope/agentscope/blob/main/examples/model_llama3)\n\n- Conversation\n  - [Basic Conversation](https://github.com/modelscope/agentscope/blob/main/examples/conversation_basic)\n  - [Autonomous Conversation with Mentions](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_mentions)\n  - [Self-Organizing Conversation](https://github.com/modelscope/agentscope/blob/main/examples/conversation_self_organizing)\n  - [Basic Conversation with LangChain library](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_langchain)\n  - [Conversation with ReAct Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_react_agent)\n  - [Conversation in Natural Language to Query SQL](https://github.com/modelscope/agentscope/blob/main/examples/conversation_nl2sql/)\n  - [Conversation with RAG Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_RAG_agents)\n  - [Conversation with 
gpt-4o](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_gpt-4o)\n  - [Conversation with Software Engineering Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_swe-agent/)\n  - [Conversation with Customized Tools](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_customized_services/)\n  - \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e[Mixture of Agents Algorithm](https://github.com/modelscope/agentscope/blob/main/examples/conversation_mixture_of_agents/)\n  - \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e[Conversation in Stream Mode](https://github.com/modelscope/agentscope/blob/main/examples/conversation_in_stream_mode/)\n  - \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e[Conversation with CodeAct Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_codeact_agent/)\n  - \u003cimg src=\"https://img.alicdn.com/imgextra/i3/O1CN01SFL0Gu26nrQBFKXFR_!!6000000007707-2-tps-500-500.png\" alt=\"new\" width=\"30\" height=\"30\"/\u003e[Conversation with Router Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_router_agent/)\n\n- Game\n  - [Gomoku](https://github.com/modelscope/agentscope/blob/main/examples/game_gomoku)\n  - [Werewolf](https://github.com/modelscope/agentscope/blob/main/examples/game_werewolf)\n\n- Distribution\n  - [Distributed Conversation](https://github.com/modelscope/agentscope/blob/main/examples/distributed_conversation)\n  - [Distributed Debate](https://github.com/modelscope/agentscope/blob/main/examples/distributed_debate)\n  - [Distributed Parallel 
Optimization](https://github.com/modelscope/agentscope/blob/main/examples/distributed_parallel_optimization)\n  - [Distributed Large Scale Simulation](https://github.com/modelscope/agentscope/blob/main/examples/distributed_simulation)\n\nMore models, services and examples are coming soon!\n\n## Installation\n\nAgentScope requires **Python 3.9** or higher.\n\n***Note: This project is under active development; it is recommended to\ninstall AgentScope from source.***\n\n### From source\n\n- Install AgentScope in editable mode:\n\n```bash\n# Pull the source code from GitHub\ngit clone https://github.com/modelscope/agentscope.git\n\n# Install the package in editable mode\ncd agentscope\npip install -e .\n```\n\n### Using pip\n\n- Install AgentScope with pip:\n\n```bash\npip install agentscope\n```\n\n### Extra Dependencies\n\nTo support different deployment scenarios, AgentScope provides several\noptional dependencies. For the full list of optional dependencies, refer to the\n[tutorial](https://doc.agentscope.io/build_tutorial/quickstart.html).\nTaking distribution mode as an example, you can install its dependencies\nas follows:\n\n#### On Windows\n\n```bash\n# From source\npip install -e .[distribute]\n# From PyPI\npip install agentscope[distribute]\n```\n\n#### On Mac \u0026 Linux\n\n```bash\n# From source\npip install -e .\\[distribute\\]\n# From PyPI\npip install agentscope\\[distribute\\]\n```\n\n## Quick Start\n\n### Configuration\n\nIn AgentScope, model deployment and invocation are decoupled by\n`ModelWrapper`.\n\nTo use these model wrappers, you need to prepare a model config file as\nfollows:\n\n```python\nmodel_config = {\n    # The identifiers of your config and the model wrapper to use\n    \"config_name\": \"{your_config_name}\",          # The name to identify the config\n    \"model_type\": \"{model_type}\",                 # The type to identify the model wrapper\n\n    # Detailed parameters used to initialize the model wrapper\n    # ...\n}\n```\n\nTaking OpenAI Chat API as an example, the model configuration is as follows:\n\n```python\nopenai_model_config = {\n    \"config_name\": \"my_openai_config\",             # The name to identify the config\n    \"model_type\": \"openai_chat\",                   # The type to identify the model wrapper\n\n    # Detailed parameters used to initialize the model wrapper\n    \"model_name\": \"gpt-4\",                         # The model used in the OpenAI API, e.g. gpt-4, gpt-3.5-turbo, etc.\n    \"api_key\": \"xxx\",                              # The API key for the OpenAI API. If not set, the env\n                                                   # variable OPENAI_API_KEY will be used.\n    \"organization\": \"xxx\",                         # The organization for the OpenAI API. If not set, the env\n                                                   # variable OPENAI_ORGANIZATION will be used.\n}\n```\n\nMore details about how to set up local model services and prepare model\nconfigurations are in our\n[tutorial](https://modelscope.github.io/agentscope/index.html#welcome-to-agentscope-tutorial-hub).\n\n### Create Agents\n\nCreate built-in user and assistant agents as follows:\n\n```python\nfrom agentscope.agents import DialogAgent, UserAgent\nimport agentscope\n\n# Load model configs\nagentscope.init(model_configs=\"./model_configs.json\")\n\n# Create a dialog agent and a user agent\ndialog_agent = DialogAgent(name=\"assistant\",\n                           model_config_name=\"my_openai_config\")\nuser_agent = UserAgent()\n```\n\n### Construct Conversation\n\nIn AgentScope, a **message** is the bridge among agents. It is a\n**dict** that contains two required fields, `name` and `content`, and an\noptional field `url` pointing to local files (image, video or audio) or a website.\n\n```python\nfrom agentscope.message import Msg\n\nx = Msg(name=\"Alice\", content=\"Hi!\")\nx = Msg(\"Bob\", \"What about this picture I took?\", url=\"/path/to/picture.jpg\")\n```\n\nStart a conversation between two agents (e.g. dialog_agent and user_agent)\nwith the following code:\n\n```python\nx = None\nwhile True:\n    x = dialog_agent(x)\n    x = user_agent(x)\n    if x.content == \"exit\":  # the user inputs \"exit\" to end the conversation\n        break\n```\n\n### AgentScope Studio\n\nAgentScope provides an easy-to-use runtime user interface capable of\ndisplaying multimodal output on the front end, including text, images,\naudio and video.\n\nRefer to our [tutorial](https://doc.agentscope.io/build_tutorial/visual.html) for more details.\n\n\u003ch5 align=\"center\"\u003e\n\u003cimg src=\"https://img.alicdn.com/imgextra/i4/O1CN015kjnkd1xdwJoNxqLZ_!!6000000006467-0-tps-3452-1984.jpg\" width=\"600\" alt=\"agentscope-logo\"\u003e\n\u003c/h5\u003e\n\n## License\n\nAgentScope is released under the Apache License 2.0.\n\n## Contributing\n\nContributions are always welcome!\n\nCompared with the official version, the developer version includes additional\npre-commit hooks that perform checks before each commit:\n\n```bash\n# For Windows\npip install -e .[dev]\n# For Mac\npip install -e .\\[dev\\]\n\n# Install pre-commit hooks\npre-commit install\n```\n\nPlease refer to our [Contribution Guide](https://modelscope.github.io/agentscope/en/tutorial/302-contribute.html) for more details.\n\n## Publications\n\nIf you find our work helpful for your research or application, please cite our papers.\n\n1. [AgentScope: A Flexible yet Robust Multi-Agent Platform](https://arxiv.org/abs/2402.14034)\n\n    ```\n    @article{agentscope,\n        author  = {Dawei Gao and\n                   Zitao Li and\n                   Xuchen Pan and\n                   Weirui Kuang and\n                   Zhijian Ma and\n                   Bingchen Qian and\n                   Fei Wei and\n                   Wenhao Zhang and\n                   Yuexiang Xie and\n                   Daoyuan Chen and\n                   Liuyi Yao and\n                   Hongyi Peng and\n                   Ze Yu Zhang and\n                   Lin Zhu and\n                   Chen Cheng and\n                   Hongzhu Shi and\n                   Yaliang Li and\n                   Bolin Ding and\n                   Jingren Zhou},\n        title   = {AgentScope: A Flexible yet Robust Multi-Agent Platform},\n        journal = {CoRR},\n        volume  = {abs/2402.14034},\n        year    = {2024},\n    }\n    ```\n\n## Contributors ✨\n\nThanks to all of our contributors:\n\n\u003ca href=\"https://github.com/modelscope/agentscope/graphs/contributors\"\u003e\n  \u003cimg src=\"https://contrib.rocks/image?repo=modelscope/agentscope\u0026max=999\u0026columns=12\u0026anon=1\" /\u003e\n\u003c/a\u003e","funding_links":[],"categories":["Agentic Frameworks","Python","\u003cspan id=\"game\"\u003eGame (World Model \u0026 Agent)\u003c/span\u003e","LLM Application / RAG","A01_文本生成_文本对话","Image Generation \u0026 Editing","Frameworks","Agentic Framework","智能体 Agents","Agents","🧰 Frameworks for Agentic AI","其他LLM框架","🧱 Agent Frameworks","Repos","Other LLM Frameworks","Table of Open-Source AI Agents Projects","AI Agent Frameworks \u0026 SDKs","MCP Servers","Building","Multi-agent frameworks","Multi-Agent Orchestration"],"sub_categories":["\u003cspan id=\"tool\"\u003eLLM (LLM \u0026 Tool)\u003c/span\u003e","大语言对话模型及数据","5. 
**Multi-Agent Systems**","文章","Multi-Agent Orchestration","Videos Playlists","Multi-Agent Collaboration Systems","Agent Platforms","Frameworks"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmodelscope%2Fagentscope","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmodelscope%2Fagentscope","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmodelscope%2Fagentscope/lists"}