https://github.com/open-webui/mcpo
A simple, secure MCP-to-OpenAPI proxy server
- Host: GitHub
- URL: https://github.com/open-webui/mcpo
- Owner: open-webui
- License: MIT
- Created: 2025-03-30T10:03:03.000Z (16 days ago)
- Default Branch: main
- Last Pushed: 2025-04-06T22:05:00.000Z (9 days ago)
- Last Synced: 2025-04-06T23:19:12.318Z (9 days ago)
- Topics: mcp, mcp-server, mcp-to-openapi, open-webui, openapi
- Language: Python
- Homepage: https://docs.openwebui.com/openapi-servers/mcp
- Size: 62.5 KB
- Stars: 634
- Watchers: 12
- Forks: 56
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-LLM-resourses - mcpo
README
# ⚡️ mcpo
Expose any MCP tool as an OpenAPI-compatible HTTP server—instantly.
mcpo is a dead-simple proxy that takes an MCP server command and makes it accessible via standard RESTful OpenAPI, so your tools "just work" with LLM agents and apps expecting OpenAPI servers.
No custom protocol. No glue code. No hassle.
## 🤔 Why Use mcpo Instead of Native MCP?
MCP servers usually speak over raw stdio, which is:
- 🔓 Inherently insecure
- ❌ Incompatible with most tools
- 🧩 Missing standard features like docs, auth, error handling, etc.
mcpo solves all of that—without extra effort:
- ✅ Works instantly with OpenAPI tools, SDKs, and UIs
- 🛡 Adds security, stability, and scalability using trusted web standards
- 🧠 Auto-generates interactive docs for every tool, no config needed
- 🔌 Uses pure HTTP—no sockets, no glue code, no surprises
What feels like "one more step" is really fewer steps with better outcomes.
mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.
## 🚀 Quick Usage
We recommend using uv for lightning-fast startup and zero config.
```bash
uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
```
Or, if you’re using Python:
```bash
pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
```
Example:
```bash
uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York
```
That’s it. Your MCP tool is now available at http://localhost:8000 with a generated OpenAPI schema — test it live at [http://localhost:8000/docs](http://localhost:8000/docs).
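Once the proxy is up, you can call the generated endpoints with plain HTTP. The snippet below is a minimal sketch for the time example above: the operation path (`/get_current_time` here) and the request body are assumptions based on the MCP tool's name and input schema, and the Bearer header assumes the API key from `--api-key` is sent as a Bearer token — check `/docs` for the routes your server actually exposes.
```bash
# Sketch: call the endpoint generated for the time server's tool.
# The tool name (get_current_time), payload, and Bearer auth are
# assumptions -- consult http://localhost:8000/docs for real routes.
curl -X POST http://localhost:8000/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```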
🤝 **To integrate with Open WebUI after launching the server, check our [docs](https://docs.openwebui.com/openapi-servers/open-webui/).**
### 🔄 Using a Config File
You can serve multiple MCP tools via a single config file that follows the [Claude Desktop](https://modelcontextprotocol.io/quickstart/user) format:
Start via:
```bash
mcpo --config /path/to/config.json
```
Example config.json:
```json
{
"mcpServers": {
"memory": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-memory"]
},
"time": {
"command": "uvx",
"args": ["mcp-server-time", "--local-timezone=America/New_York"]
}
}
}
```
Each tool will be accessible under its own unique route, e.g.:
- http://localhost:8000/memory
- http://localhost:8000/time
Each route gets a dedicated OpenAPI schema and proxy handler. Access the full schema UI at `http://localhost:8000/<tool>/docs` (e.g. /memory/docs, /time/docs).
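Calling a tool then works the same way as in single-server mode, just with the server's route prefix in front. A hedged sketch against the example config above (the tool name and payload are assumptions; add an `Authorization: Bearer` header only if you also started mcpo with `--api-key`):
```bash
# Sketch: call the time server mounted under /time.
# get_current_time and its payload are assumptions -- see /time/docs
# for the operations actually generated from the MCP tool schema.
curl -X POST http://localhost:8000/time/get_current_time \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```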
## 🔧 Requirements
- Python 3.8+
- uv (optional, but highly recommended for performance + packaging)
## 🪪 License
MIT
## 🤝 Contributing
We welcome and strongly encourage contributions from the community!
Whether you're fixing a bug, adding features, improving documentation, or just sharing ideas—your input is incredibly valuable and helps make mcpo better for everyone.
Getting started is easy:
- Fork the repo
- Create a new branch
- Make your changes
- Open a pull request
Not sure where to start? Feel free to open an issue or ask a question—we’re happy to help you find a good first task.
✨ Let's build the future of interoperable AI tooling together!