https://github.com/conechoai/nchan-mcp-transport
The best way to deploy an MCP server: a high-performance WebSocket/SSE transport layer and gateway for Anthropic's MCP (Model Context Protocol), powered by Nginx, Nchan, and FastAPI.
- Host: GitHub
- URL: https://github.com/conechoai/nchan-mcp-transport
- Owner: ConechoAI
- License: MIT
- Created: 2025-03-10T06:41:54.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-04-07T05:20:45.000Z (2 months ago)
- Last Synced: 2025-04-30T18:49:11.784Z (about 2 months ago)
- Topics: actions, claude-plugin-backend, fastapi-websocket-mcp, gpts, gpts-actions, mcp-jsonrpc-gateway, mcp-openapi-bridge, mcp-pubsub, mcp-transport, nchan-websocket, real-time-ai-api-gateway, sse-for-anthropic, streamable-http
- Language: TypeScript
- Homepage:
- Size: 296 KB
- Stars: 21
- Watchers: 3
- Forks: 3
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Nchan MCP Transport
> A high-performance WebSocket/SSE transport layer & gateway for **Anthropic's MCP (Model Context Protocol)**, powered by Nginx, Nchan, and FastAPI.
> For building **real-time, scalable AI integrations** with Claude and other LLM agents.

---
## What is this?
**Nchan MCP Transport** provides a **real-time API gateway** for MCP clients (like Claude) to talk to your tools and services over:
- **WebSocket** or **Server-Sent Events (SSE)**
- **Streamable HTTP** compatible
- Powered by Nginx + Nchan for **low-latency pub/sub**
- Integrates with FastAPI for backend logic and OpenAPI tooling

> Ideal for AI developers building **Claude plugins**, **LLM agents**, or integrating **external APIs** into Claude via MCP.
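As a transport-level illustration, a single MCP JSON-RPC message pushed over SSE is framed using the standard `text/event-stream` format. This is a minimal sketch; the event name and payload shown are illustrative, not this project's actual wire traffic:

```python
import json

def sse_event(payload: dict, event: str = "message") -> str:
    """Frame a JSON payload as one Server-Sent Events block.

    An SSE event is a series of `event:`/`data:` lines followed by a
    blank line that terminates the event.
    """
    data = json.dumps(payload)
    return f"event: {event}\ndata: {data}\n\n"

# An illustrative MCP-style JSON-RPC notification pushed to a subscriber.
frame = sse_event({"jsonrpc": "2.0", "method": "notifications/progress",
                   "params": {"progress": 0.5}})
print(frame)
```

Because each event is delimited by a blank line, a subscriber can reassemble messages from any chunking the proxy applies.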
---
## Key Features
| Feature | Description |
|----------------------------------|-----------------------------------------------------------------------------|
| **Dual Protocol Support** | Seamlessly supports **WebSocket** and **SSE** with automatic detection |
| **High-Performance Pub/Sub** | Built on **Nginx + Nchan**, handles thousands of concurrent connections |
| **MCP-Compliant Transport** | Fully implements the **Model Context Protocol** (JSON-RPC 2.0) |
| **OpenAPI Integration** | Auto-generate MCP tools from any OpenAPI spec |
| **Tool / Resource System** | Use Python decorators to register tools and resources |
| **Asynchronous Execution** | Background task queue + live progress updates via push notifications |
| **Dockerized Deployment** | Easily spin up with Docker Compose |

---
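The messages carried by this transport are plain JSON-RPC 2.0 envelopes. A minimal sketch of building one request (the envelope fields follow the JSON-RPC 2.0 spec; the `id` and params shown are illustrative):

```python
import json

def jsonrpc_request(method: str, params: dict, req_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request as carried over the MCP transport."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params})

raw = jsonrpc_request("initialize", {"capabilities": {}}, 1)
msg = json.loads(raw)
assert msg["jsonrpc"] == "2.0"  # the version tag is required on every message
```

Responses echo the same `id`, which is what lets the gateway correlate an async result with the request that produced it.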
## Why Use This?
MCP lets AI assistants like **Claude** talk to external tools. But:
- Native MCP is **HTTP+SSE**, which struggles with **long tasks**, **network instability**, and **high concurrency**
- WebSockets aren't natively supported by Claude; this project **bridges the gap**
- Server-side logic in pure Python (like `FastMCP`) may **not scale under load**

**Nchan MCP Transport** gives you:
- Web-scale performance (Nginx/Nchan)
- FastAPI-powered backend for tools
- Real-time event delivery to Claude clients
- Plug-and-play OpenAPI to Claude integration

---
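The pub/sub layer maps naturally onto Nchan's location directives. A minimal sketch of such a location block, assuming a channel id taken from the URL path (the path pattern and timeout are illustrative, not this project's actual config):

```nginx
# Publisher/subscriber endpoint: clients subscribe over WebSocket or SSE,
# and the FastAPI backend publishes MCP responses into the same channel.
location ~ ^/mcp/([\w-]+)$ {
    nchan_pubsub;               # allow both publish and subscribe here
    nchan_channel_id $1;        # channel id captured from the URL path
    nchan_message_timeout 30s;  # drop undelivered messages after 30s
}
```

Nginx holds the long-lived connections, so the Python backend only sees short publish requests rather than thousands of open sockets.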
## Quickstart
### 1. Install server SDK
```bash
pip install httmcp
```

### 2. Run demo in Docker
```bash
git clone https://github.com/conechoai/nchan-mcp-transport.git
cd nchan-mcp-transport
docker-compose up -d
```

### 3. Define your tool
```python
# assumes `server` is an HTTMCP server instance
@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."
```

### 4. Expose OpenAPI service (optional)
```python
openapi_server = await OpenAPIMCP.from_openapi("https://example.com/openapi.json", publish_server="http://nchan:80")
app.include_router(openapi_server.router)
```

### 5. One-Click GPTs Actions to MCP Deployment
HTTMCP provides a powerful CLI for instant deployment of GPTs Actions to MCP servers:
```bash
# Installation
pip install httmcp[cli]

# One-click deployment from GPTs Actions OpenAPI spec
python -m httmcp -f gpt_actions_openapi.json -p http://nchan:80
```

---
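A tool defined as in step 3 is invoked by Claude through a JSON-RPC `tools/call` request. A minimal, framework-free sketch of that dispatch step (the registry and request shape are illustrative, not HTTMCP's internals):

```python
import asyncio

# Illustrative tool registry; a decorator like @server.tool() would fill this.
TOOLS = {}

def tool(fn):
    TOOLS[fn.__name__] = fn
    return fn

@tool
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."

async def dispatch(request: dict) -> dict:
    """Route a JSON-RPC `tools/call` request to the registered coroutine."""
    params = request["params"]
    result = await TOOLS[params["name"]](**params["arguments"])
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

response = asyncio.run(dispatch({
    "jsonrpc": "2.0", "id": 7, "method": "tools/call",
    "params": {"name": "search_docs", "arguments": {"query": "nchan"}},
}))
print(response["result"])  # Searching for nchan...
```

In the real deployment the response would be published into the client's Nchan channel rather than returned inline.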
## Use Cases
- Claude plugin server over WebSocket/SSE
- Real-time LLM agent backend (LangChain/AutoGen style)
- Connect Claude to internal APIs (via OpenAPI)
- High-performance tool/service bridge for MCP

---
## Requirements
- Nginx with Nchan module (pre-installed in Docker image)
- Python 3.9+
- Docker / Docker Compose

---
## Tech Stack
- **Nginx + Nchan** – persistent connection management & pub/sub
- **FastAPI** – backend logic & JSON-RPC routing
- **HTTMCP SDK** – full MCP protocol implementation
- **Docker** – deployment ready

---
## Keywords
`mcp transport`, `nchan websocket`, `sse for anthropic`, `mcp jsonrpc gateway`, `claude plugin backend`, `streamable http`, `real-time ai api gateway`, `fastapi websocket mcp`, `mcp pubsub`, `mcp openapi bridge`
---
## Contributing
Pull requests are welcome! File issues if you'd like to help improve:
- Performance
- Deployment
- SDK integrations

---
## License
MIT License