{"id":29840812,"url":"https://github.com/subbyte/mcp-streamable-http-quickstart","last_synced_at":"2025-07-29T14:32:54.894Z","repository":{"id":297133958,"uuid":"995751429","full_name":"subbyte/mcp-streamable-http-quickstart","owner":"subbyte","description":"SSE and Streamable HTTP Extension of MCP Quickstart Weather App","archived":false,"fork":false,"pushed_at":"2025-06-04T03:17:12.000Z","size":18,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2025-06-04T08:41:09.100Z","etag":null,"topics":["mcp","openai-api","sse","streamable-http"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/subbyte.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-06-04T00:46:28.000Z","updated_at":"2025-06-04T03:19:30.000Z","dependencies_parsed_at":"2025-06-04T08:41:12.664Z","dependency_job_id":"c28a5723-b95a-459f-8e93-657dd73f8567","html_url":"https://github.com/subbyte/mcp-streamable-http-quickstart","commit_stats":null,"previous_names":["subbyte/mcp-streamable-http-quickstart"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/subbyte/mcp-streamable-http-quickstart","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/subbyte%2Fmcp-streamable-http-quickstart","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/subbyte%2Fmcp-streamable-http-quickstart/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositorie
s/subbyte%2Fmcp-streamable-http-quickstart/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/subbyte%2Fmcp-streamable-http-quickstart/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/subbyte","download_url":"https://codeload.github.com/subbyte/mcp-streamable-http-quickstart/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/subbyte%2Fmcp-streamable-http-quickstart/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":267703081,"owners_count":24130464,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-29T02:00:12.549Z","response_time":2574,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["mcp","openai-api","sse","streamable-http"],"created_at":"2025-07-29T14:32:43.095Z","updated_at":"2025-07-29T14:32:54.863Z","avatar_url":"https://github.com/subbyte.png","language":"Python","readme":"# MCP Quickstart Weather App Extensions\n\nAfter reading the official MCP quickstart examples on the MCP [server](https://modelcontextprotocol.io/quickstart/server) and [client](https://modelcontextprotocol.io/quickstart/client), you may wonder:\n- How to upgrade the simple stdio-based example to an HTTP server/client for real-world use?\n  - The [latest MCP document (June 2025)](https://modelcontextprotocol.io/docs/concepts/transports) lists SSE as the default HTTP transport 
protocol\n  - The [latest MCP specification (March 2025)](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports) further upgrades SSE to the Streamable HTTP protocol\n- How to replace the Anthropic API with the OpenAI API, which is widely used by open-source inference servers like [vllm](https://docs.vllm.ai/en/latest/)?\n\n## Goal of This Repository\n\n1. Patch the official MCP quickstart weather app to use:\n    - SSE or Streamable HTTP as the transport protocol between client and server\n    - The OpenAI API for LLM calls\n2. Explain each modification so readers can understand these extensions\n\n## How to Run\n\n1. Install [uv](https://docs.astral.sh)\n2. Choose a protocol, either `sse` or `streamable-http`\n3. Open two terminals on one host (the HTTP server is hardcoded to localhost in this example)\n4. Term 1: run the server\n    - Go to the server directory `weather-server-python`\n    - Start the server: `uv run server PROTOCOL_OF_YOUR_CHOICE`\n5. Term 2: run the client\n    - Go to the client directory `mcp-client-python`\n    - Set up environment variables for the OpenAI endpoint and API key\n        - `export OPENAI_BASE_URL=http://xxx/v1`\n        - `export OPENAI_API_KEY=yyy`\n    - Start the client: `uv run client PROTOCOL_OF_YOUR_CHOICE`\n\n## Explanation of Modifications\n\n### Use SSE/Streamable-HTTP Instead of Stdio for Transport Protocol\n\n- Server: use `mcp.run(sys.argv[1])` instead of `mcp.run('stdio')`, where `sys.argv[1]` is either `sse` or `streamable-http`\n    - SSE protocol: the server's main endpoint is `http://localhost:8000/sse`\n    - Streamable HTTP protocol: the server's only endpoint is `http://localhost:8000/mcp`\n- Client: load `rs` (read stream) and `ws` (write stream) from `sse_client` or `streamablehttp_client` instead of `stdio_client` in the original MCP quickstart example\n    - [sse_client awaited return](https://github.com/modelcontextprotocol/python-sdk/blob/main/src/mcp/client/sse.py#L155)\n    - [streamablehttp_client awaited 
return](https://github.com/modelcontextprotocol/python-sdk/blob/main/src/mcp/client/streamable_http.py#L492)\n\n### Swap the Anthropic API for the OpenAI API in LLM Calls\n\n- Replace the LLM call function\n    - `self.anthropic.messages.create()` -\u003e `self.client.chat.completions.create()`\n    - Use a dynamic model ID for vllm\n    - The `tools` argument uses slightly different formatting\n- Replace the LLM response object handling\n    - `response` -\u003e `response.choices[0].message`\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsubbyte%2Fmcp-streamable-http-quickstart","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fsubbyte%2Fmcp-streamable-http-quickstart","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsubbyte%2Fmcp-streamable-http-quickstart/lists"}