{"id":35797455,"url":"https://github.com/teabranch/open-responses-server","last_synced_at":"2026-04-04T10:51:00.772Z","repository":{"id":288511343,"uuid":"968351717","full_name":"teabranch/open-responses-server","owner":"teabranch","description":"Wraps any OpenAI API interface as Responses with MCPs support so it supports Codex. Adding any missing stateful features. Ollama and Vllm compliant.","archived":false,"fork":false,"pushed_at":"2025-11-05T23:17:23.000Z","size":489,"stargazers_count":153,"open_issues_count":11,"forks_count":20,"subscribers_count":5,"default_branch":"main","last_synced_at":"2026-03-29T14:49:49.990Z","etag":null,"topics":["ai","codex","generative-ai","mcp","mcp-client","openai","openai-api","openai-codex","openai-codex-cli","openai-codex-integration","responses-api"],"latest_commit_sha":null,"homepage":"https://teabranch.github.io/open-responses-server/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/teabranch.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"docs/SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2025-04-18T00:01:35.000Z","updated_at":"2026-03-13T16:08:22.000Z","dependencies_parsed_at":"2025-06-09T22:23:08.211Z","dependency_job_id":"d57128d1-43ef-4a45-9d90-5abc27ae1df6","html_url":"https://github.com/teabranch/open-responses-server","commit_stats":null,"previous_names":["orinachum/openai-to-codex-wrapper","teabranch/openai-to-codex-wrapper","teabranch/openai-responses-server","teabranch/open-responses-server"],"tags_count":23,"template":false,"template_full_name":null,"purl":"pkg:github/teabranch/open-responses-server","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/teabranch%2Fopen-responses-server","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/teabranch%2Fopen-responses-server/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/teabranch%2Fopen-responses-server/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/teabranch%2Fopen-responses-server/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/teabranch","download_url":"https://codeload.github.com/teabranch/open-responses-server/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/teabranch%2Fopen-responses-server/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31397055,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-04T10:20:44.708Z","status":"ssl_error","status_checked_at":"2026-04-04T10:20:06.846Z","response_time":60,"last_error":"SSL_read: unexpected eof while 
reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","codex","generative-ai","mcp","mcp-client","openai","openai-api","openai-codex","openai-codex-cli","openai-codex-integration","responses-api"],"created_at":"2026-01-07T10:00:54.764Z","updated_at":"2026-04-04T10:51:00.768Z","avatar_url":"https://github.com/teabranch.png","language":"Python","readme":"# 🚀 open-responses-server\n\nA plug-and-play server that speaks OpenAI’s Responses API—no matter which AI backend you’re running.  \n\nOllama? vLLM? LiteLLM? Even OpenAI itself?  \nThis server bridges them all to the OpenAI ChatCompletions \u0026 Responses API interface.  \n\nIn plain words:  \n👉 Want to run OpenAI’s Coding Assistant (Codex) or other OpenAI API clients against your own models?  \n👉 Want to experiment with self-hosted LLMs but keep OpenAI’s API compatibility?  \n\nThis project makes it happen.  \nIt handles stateful chat, tool calls, and future features like file search \u0026 code interpreter—all behind a familiar OpenAI API.\n\n⸻\n\n# ✨ Why use this?\n\n✅ Acts as a drop-in replacement for OpenAI’s Responses API.  \n✅ Lets you run any backend AI (Ollama, vLLM, Groq, etc.) with OpenAI-compatible clients.  \n✅ MCP support around both Chat Completions and Responses APIs\n✅ Supports OpenAI’s new Coding Assistant / Codex that requires Responses API.  \n✅ Built for innovators, researchers, OSS enthusiasts.  
⸻

# 🔥 What's in & what's next?

✅ Done
- ✅ Tool call support
- ✅ Manual & pipeline tests
- ✅ Docker image build
- ✅ PyPI release
- ✅ CLI validation
- ✅ Hosted tools: MCP support

📝 Coming soon
- 📝 .env file support
- 📝 Persistent state (not just in-memory)
- 📝 More hosted tools:
  - 📝 Web search: crawl4ai
  - 📝 File upload + search: graphiti
  - 📝 Code interpreter
  - 📝 Computer use APIs

⸻

# 🏗️ Quick Install

Latest release on PyPI:

```
pip install open-responses-server
```

Or install from source:

```
pip install uv
uv venv
uv pip install .
uv pip install -e ".[dev]"  # dev dependencies
```

Run the server:

```
# Using the CLI tool (after installation)
otc start

# Or directly from source
uv run src/open_responses_server/cli.py start
```

Docker deployment:

```
# Run with Docker
docker run -p 8080:8080 \
  -e OPENAI_BASE_URL_INTERNAL=http://your-llm-api:8000 \
  -e OPENAI_BASE_URL=http://localhost:8080 \
  -e OPENAI_API_KEY=your-api-key \
  ghcr.io/teabranch/open-responses-server:latest
```

Docker images are available for the linux/amd64, linux/arm64, and linux/arm/v7 architectures.
Works great with docker-compose.yaml for Codex + your own model.

⸻

# 🛠️ Configure

Minimal config to connect your AI backend:

```
OPENAI_BASE_URL_INTERNAL=http://localhost:8000   # Your LLM backend (Ollama typically on :11434, vLLM on :8000)
OPENAI_BASE_URL=http://localhost:8080            # This server's endpoint
OPENAI_API_KEY=sk-mockapikey123456789            # Mock key forwarded to the backend
MCP_SERVERS_CONFIG_PATH=./mcps.json              # Path to the MCP servers JSON file
```

Server binding:
```
API_ADAPTER_HOST=0.0.0.0
API_ADAPTER_PORT=8080
```

Streaming and connection:
```
STREAM_TIMEOUT=120.0                # HTTP timeout (seconds) for streaming requests
HEARTBEAT_INTERVAL=15.0             # SSE keepalive interval (seconds)
```

Conversation and tool handling:
```
MAX_CONVERSATION_HISTORY=100        # Max stored conversation entries
MAX_TOOL_CALL_ITERATIONS=25         # Max tool-call loop iterations
MCP_TOOL_REFRESH_INTERVAL=10        # Seconds between MCP tool cache refreshes
```

Logging:
```
LOG_LEVEL=INFO                      # DEBUG, INFO, WARNING, ERROR, CRITICAL
LOG_FILE_PATH=./log/api_adapter.log # Path to the log file
```

Configure with the CLI tool:
```
# Interactive configuration setup
otc configure
```

Verify setup:
```
# Check that the server is working
curl http://localhost:8080/v1/models
```
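The `MCP_SERVERS_CONFIG_PATH` setting above points at a JSON file describing the MCP servers to attach. This README doesn't spell out the exact schema, so treat the following as a hypothetical sketch in the widespread `mcpServers` convention (the server name, command, and arguments are placeholders; check the repository docs for the authoritative format):

```
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"]
    }
  }
}
```

Each entry names an MCP server and the command used to launch it; the server picks up the tools those processes expose and refreshes its tool cache every `MCP_TOOL_REFRESH_INTERVAL` seconds.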
⸻

# 💬 We'd love your support!

If you think this is cool:  
⭐ Star the repo.  
🐛 Open an issue if something's broken.  
🤝 Suggest a feature or submit a pull request!

This is early-stage but already usable in real-world demos.  
Let's build something powerful, together.

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=TeaBranch/open-responses-server&type=Date)](https://www.star-history.com/#TeaBranch/open-responses-server&Date)

# Projects using this middleware
- [Agentic Developer MCP Server](https://github.com/teabranch/agentic-developer-mcp) - a wrapper that turns Codex into an agentic developer node over a folder. Together with this repo (ORS), it becomes a link in a tree or chain of developers.
- [NVIDIA Jetson devices](https://github.com/OriNachum/autonomous-intelligence/tree/main/local-codex) - a docker-compose setup with Ollama

⸻

# 📚 Citations & inspirations

## Referenced projects
- [SearXNG MCP](https://github.com/ihor-sokoliuk/mcp-searxng)
- UncleCode. (2024). Crawl4AI: Open-source LLM Friendly Web Crawler & Scraper [Computer software]. GitHub. [Crawl4AI repo](https://github.com/unclecode/crawl4ai)

## Cite this project

### Code citation
```
@software{open-responses-server,
  author = {TeaBranch},
  title = {open-responses-server: Open-source server bridging any AI provider to OpenAI's Responses API},
  year = {2025},
  publisher = {GitHub},
  journal = {GitHub Repository},
  howpublished = {\url{https://github.com/teabranch/open-responses-server}},
  commit = {use the commit hash you're working with}
}
```

### Text citation

TeaBranch. (2025). open-responses-server: Open-source server that serves any AI provider with an OpenAI Chat Completions API as OpenAI's Responses API with hosted tools. [Computer software]. GitHub. https://github.com/teabranch/open-responses-server

# Links
- [Python library](https://pypi.org/project/open-responses-server)
- [GitHub repository](https://github.com/teabranch/open-responses-server)
- [GitHub Pages](https://teabranch.github.io/open-responses-server)

# Naming history
This repo has changed names:
- openai-responses-server (renamed to avoid the OpenAI brand name)
- open-responses-server