https://github.com/mng-dev-ai/claudex
Your own Claude Code UI, sandbox, in-browser VS Code, terminal, multi-provider support (Anthropic, OpenAI, GitHub Copilot, OpenRouter), custom skills, and MCP servers.
- Host: GitHub
- URL: https://github.com/mng-dev-ai/claudex
- Owner: Mng-dev-ai
- License: apache-2.0
- Created: 2025-12-15T00:46:56.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2026-02-24T11:17:30.000Z (about 1 month ago)
- Last Synced: 2026-02-24T16:46:16.435Z (about 1 month ago)
- Topics: agent, anthropic, claude, claude-code, code-execution, fastapi, llm, python, react, sandbox, typescript
- Language: TypeScript
- Size: 3.6 MB
- Stars: 212
- Watchers: 1
- Forks: 43
- Open Issues: 8
Metadata Files:
- Readme: README.md
- License: LICENSE
- Agents: AGENTS.md
README
# Claudex
Self-hosted Claude Code workspace with multi-provider routing, sandboxed execution, and a full web IDE.
[Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) · [Python](https://www.python.org/) · [React](https://reactjs.org/) · [FastAPI](https://fastapi.tiangolo.com/) · [Discord](https://discord.gg/cp3sBgEX)
## Community
Join the [Discord server](https://discord.gg/cp3sBgEX).
## Why Claudex
- Claude Code as the execution harness, exposed through a self-hosted web UI
- One workflow across Anthropic, OpenAI, GitHub Copilot, OpenRouter, and custom Anthropic-compatible endpoints
- Anthropic Bridge routing for non-Anthropic providers while preserving Claude Code behavior
- Isolated sandbox backends (Docker, host)
- Extension surface: MCP servers, skills, agents, slash commands, prompts, and marketplace plugins
- Provider switching with shared working context
## Core Architecture
```text
React/Vite Frontend
-> FastAPI Backend
-> PostgreSQL + Redis (web/docker mode)
-> SQLite + in-memory cache/pubsub (desktop mode)
-> Sandbox runtime (Docker/Host)
-> Claude Code CLI + claude-agent-sdk
```
### Claude Code harness
Claudex runs chats through `claude-agent-sdk`, which drives the Claude Code CLI in the selected sandbox. This keeps Claude Code-native behavior for tools, session flow, permission modes, and MCP orchestration.
### Anthropic Bridge for non-Anthropic providers
For OpenAI, OpenRouter, and Copilot providers, Claudex starts `anthropic-bridge` inside the sandbox and routes Claude Code requests through:
- `ANTHROPIC_BASE_URL=http://127.0.0.1:3456`
- provider-specific auth secrets such as `OPENROUTER_API_KEY` and `GITHUB_COPILOT_TOKEN`
- provider-scoped model IDs like `openai/gpt-5.2-codex`, `openrouter/moonshotai/kimi-k2.5`, `copilot/gpt-5.2-codex`
```text
Claudex UI
-> Claude Agent SDK + Claude Code CLI
-> Anthropic-compatible request shape
-> Anthropic Bridge (OpenAI/OpenRouter/Copilot)
-> Target provider model
```
For Anthropic providers, Claudex uses your Claude auth token directly. For custom providers, Claudex calls your configured Anthropic-compatible `base_url`.
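Under this routing, every Claude Code call is an Anthropic Messages-style HTTP request aimed at the local bridge. A minimal sketch of that request shape, using the `ANTHROPIC_BASE_URL` and provider-scoped model IDs from this README (`build_bridge_request` is a hypothetical helper for illustration, not Claudex code):

```python
# Illustrative only: the Anthropic Messages-style request shape that Claude
# Code sends toward the local bridge. Claudex wires this up automatically.
BRIDGE_BASE_URL = "http://127.0.0.1:3456"  # ANTHROPIC_BASE_URL inside the sandbox

def build_bridge_request(model: str, prompt: str, max_tokens: int = 1024) -> dict:
    """Build an Anthropic-compatible messages request routed to the bridge."""
    return {
        "url": f"{BRIDGE_BASE_URL}/v1/messages",
        "json": {
            "model": model,  # provider-scoped, e.g. "openrouter/moonshotai/kimi-k2.5"
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_bridge_request("openrouter/moonshotai/kimi-k2.5", "hello")
```

The bridge strips the provider prefix, translates the payload for the target provider, and returns an Anthropic-shaped response, so Claude Code never notices it is not talking to Anthropic.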
## Key Features
- Claude Code-native chat execution through `claude-agent-sdk`
- Anthropic Bridge provider routing with provider-scoped models (`openai/*`, `openrouter/*`, `copilot/*`)
- Multi-sandbox runtime (Docker/Host)
- MCP + custom skills/agents/commands + plugin marketplace
- Checkpoint restore and chat forking from any prior message state
- Streaming architecture with resumable SSE events and explicit cancellation
- Built-in recurring task scheduler (in-process async, no worker service)
## Quick Start (Web)
### Requirements
- Docker + Docker Compose
### Start
```bash
git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose -p claudex-web -f docker-compose.yml up -d
```
Open [http://localhost:3000](http://localhost:3000).
### Stop and logs
```bash
docker compose -p claudex-web -f docker-compose.yml down
docker compose -p claudex-web -f docker-compose.yml logs -f
```
## Desktop (macOS)
Desktop mode uses Tauri with a bundled Python backend sidecar on `localhost:8081` and local SQLite storage.
### Download prebuilt app
- Apple Silicon DMG: [Latest Release](https://github.com/Mng-dev-ai/claudex/releases/latest)
### How it works
When running in desktop mode:
- Tauri hosts the frontend in a native macOS window
- The sidecar backend process serves the API on port `8081`
- Storage is local SQLite plus an in-memory cache/pubsub (no Postgres/Redis dependency)
```text
Tauri Desktop App
-> React frontend (.env.desktop)
-> bundled backend sidecar (localhost:8081)
-> local SQLite database
```
### Build and run from source
Requirements:
- Node.js
- Rust
Dev workflow:
```bash
cd frontend
npm install
npm run desktop:dev
```
Build (unsigned dev):
```bash
cd frontend
npm run desktop:build
```
App bundle output:
- `frontend/src-tauri/target/release/bundle/macos/Claudex.app`
Desktop troubleshooting:
- Backend unavailable: wait for sidecar startup to finish
- Database errors: verify local app data directory permissions
- Port conflict: free port `8081` if already in use
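For the port-conflict case, a quick way to check whether the sidecar port is already taken (a generic stdlib sketch, not part of Claudex):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0

# Example: check the desktop sidecar port before launching the app.
if port_in_use(8081):
    print("port 8081 is taken; stop the other process first")
```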
## Provider Setup
Configure providers in `Settings -> Providers`.
- `anthropic`: paste token from `claude setup-token`
- `openai`: authenticate with OpenAI device flow in UI
- `copilot`: authenticate with GitHub device flow in UI
- `openrouter`: add OpenRouter API key and model IDs
- `custom`: set Anthropic-compatible `base_url`, token, and model IDs
### Model examples
- OpenAI/Codex: `gpt-5.2-codex`, `gpt-5.2`, `gpt-5.3-codex`
- OpenRouter catalog examples: `moonshotai/kimi-k2.5`, `minimax/minimax-m2.1`, `google/gemini-3-pro-preview`
- Custom gateways: models such as `GLM-5` or `M2.5` on private org-specific endpoints (availability depends on your gateway's Anthropic compatibility)
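All of these IDs follow the `provider/model` convention, where the model part may itself contain slashes (OpenRouter org/model names). A hypothetical helper, not from Claudex, makes the split explicit:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a provider-scoped model ID at the FIRST slash only, so
    OpenRouter-style org/model names stay intact in the model part."""
    provider, _, model = model_id.partition("/")
    return provider, model

split_model_id("openai/gpt-5.2-codex")             # ("openai", "gpt-5.2-codex")
split_model_id("openrouter/moonshotai/kimi-k2.5")  # ("openrouter", "moonshotai/kimi-k2.5")
```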
## Shared Working Context
Switching providers does not require a new workflow:
- Same sandbox filesystem/workdir
- Same `.claude` resources (skills, agents, commands)
- Same MCP configuration in Claudex
- Same chat-level execution flow
This is the main value of using Claude Code as the harness while changing inference providers behind Anthropic Bridge.
## Services and Ports (Web)
- Frontend: `3000`
- Backend API: `8080`
- PostgreSQL: `5432`
- Redis: `6379`
- VNC: `5900`
- VNC Web: `6080`
- OpenVSCode server: `8765`
## API and Admin
- API docs: [http://localhost:8080/api/v1/docs](http://localhost:8080/api/v1/docs)
- Admin panel: [http://localhost:8080/admin](http://localhost:8080/admin)
## Health and Ops
- Liveness endpoint: `GET /health`
- Readiness endpoint: `GET /api/v1/readyz`
  - Web mode checks the database and Redis
  - Desktop mode checks the database (SQLite) only
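A deploy script might block until the stack reports ready. A minimal polling sketch with an injectable probe (`wait_ready` is a hypothetical helper; the commented URL is the web-mode default from this README):

```python
import time
from typing import Callable

def wait_ready(probe: Callable[[], bool], attempts: int = 30, delay: float = 1.0) -> bool:
    """Poll `probe` until it returns True, or give up after `attempts` tries."""
    for i in range(attempts):
        if probe():
            return True
        if i < attempts - 1:
            time.sleep(delay)
    return False

# In practice the probe would hit the readiness endpoint, e.g.:
#   import urllib.request
#   probe = lambda: urllib.request.urlopen(
#       "http://localhost:8080/api/v1/readyz").status == 200
```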
## Deployment
- VPS/Coolify guide: [docs/coolify-installation-guide.md](docs/coolify-installation-guide.md)
- A production setup serves the frontend at `/` and the API under `/api/*`
## Tech Stack
- Frontend: React 19, TypeScript, Vite, TailwindCSS, Zustand, React Query
- Backend: FastAPI, SQLAlchemy, Redis, PostgreSQL/SQLite, Granian
- Runtime: Claude Code CLI, claude-agent-sdk, anthropic-bridge
## License
Apache 2.0. See [LICENSE](LICENSE).
## Contributing
Contributions are welcome. Please open an issue first to discuss what you would like to change, then submit a pull request.
## References
- Anthropic Claude Code SDK: [docs.anthropic.com/s/claude-code-sdk](https://docs.anthropic.com/s/claude-code-sdk)
- Anthropic Bridge package: [pypi.org/project/anthropic-bridge](https://pypi.org/project/anthropic-bridge/)
- OpenAI Codex CLI sign-in: [help.openai.com/en/articles/11381614](https://help.openai.com/en/articles/11381614)
- OpenRouter API keys: [openrouter.ai/docs/api-keys](https://openrouter.ai/docs/api-keys)
- GitHub Copilot plans: [github.com/features/copilot/plans](https://github.com/features/copilot/plans)