{"id":39584728,"url":"https://github.com/9j/claude-code-mux","last_synced_at":"2026-01-26T15:01:17.130Z","repository":{"id":324577806,"uuid":"1097704651","full_name":"9j/claude-code-mux","owner":"9j","description":"High-performance AI routing proxy built in Rust with automatic failover, priority-based routing, and    support for 15+ providers (Anthropic, OpenAI, Cerebras, Minimax, Kimi, etc.)","archived":false,"fork":false,"pushed_at":"2025-11-19T19:37:08.000Z","size":801,"stargazers_count":406,"open_issues_count":7,"forks_count":40,"subscribers_count":3,"default_branch":"main","last_synced_at":"2025-12-13T13:10:20.421Z","etag":null,"topics":["ai","anthropic","claude-code","claude-code-router","openai"],"latest_commit_sha":null,"homepage":"","language":"Rust","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/9j.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":"AGENTS.md","dco":null,"cla":null}},"created_at":"2025-11-16T17:19:45.000Z","updated_at":"2025-12-12T00:44:46.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/9j/claude-code-mux","commit_stats":null,"previous_names":["9j/claude-code-mux"],"tags_count":17,"template":false,"template_full_name":null,"purl":"pkg:github/9j/claude-code-mux","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/9j%2Fclaude-code-mux","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/9j%2Fclaude-code-mux/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/9j%2Fclaude-code-mux/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/9j%2Fclaude-code-mux/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/9j","download_url":"https://codeload.github.com/9j/claude-code-mux/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/9j%2Fclaude-code-mux/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28781308,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-26T13:55:28.044Z","status":"ssl_error","status_checked_at":"2026-01-26T13:55:26.068Z","response_time":59,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","anthropic","claude-code","claude-code-router","openai"],"created_at":"2026-01-18T07:35:27.099Z","updated_at":"2026-01-26T15:01:17.123Z","avatar_url":"https://github.com/9j.png","language":"Rust","readme":"# Claude Code Mux\n\n[![CI](https://github.com/9j/claude-code-mux/workflows/CI/badge.svg)](https://github.com/9j/claude-code-mux/actions)\n[![Latest Release](https://img.shields.io/github/v/release/9j/claude-code-mux)](https://github.com/9j/claude-code-mux/releases/latest)\n[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)\n[![Rust](https://img.shields.io/badge/rust-1.70%2B-orange.svg)](https://www.rust-lang.org/)\n[![GitHub Stars](https://img.shields.io/github/stars/9j/claude-code-mux?style=social)](https://github.com/9j/claude-code-mux)\n[![GitHub Forks](https://img.shields.io/github/forks/9j/claude-code-mux?style=social)](https://github.com/9j/claude-code-mux/fork)\n\nOpenRouter met Claude Code Router. They had a baby.\n\n---\n\nNow your coding assistant can use GLM 4.6 for one task, Kimi K2 Thinking for another, and Minimax M2 for a third. All in the same session. When your primary provider goes down, it falls back to your backup automatically.\n\n⚡️ **Multi-model intelligence with provider resilience**\n\nA lightweight, Rust-powered proxy that provides intelligent model routing, provider failover, streaming support, and full Anthropic API compatibility for Claude Code.\n\n```\nClaude Code → Claude Code Mux → Multiple AI Providers\n              (Anthropic API)    (OpenAI/Anthropic APIs + Streaming)\n```\n\n## Table of Contents\n\n- [Key Features](#key-features)\n- [Installation](#installation)\n- [Quick Start](#quick-start)\n- [Screenshots](#screenshots)\n- [Usage Guide](#usage-guide)\n- [Routing Logic](#routing-logic)\n- [Configuration Examples](#configuration-examples)\n- [Supported Providers](#supported-providers)\n- [Advanced Features](#advanced-features)\n- [CLI Usage](#cli-usage)\n- [Troubleshooting](#troubleshooting)\n- [FAQ](#faq)\n- [Performance](#performance)\n- [Why Choose Claude Code Mux?](#why-choose-claude-code-mux)\n- [Documentation](#documentation)\n- [Changelog](#changelog)\n- [Contributing](#contributing)\n- [License](#license)\n\n## Key Features\n\n### 🎯 Core Features\n- ✨ **Modern Admin UI** - Beautiful web interface with auto-save and URL-based navigation\n- 🔐 **OAuth 2.0 Support** - FREE access for Claude Pro/Max, ChatGPT Plus/Pro, and Google AI Pro/Ultra\n- 🧠 **Intelligent Routing** - Auto-route by task type (websearch, reasoning, background, default)\n- 🔄 **Provider Failover** - Automatic fallback to backup providers with priority-based routing\n- 🌊 **Streaming Support** - Full Server-Sent Events (SSE) streaming for real-time responses\n- 🌐 **Multi-Provider Support** - 18+ providers including OpenAI, Anthropic, Google Gemini/Vertex AI, Groq, ZenMux, etc.\n- ⚡️ **High Performance** - ~5MB RAM, \u003c1ms routing overhead (Rust powered)\n- 🎯 **Unified API** - Full Anthropic Messages API compatibility\n\n### 🚀 Advanced Features\n- 🔀 **Auto-mapping** - Regex-based model name transformation before routing (e.g., transform all `claude-*` to default model)\n- 🎯 **Background Detection** - Configurable regex patterns for background task detection\n- 🤖 **Multi-Agent Support** - Dynamic model switching via `CCM-SUBAGENT-MODEL` tags\n- 📊 **Live Testing** - Built-in test interface to verify routing and responses\n- ⚙️ **Centralized Settings** - Dedicated Settings tab for regex pattern management\n\n## Screenshots\n\n\u003cdetails\u003e\n\u003csummary\u003e📸 Click to view screenshots (5 images)\u003c/summary\u003e\n\n### Overview Dashboard\n![Dashboard showing router configuration, providers, and models summary](docs/images/dashboard.png)\n*Main dashboard with router configuration and provider management*\n\n### Provider Management\n![Provider management interface with add/edit capabilities](docs/images/providers.png)\n*Add and manage multiple AI providers with automatic format translation*\n\n### Model Mappings with Fallback\n![Model configuration with priority-based fallback routing](docs/images/models.png)\n*Configure models with priority-based fallback routing*\n\n### Router Configuration\n![Router configuration interface for intelligent routing rules](docs/images/routing.png)\n*Set up intelligent routing rules for different task types*\n\n### Live Testing Interface\n![Testing interface for verifying configuration with real API calls](docs/images/testing.png)\n*Test your configuration with live API requests and responses*\n\n\u003c/details\u003e\n\n## Supported Providers\n\n**18+ AI providers with automatic format translation, streaming, and failover:**\n\n- **Anthropic-compatible**: Anthropic (API Key/OAuth), ZenMux, z.ai, Minimax, Kimi\n- **OpenAI-compatible**: OpenAI, OpenRouter, Groq, Together, Fireworks, Deepinfra, Cerebras, Moonshot, Nebius, NovitaAI, Baseten\n- **Google AI**: Gemini (OAuth/API Key), Vertex AI (GCP ADC)\n\n\u003cdetails\u003e\n\u003csummary\u003e📋 View full provider details\u003c/summary\u003e\n\n### Anthropic-Compatible (Native Format)\n- **Anthropic** - Official Claude API provider (supports both API Key and OAuth)\n- **Anthropic (OAuth)** - 🆓 **FREE for Claude Pro/Max subscribers** via OAuth 2.0\n- **ZenMux** - Unified API gateway (Sunnyvale, CA)\n- **z.ai** - China-based, GLM models\n- **Minimax** - China-based, MiniMax-M2 model\n- **Kimi For Coding** - Premium membership for Kimi\n\n### OpenAI-Compatible\n- **OpenAI** - Official OpenAI API (supports both API Key and OAuth)\n- **OpenAI (OAuth)** - 🆓 **FREE for ChatGPT Plus/Pro subscribers** via OAuth 2.0 (GPT-5.1, GPT-5.1 Codex)\n- **OpenRouter** - Unified API gateway (500+ models)\n- **Groq** - LPU inference (ultra-fast)\n- **Together AI** - Open source model inference\n- **Fireworks AI** - Fast inference platform\n- **Deepinfra** - GPU inference\n- **Cerebras** - Wafer-Scale Engine inference\n- **Moonshot AI** - China-based, Kimi models (OpenAI-compatible)\n- **Nebius** - AI inference platform\n- **NovitaAI** - GPU cloud platform\n- **Baseten** - ML deployment platform\n\n### Google AI\n- **Gemini** - Google AI Studio/Code Assist API (supports both OAuth and API Key)\n- **Gemini (OAuth)** - 🆓 **FREE for Google AI Pro/Ultra subscribers** via OAuth 2.0 (Code Assist API)\n- **Vertex AI** - GCP platform with ADC authentication (supports Gemini, Claude, Llama via Model Garden)\n\n\u003c/details\u003e\n\n## Installation\n\n### Option 1: Download Pre-built Binaries (Recommended)\n\nDownload the latest release for your platform from [GitHub Releases](https://github.com/9j/claude-code-mux/releases/latest).\n\n#### Linux (x86_64)\n```bash\n# Download and extract (glibc)\ncurl -L https://github.com/9j/claude-code-mux/releases/latest/download/ccm-linux-x86_64.tar.gz | tar xz\n\n# Or download musl version (static linking, more portable)\ncurl -L https://github.com/9j/claude-code-mux/releases/latest/download/ccm-linux-x86_64-musl.tar.gz | tar xz\n\n# Move to PATH\nsudo mv ccm /usr/local/bin/\n```\n\n#### macOS (Intel)\n```bash\n# Download and extract\ncurl -L https://github.com/9j/claude-code-mux/releases/latest/download/ccm-macos-x86_64.tar.gz | tar xz\n\n# Move to PATH\nsudo mv ccm /usr/local/bin/\n```\n\n#### macOS (Apple Silicon)\n```bash\n# Download and extract\ncurl -L https://github.com/9j/claude-code-mux/releases/latest/download/ccm-macos-aarch64.tar.gz | tar xz\n\n# Move to PATH\nsudo mv ccm /usr/local/bin/\n```\n\n#### Windows\n1. Download [ccm-windows-x86_64.zip](https://github.com/9j/claude-code-mux/releases/latest/download/ccm-windows-x86_64.zip)\n2. Extract the ZIP file\n3. Add the directory containing `ccm.exe` to your PATH\n\n#### Verify Installation\n```bash\nccm --version\n```\n\n### Option 2: Install via Cargo\n\nIf you have Rust installed, you can install directly from crates.io:\n\n```bash\ncargo install claude-code-mux\n```\n\nThis will download, compile, and install the `ccm` binary to your cargo bin directory (usually `~/.cargo/bin/`).\n\n#### Verify Installation\n```bash\nccm --version\n```\n\n### Option 3: Build from Source\n\n#### Prerequisites\n- Rust 1.70+ (install from [rustup.rs](https://rustup.rs/))\n\n#### Build Steps\n\n```bash\n# Clone the repository\ngit clone https://github.com/9j/claude-code-mux\ncd claude-code-mux\n\n# Build the release binary\ncargo build --release\n\n# The binary will be available at target/release/ccm\n```\n\n#### Install to PATH (Optional)\n\n```bash\n# Copy to /usr/local/bin for global access\nsudo cp target/release/ccm /usr/local/bin/\n\n# Or add to your shell profile (e.g., ~/.zshrc or ~/.bashrc)\nexport PATH=\"$PATH:/path/to/claude-code-mux/target/release\"\n```\n\n#### Run Directly Without Installing (Optional)\n\n```bash\n# From the project directory\ncargo run --release -- start\n```\n\n## Quick Start\n\n### 1. Start Claude Code Mux\n\n```bash\nccm start\n```\n\nThe server will start on `http://127.0.0.1:13456` with a web-based admin UI.\n\n\u003e **💡 First-time users**: A default configuration file will be automatically created at:\n\u003e - **Unix/Linux/macOS**: `~/.claude-code-mux/config.toml`\n\u003e - **Windows**: `%USERPROFILE%\\.claude-code-mux\\config.toml`\n\n### 2. Open Admin UI\n\nNavigate to:\n```\nhttp://127.0.0.1:13456\n```\n\nYou'll see a modern admin interface with these tabs:\n- **Overview** - System status and configuration summary\n- **Providers** - Manage API providers\n- **Models** - Configure model mappings and fallbacks\n- **Router** - Set up routing rules (auto-saves on change!)\n- **Test** - Test your configuration with live requests\n\n### 3. Configure Claude Code\n\nSet Claude Code to use the proxy:\n\n```bash\nexport ANTHROPIC_BASE_URL=\"http://127.0.0.1:13456\"\nexport ANTHROPIC_API_KEY=\"any-string\"\nclaude\n```\n\nThat's it! Your setup is complete.\n\n## Usage Guide\n\n### Step 1: Add Providers\n\nNavigate to **Providers** tab → Click **\"Add Provider\"**\n\n#### Example: Add Anthropic with OAuth (🆓 FREE for Claude Pro/Max)\n1. Select provider type: **Anthropic**\n2. Enter provider name: `claude-max`\n3. Select authentication: **OAuth (Claude Pro/Max)**\n4. Click **\"🔐 Start OAuth Login\"**\n5. Authorize in the popup window\n6. Copy and paste the authorization code\n7. Click **\"Complete Authentication\"**\n8. Click **\"Add Provider\"**\n\n\u003e **💡 Pro Tip**: Claude Pro/Max subscribers get **unlimited API access for FREE** via OAuth!\n\n#### Example: Add ZenMux Provider\n1. Select provider type: **ZenMux**\n2. Enter provider name: `zenmux`\n3. Select authentication: **API Key**\n4. Enter API key: `your-zenmux-api-key`\n5. Click **\"Add Provider\"**\n\n#### Example: Add OpenAI Provider\n1. Select provider type: **OpenAI**\n2. Enter provider name: `openai`\n3. Enter API key: `sk-...`\n4. Click **\"Add Provider\"**\n\n#### Example: Add z.ai Provider\n1. Select provider type: **z.ai**\n2. Enter provider name: `zai`\n3. Enter API key: `your-zai-api-key`\n4. Click **\"Add Provider\"**\n\n#### Example: Add Google Gemini with OAuth (🆓 FREE for Google AI Pro/Ultra)\n1. Select provider type: **Google Gemini**\n2. Enter provider name: `gemini-pro`\n3. Select authentication: **OAuth (Google AI Pro/Ultra)**\n4. Click **\"🔐 Start OAuth Login\"**\n5. Authorize in the popup window\n6. Copy and paste the authorization code\n7. Click **\"Complete Authentication\"**\n8. Click **\"Add Provider\"**\n\n\u003e **💡 Pro Tip**: Google AI Pro/Ultra subscribers get **unlimited API access for FREE** via OAuth!\n\n#### Example: Add Vertex AI Provider (GCP)\n1. Select provider type: **☁️ Vertex AI**\n2. Enter provider name: `vertex-ai`\n3. Enter GCP Project ID: `your-gcp-project-id`\n4. Enter Location: `us-central1` (or your preferred region)\n5. Click **\"Add Provider\"**\n\n\u003e **Note**: Vertex AI uses Application Default Credentials (ADC). Make sure you've run `gcloud auth application-default login` first.\n\n**Supported Providers**:\n- Anthropic-compatible: Anthropic (API Key or OAuth), ZenMux, z.ai, Minimax, Kimi\n- OpenAI-compatible: OpenAI, OpenRouter, Groq, Together, Fireworks, Deepinfra, Cerebras, Nebius, NovitaAI, Baseten\n- Google AI: Gemini (OAuth/API Key), Vertex AI (GCP ADC)\n\n### Step 2: Add Model Mappings\n\nNavigate to **Models** tab → Click **\"Add Model\"**\n\n#### Example: Minimax M2 (Ultra-fast, Low Cost)\n1. Model Name: `minimax-m2`\n2. Add mapping:\n   - Provider: `minimax`\n   - Actual Model: `MiniMax M2`\n   - Priority: `1`\n3. Click **\"Add Model\"**\n\n\u003e **Why Minimax M2?** - $0.30/$1.20 per M tokens (8% of Claude Sonnet 4.5 cost), 100 TPS throughput, MoE architecture\n\n#### Example: GLM-4.6 with Fallback (Cost Optimized)\n1. Model Name: `glm-4.6`\n2. Add mappings:\n   - **Mapping 1** (Primary):\n     - Provider: `zai`\n     - Actual Model: `glm-4.6`\n     - Priority: `1`\n   - **Mapping 2** (Fallback):\n     - Provider: `openrouter`\n     - Actual Model: `z-ai/glm-4.6`\n     - Priority: `2`\n3. Click **\"+ Fallback Provider Add\"** to add more fallbacks\n4. Click **\"Add Model\"**\n\n\u003e **How Fallback Works**: If `zai` provider fails, automatically falls back to `openrouter`\n\u003e\n\u003e **GLM-4.6 Pricing**: $0.60/$2.20 per M tokens (90% cheaper than Claude Sonnet 4.5), 200K context window\n\n### Step 3: Configure Router\n\nNavigate to **Router** tab\n\nConfigure routing rules (auto-saves on change!):\n- **Default Model**: `minimax-m2` (general tasks - ultra-fast, 8% of Claude cost)\n- **Think Model**: `kimi-k2` (plan mode with reasoning - 256K context)\n- **Background Model**: `glm-4.5-air` (simple background tasks)\n- **WebSearch Model**: `glm-4.6` (web search tasks)\n- **Auto-map Regex Pattern**: `^claude-` (transform Claude models before routing)\n- **Background Task Regex Pattern**: `(?i)claude.*haiku` (detect background tasks)\n\n### Step 3.5: Configure Regex Patterns (Optional)\n\nNavigate to **Settings** tab for centralized regex management:\n\n- **Auto-mapping Pattern**: Regex to match models for transformation (e.g., `^claude-`)\n  - Matched models are transformed to the default model\n  - Then routing logic (WebSearch/Think/Background) is applied\n\n- **Background Task Pattern**: Regex to detect background tasks (e.g., `(?i)claude.*haiku`)\n  - Matches against the ORIGINAL model name (before auto-mapping)\n  - Matched models use the background model\n\n### Step 4: Save Configuration\n\nClick **\"💾 Save to Server\"** to save configuration to disk, or **\"🔄 Save \u0026 Restart\"** to save and restart the server.\n\n\u003e **Note**: Router configuration auto-saves to localStorage on change, but you need to click \"Save to Server\" to persist to disk.\n\n### Step 5: Test Your Setup\n\nNavigate to **Test** tab:\n1. Select a model (e.g., `minimax-m2` or `glm-4.6`)\n2. Enter a message: `Hello, test message`\n3. Click **\"Send Message\"**\n4. View the response and check routing logs\n\n## Routing Logic\n\n**Flow**: Auto-map (transform) → WebSearch \u003e Subagent \u003e Think \u003e Background \u003e Default\n\n### 0. Auto-mapping (Model Name Transformation)\n- **Trigger**: Model name matches `auto_map_regex` pattern\n- **Example**: Request with `model=\"claude-4-5-sonnet\"` and regex `^claude-`\n- **Action**: Transform `claude-4-5-sonnet` → `minimax-m2` (default model)\n- **Then**: Continue to routing logic below\n- **Configuration**: Set in Router or Settings tab\n\n\u003e **Key Point**: Auto-mapping is NOT a routing decision - it transforms the model name BEFORE routing logic is applied.\n\n### 1. WebSearch (Highest Priority)\n- **Trigger**: Request contains `web_search` tool in tools array\n- **Example**: Claude Code using web search tool\n- **Routes to**: `websearch` model (e.g., GLM-4.6)\n\n### 2. Subagent Model\n- **Trigger**: System prompt contains `\u003cCCM-SUBAGENT-MODEL\u003emodel-name\u003c/CCM-SUBAGENT-MODEL\u003e` tag\n- **Example**: AI agent specifying model for sub-task\n- **Routes to**: Specified model (tag auto-removed)\n\n### 3. Think Mode\n- **Trigger**: Request has `thinking` field with `type: \"enabled\"`\n- **Example**: Claude Code Plan Mode (`/plan`)\n- **Routes to**: `think` model (e.g., Kimi K2 Thinking, Claude Opus)\n\n### 4. Background Tasks\n- **Trigger**: ORIGINAL model name matches `background_regex` pattern\n- **Default Pattern**: `(?i)claude.*haiku` (case-insensitive)\n- **Example**: Request with `model=\"claude-4-5-haiku\"` (checked BEFORE auto-mapping)\n- **Routes to**: `background` model (e.g., GLM-4.5-air)\n- **Configuration**: Set in Router or Settings tab\n\n\u003e **Important**: Background detection uses the ORIGINAL model name, not the auto-mapped one.\n\n### 5. Default (Fallback)\n- **Trigger**: No routing conditions matched\n- **Routes to**: Transformed model name (if auto-mapped) or original model name\n\n## Routing Examples\n\n### Example 1: Claude Haiku with Web Search\n```\nRequest: model=\"claude-4-5-haiku\", tools=[web_search]\nConfig: auto_map_regex=\"^claude-\", background_regex=\"(?i)claude.*haiku\", websearch=\"glm-4.6\"\n\nFlow:\n1. Auto-map: \"claude-4-5-haiku\" → \"minimax-m2\" (transformed)\n2. WebSearch check: tools has web_search → Route to \"glm-4.6\"\nResult: glm-4.6 (websearch model)\n```\n\n### Example 2: Claude Haiku (No Special Conditions)\n```\nRequest: model=\"claude-4-5-haiku\"\nConfig: auto_map_regex=\"^claude-\", background_regex=\"(?i)claude.*haiku\", background=\"glm-4.5-air\"\n\nFlow:\n1. Auto-map: \"claude-4-5-haiku\" → \"minimax-m2\" (transformed)\n2. WebSearch check: No web_search tool\n3. Think check: No thinking field\n4. Background check on ORIGINAL: \"claude-4-5-haiku\" matches \"(?i)claude.*haiku\" → Route to \"glm-4.5-air\"\nResult: glm-4.5-air (background model)\n```\n\n### Example 3: Claude Sonnet with Think Mode\n```\nRequest: model=\"claude-4-5-sonnet\", thinking={type:\"enabled\"}\nConfig: auto_map_regex=\"^claude-\", think=\"kimi-k2-thinking\"\n\nFlow:\n1. Auto-map: \"claude-3-5-sonnet\" → \"minimax-m2\" (transformed)\n2. WebSearch check: No web_search tool\n3. Think check: thinking.type=\"enabled\" → Route to \"kimi-k2-thinking\"\nResult: kimi-k2-thinking (think model)\n```\n\n### Example 4: Non-Claude Model (No Auto-mapping)\n```\nRequest: model=\"glm-4.6\"\nConfig: auto_map_regex=\"^claude-\", default=\"minimax-m2\"\n\nFlow:\n1. Auto-map: \"glm-4.6\" doesn't match \"^claude-\" → No transformation\n2. WebSearch check: No web_search tool\n3. Think check: No thinking field\n4. Background check: \"glm-4.6\" doesn't match background regex\n5. Default: Use model name as-is\nResult: glm-4.6 (original model name, routed through model mappings)\n```\n\n## Configuration Examples\n\n### Cost Optimized Setup (~$0.35/1M tokens avg)\n\n**Providers**:\n- Minimax (ultra-fast, ultra-cheap)\n- z.ai (GLM models)\n- Kimi (for thinking tasks)\n- OpenRouter (fallback)\n\n**Models**:\n- `minimax-m2` → Minimax (`MiniMax M2`) — $0.30/$1.20 per M tokens\n- `glm-4.6` → z.ai (`glm-4.6`) with OpenRouter fallback — $0.60/$2.20 per M tokens\n- `glm-4.5-air` → z.ai (`glm-4.5-air`) — Lower cost than GLM-4.6\n- `kimi-k2-thinking` → Kimi (`kimi-k2-thinking`) — Reasoning optimized, 256K context\n\n**Routing**:\n- Default: `minimax-m2` (8% of Claude cost, 100 TPS)\n- Think: `kimi-k2-thinking` (thinking model with 256K context)\n- Background: `glm-4.5-air` (simple tasks)\n- WebSearch: `glm-4.6` (web search + reasoning)\n- Auto-map Regex: `^claude-` (transform Claude models to minimax-m2)\n- Background Regex: `(?i)claude.*haiku` (detect Haiku models for background)\n\n**Cost Comparison** (per 1M tokens):\n- Minimax M2: $0.30 input / $1.20 output\n- GLM-4.6: $0.60 input / $2.20 output\n- Claude Sonnet 4.5: $3.00 input / $15.00 output\n- **Savings**: ~90% cost reduction vs Claude\n\n### Quality Focused Setup\n\n**Providers**:\n- Anthropic (native Claude)\n- OpenRouter (for fallbacks)\n\n**Models**:\n- `claude-sonnet-4-5` → Anthropic native\n- `claude-opus-4-1` → Anthropic native\n\n**Routing**:\n- Default: `claude-sonnet-4-5`\n- Think: `claude-opus-4-1`\n- Background: `claude-haiku-4-5`\n- WebSearch: `claude-sonnet-4-5`\n\n### Multi-Provider with Fallback\n\n**Providers**:\n- Minimax (primary, ultra-fast)\n- z.ai (for GLM models)\n- OpenRouter (fallback for all)\n\n**Models**:\n- `minimax-m2`:\n  - Priority 1: Minimax → `MiniMax-M2`\n  - Priority 2: OpenRouter → `minimax/minimax-m2` (if available)\n- `glm-4.6`:\n  - Priority 1: z.ai → `glm-4.6`\n  - Priority 2: OpenRouter → `z-ai/glm-4.6`\n\n**Routing**:\n- Default: `minimax-m2` (falls back to OpenRouter if Minimax fails)\n- Think: `glm-4.6` (with OpenRouter fallback)\n- Background: `glm-4.5-air`\n- WebSearch: `glm-4.6`\n\n## Advanced Features\n\n### OAuth Authentication (FREE for Claude Pro/Max, ChatGPT Plus/Pro \u0026 Google AI Pro/Ultra)\n\nClaude Pro/Max, ChatGPT Plus/Pro, and Google AI Pro/Ultra subscribers can use their respective APIs **completely free** via OAuth 2.0 authentication.\n\n#### Setting Up OAuth\n\n**Via Web UI** (Recommended):\n\n**For Claude Pro/Max**:\n1. Navigate to **Providers** tab → **\"Add Provider\"**\n2. Select provider type: **Anthropic**\n3. Enter provider name (e.g., `claude-max`)\n4. Select authentication: **OAuth (Claude Pro/Max)**\n5. Click **\"🔐 Start OAuth Login\"**\n6. Complete authorization in popup window\n7. Copy and paste the authorization code\n8. Click **\"Complete Authentication\"**\n\n**For ChatGPT Plus/Pro**:\n1. Navigate to **Providers** tab → **\"Add Provider\"**\n2. Select provider type: **OpenAI**\n3. Enter provider name (e.g., `chatgpt-codex`)\n4. Select authentication: **OAuth (ChatGPT Plus/Pro)**\n5. Click **\"🔐 Start OAuth Login\"**\n6. Complete authorization in popup window (port 1455)\n7. Copy and paste the authorization code\n8. Click **\"Complete Authentication\"**\n\n**For Google AI Pro/Ultra**:\n1. Navigate to **Providers** tab → **\"Add Provider\"**\n2. Select provider type: **Google Gemini**\n3. Enter provider name (e.g., `gemini-pro`)\n4. Select authentication: **OAuth (Google AI Pro/Ultra)**\n5. Click **\"🔐 Start OAuth Login\"**\n6. Complete authorization in popup window\n7. Copy and paste the authorization code\n8. Click **\"Complete Authentication\"**\n\n\u003e **💡 Supported Models**:\n\u003e - **Claude OAuth**: All Claude models (Opus, Sonnet, Haiku)\n\u003e - **ChatGPT OAuth**: GPT-5.1, GPT-5.1 Codex (with reasoning blocks converted to thinking)\n\u003e - **Gemini OAuth**: All Gemini models via Code Assist API (Pro, Flash, Ultra)\n\n**Via CLI Tool**:\n```bash\n# Run OAuth login tool\ncargo run --example oauth_login\n\n# Or if installed\n./examples/oauth_login\n```\n\nThe tool will:\n1. Generate an authorization URL\n2. Open your browser for authorization\n3. Prompt for the authorization code\n4. Exchange code for access/refresh tokens\n5. Save tokens to `~/.claude-code-mux/oauth_tokens.json`\n\n#### Managing OAuth Tokens\n\nNavigate to **Settings** tab → **OAuth Tokens** section to:\n- **View token status** (Active/Needs Refresh/Expired)\n- **Refresh tokens** manually (auto-refresh happens 5 minutes before expiry)\n- **Delete tokens** when no longer needed\n\n**Token Features**:\n- 🔐 Secure PKCE-based OAuth 2.0 flow\n- 🔄 Automatic token refresh (5 min before expiry)\n- 💾 Persistent storage with file permissions (0600)\n- 🎨 Visual status indicators (green/yellow/red)\n\n**Security Notes**:\n- Tokens are stored with `0600` permissions (owner read/write only)\n- Never commit `oauth_tokens.json` to version control\n- Tokens auto-refresh before expiration\n- PKCE protects against authorization code interception\n\n#### OAuth API Endpoints\n\nFor advanced integrations:\n- `POST /api/oauth/authorize` - Get authorization URL\n- `POST /api/oauth/exchange` - Exchange code for tokens\n- `GET /api/oauth/tokens` - List all tokens\n- `POST /api/oauth/tokens/refresh` - Refresh a token\n- `POST /api/oauth/tokens/delete` - Delete a token\n\nSee `docs/OAUTH_TESTING.md` for detailed API documentation.\n\n### Auto-mapping with Regex\n\nAutomatically transform model names before routing logic is applied:\n\n1. Navigate to **Router** or **Settings** tab\n2. Set **Auto-map Regex Pattern**: `^claude-`\n3. All requests for `claude-*` models will be transformed to your default model\n4. Then routing logic (WebSearch/Think/Background) is applied to the transformed request\n\n**Use Cases**:\n- Transform all Claude models to cost-optimized alternative: `^claude-`\n- Transform both Claude and GPT models: `^(claude-|gpt-)`\n- Transform specific models only: `^(claude-sonnet|claude-opus)`\n\n**Example**:\n```\nConfig: auto_map_regex=\"^claude-\", default=\"minimax-m2\", websearch=\"glm-4.6\"\nRequest: model=\"claude-sonnet\", tools=[web_search]\n\nFlow:\n1. Transform: \"claude-sonnet\" → \"minimax-m2\"\n2. Route: WebSearch detected → \"glm-4.6\"\nResult: glm-4.6 model\n```\n\n### Background Task Detection with Regex\n\nAutomatically detect and route background tasks using regex patterns:\n\n1. Navigate to **Router** or **Settings** tab\n2. Set **Background Regex Pattern**: `(?i)claude.*haiku`\n3. All requests matching this pattern will use your background model\n\n**Use Cases**:\n- Route all Haiku models to cheap background model: `(?i)claude.*haiku`\n- Route specific model tiers: `(?i)(haiku|flash|mini)`\n- Custom patterns for your naming convention\n\n**Important**: Background detection checks the ORIGINAL model name (before auto-mapping)\n\n### Streaming Responses\n\nFull Server-Sent Events (SSE) streaming support:\n\n```bash\ncurl -X POST http://127.0.0.1:13456/v1/messages \\\n  -H \"Content-Type: application/json\" \\\n  -H \"anthropic-version: 2023-06-01\" \\\n  -d '{\n    \"model\": \"minimax-m2\",\n    \"max_tokens\": 1000,\n    \"stream\": true,\n    \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]\n  }'\n```\n\n**Supported Providers**:\n- ✅ Anthropic-compatible: ZenMux, z.ai, Kimi, Minimax\n- ✅ OpenAI-compatible: OpenAI, OpenRouter, Groq, Together, Fireworks, etc.\n\n### Provider Failover\n\nAutomatic failover with priority-based routing:\n\n```toml\n[[models]]\nname = \"glm-4.6\"\n\n[[models.mappings]]\nactual_model = \"glm-4.6\"\npriority = 1\nprovider = \"zai\"\n\n[[models.mappings]]\nactual_model = \"z-ai/glm-4.6\"\npriority = 2\nprovider = \"openrouter\"\n```\n\nIf z.ai fails, automatically falls back to OpenRouter. Works with all providers!\n\n## CLI Usage\n\n### Start the Server\n\n```bash\n# Start with default config (~/.claude-code-mux/config.toml)\n# Config file is automatically created if it doesn't exist\nccm start\n\n# Start with custom config\nccm start --config path/to/config.toml\n\n# Start on custom port\nccm start --port 8080\n```\n\n**Default Config Location**:\n- **Unix/Linux/macOS**: `~/.claude-code-mux/config.toml`\n- **Windows**: `%USERPROFILE%\\.claude-code-mux\\config.toml` (e.g., `C:\\Users\\\u003cusername\u003e\\.claude-code-mux\\config.toml`)\n\n### Run in Background\n\n#### Using nohup (Unix/Linux/macOS)\n```bash\n# Start in background\nnohup ccm start \u003e ccm.log 2\u003e\u00261 \u0026\n\n# Check if running\nps aux | grep ccm\n\n# Stop the server\npkill ccm\n```\n\n#### Using systemd (Linux)\nCreate `/etc/systemd/system/ccm.service`:\n\n```ini\n[Unit]\nDescription=Claude Code Mux\nAfter=network.target\n\n[Service]\nType=simple\nUser=your-username\nWorkingDirectory=/path/to/claude-code-mux\nExecStart=/path/to/claude-code-mux/target/release/ccm start\nRestart=on-failure\nRestartSec=5s\n\n[Install]\nWantedBy=multi-user.target\n```\n\nThen:\n```bash\n# Reload systemd\nsudo systemctl daemon-reload\n\n# Enable on boot\nsudo systemctl enable ccm\n\n# Start service\nsudo systemctl start ccm\n\n# Check status\nsudo systemctl status ccm\n\n# View logs\nsudo journalctl -u ccm -f\n```\n\n#### Using launchd (macOS)\nCreate `~/Library/LaunchAgents/com.ccm.plist`:\n\n```xml\n\u003c?xml version=\"1.0\" encoding=\"UTF-8\"?\u003e\n\u003c!DOCTYPE plist PUBLIC \"-//Apple//DTD PLIST 1.0//EN\" \"http://www.apple.com/DTDs/PropertyList-1.0.dtd\"\u003e\n\u003cplist version=\"1.0\"\u003e\n\u003cdict\u003e\n    \u003ckey\u003eLabel\u003c/key\u003e\n    \u003cstring\u003ecom.ccm\u003c/string\u003e\n    \u003ckey\u003eProgramArguments\u003c/key\u003e\n    \u003carray\u003e\n        \u003cstring\u003e/path/to/claude-code-mux/target/release/ccm\u003c/string\u003e\n        \u003cstring\u003estart\u003c/string\u003e\n    \u003c/array\u003e\n    \u003ckey\u003eRunAtLoad\u003c/key\u003e\n    \u003ctrue/\u003e\n    \u003ckey\u003eKeepAlive\u003c/key\u003e\n    \u003ctrue/\u003e\n    \u003ckey\u003eStandardOutPath\u003c/key\u003e\n    \u003cstring\u003e/tmp/ccm.log\u003c/string\u003e\n    \u003ckey\u003eStandardErrorPath\u003c/key\u003e\n    \u003cstring\u003e/tmp/ccm.error.log\u003c/string\u003e\n\u003c/dict\u003e\n\u003c/plist\u003e\n```\n\nThen:\n```bash\n# Load and start\nlaunchctl load ~/Library/LaunchAgents/com.ccm.plist\n\n# Stop\nlaunchctl unload ~/Library/LaunchAgents/com.ccm.plist\n\n# Check status\nlaunchctl list | grep ccm\n```\n\n### Other Commands\n\n```bash\n# Show version\nccm --version\n\n# Show help\nccm --help\n```\n\n## Supported Features\n\n- ✅ Full Anthropic API compatibility (`/v1/messages`)\n- ✅ Token counting endpoint (`/v1/messages/count_tokens`)\n- ✅ Extended thinking (Plan Mode support)\n- ✅ **Streaming responses** (SSE format)\n- ✅ System prompts (string and array formats)\n- ✅ Tool calling\n- ✅ Vision (image inputs)\n- ✅ **Auto-mapping** with regex patterns\n- ✅ **Provider failover** with priority-based routing\n- ✅ Auto-strip incompatible parameters for OpenAI models\n\n## Troubleshooting\n\n### Check if server is running\n```bash\ncurl http://127.0.0.1:13456/api/config/json\n```\n\n### Enable debug logging\nSet environment variable:\n```bash\nRUST_LOG=debug ccm start\n```\n\nOr update your config file (`~/.claude-code-mux/config.toml`):\n```toml\n[server]\nlog_level = \"debug\"\n```\n\n### Test routing directly\n```bash\ncurl -X POST http://127.0.0.1:13456/v1/messages \\\n  -H \"Content-Type: application/json\" \\\n  -H \"anthropic-version: 2023-06-01\" \\\n  -d '{\n    \"model\": \"minimax-m2\",\n    \"max_tokens\": 100,\n    \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]\n  }'\n```\n\n### View real-time logs\n```bash\n# If running with RUST_LOG\nRUST_LOG=info ccm start\n\n# Check system logs\ntail -f ~/.claude-code-mux/ccm.log\n```\n\n## Performance\n\n- **Memory**: ~6MB RAM (vs ~156MB for Node.js routers) - **25x more efficient**\n- **Startup**: \u003c100ms cold start\n- **Routing**: \u003c1ms overhead per request\n- **Throughput**: Handles 1000+ req/s on modern hardware\n- **Streaming**: Zero-copy SSE streaming with minimal latency\n\n## FAQ\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eDoes it work with my existing Claude Code setup?\u003c/b\u003e\u003c/summary\u003e\n\nYes! Just set two environment variables:\n```bash\nexport ANTHROPIC_BASE_URL=\"http://127.0.0.1:13456\"\nexport ANTHROPIC_API_KEY=\"any-string\"\nclaude\n```\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eWhat happens if all providers fail?\u003c/b\u003e\u003c/summary\u003e\n\nThe proxy returns an error response with details about the failover chain and which providers were attempted. Check the logs for debugging information.\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eCan I use this with Claude Pro/Max, ChatGPT Plus/Pro, or Google AI Pro/Ultra subscription?\u003c/b\u003e\u003c/summary\u003e\n\nYes! Claude Code Mux supports OAuth 2.0 authentication for all three providers:\n- **Claude Pro/Max**: Providers tab → Add Provider → Select \"Anthropic\" → Choose \"OAuth (Claude Pro/Max)\"\n- **ChatGPT Plus/Pro**: Providers tab → Add Provider → Select \"OpenAI\" → Choose \"OAuth (ChatGPT Plus/Pro)\"\n- **Google AI Pro/Ultra**: Providers tab → Add Provider → Select \"Google Gemini\" → Choose \"OAuth (Google AI Pro/Ultra)\"\n\nAll three provide **FREE unlimited API access** to subscribers!\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eHow do I add a new AI provider?\u003c/b\u003e\u003c/summary\u003e\n\n1. Navigate to the **Providers** tab in the admin UI\n2. Click **\"Add Provider\"**\n3. Select provider type (Anthropic-compatible or OpenAI-compatible)\n4. Enter provider name, API key, and base URL\n5. Click **\"Add Provider\"**\n6. Click **\"Save to Server\"**\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eWhy is my routing not working as expected?\u003c/b\u003e\u003c/summary\u003e\n\nCheck the routing order:\n1. **WebSearch** (highest priority) - if request has `web_search` tool\n2. **Subagent** - if system prompt contains `\u003cCCM-SUBAGENT-MODEL\u003e` tag\n3. **Think Mode** - if request has `thinking` field\n4. **Background** - if ORIGINAL model name matches background regex\n5. **Default** - fallback\n\nEnable debug logging with `RUST_LOG=debug ccm start` to see routing decisions.\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eHow do I report bugs or request features?\u003c/b\u003e\u003c/summary\u003e\n\n- **Bug reports**: [Open a GitHub issue](https://github.com/9j/claude-code-mux/issues/new)\n- **Feature requests**: [Start a discussion](https://github.com/9j/claude-code-mux/discussions)\n- **Security issues**: Email the maintainer (see GitHub profile)\n\u003c/details\u003e\n\n## Why Choose Claude Code Mux?\n\n### 🎯 Two Core Advantages\n\n#### 1. **Automatic Failover** 🔄\nPriority-based provider fallback - if your primary provider fails, automatically route to backup:\n\n```toml\n[[models]]\nname = \"glm-4.6\"\n\n[[models.mappings]]\nactual_model = \"glm-4.6\"\npriority = 1\nprovider = \"zai\"\n\n[[models.mappings]]\nactual_model = \"z-ai/glm-4.6\"\npriority = 2\nprovider = \"openrouter\"\n```\n\nIf `zai` fails → automatically falls back to `openrouter`. **No manual intervention needed.**\n\n\u003e **💡 Why This Matters**: Claude Code Router doesn't have failover - if a provider goes down, your workflow stops. With Claude Code Mux, you get uninterrupted coding even during provider outages.\n\n#### 2. **Simpler \u0026 More Efficient** ⚡️\n\n| Feature | Claude Code Router | Claude Code Mux |\n|---------|-------------------|----------------|\n| **UI Access** | `ccr ui` (separate launch) | Built-in at `http://localhost:13456` |\n| **Config Format** | JSON + Transformers | TOML (simpler) |\n| **Memory Usage** | ~156MB (Node.js) | ~6MB (Rust) - **25x lighter** |\n| **Failover** | ❌ Not supported | ✅ Priority-based automatic failover |\n| **Claude Pro/Max** | API Key only | ✅ OAuth 2.0 supported |\n| **Router Auto-save** | Manual save only | Auto-saves to localStorage |\n| **Config Sharing** | Share JSON file | Share URL (`?tab=router`) |\n\n### 💡 What This Means\n\n**Reliability**: Automatic failover keeps you coding when providers go down. (CCR lacks this)\n\n**Faster Setup**: Built-in UI (no `ccr ui` needed) + simpler TOML config.\n\n**Performance**: 25x more memory efficient (6MB vs 156MB).\n\n**Claude Pro/Max Compatible**: OAuth 2.0 authentication supported (CCR requires API key only).\n\n**Simplicity**: TOML is easier than JSON with complex transformer configurations.\n\n## Documentation\n\n- [Design Principles](docs/design-principles.md) - Claude Code Mux design philosophy and UX guidelines\n- [URL-based State Management](docs/url-state-management.md) - Admin UI URL-based state management pattern\n- [LocalStorage-based State Management](docs/localstorage-state-management.md) - Admin UI localStorage-based client state management\n\n## Changelog\n\nSee [CHANGELOG.md](CHANGELOG.md) for detailed release history or view [GitHub Releases](https://github.com/9j/claude-code-mux/releases) for downloads.\n\n## Contributing\n\nWe love contributions! Here's how you can help:\n\n### 🐛 Report Bugs\nFound a bug? [Open an issue](https://github.com/9j/claude-code-mux/issues/new) with:\n- Clear description of the problem\n- Steps to reproduce\n- Expected vs actual behavior\n- Your environment (OS, Rust version)\n\n### 💡 Suggest Features\nHave an idea? [Start a discussion](https://github.com/9j/claude-code-mux/discussions) or open an issue with:\n- Use case description\n- Proposed solution\n- Alternative approaches considered\n\n### 🔧 Submit Pull Requests\n1. Fork the repository\n2. Create a feature branch (`git checkout -b feature/amazing-feature`)\n3. Make your changes\n4. Run tests: `cargo test`\n5. Run formatting: `cargo fmt`\n6. Run linting: `cargo clippy`\n7. Commit with clear message\n8. Push and create a Pull Request\n\n### 📝 Improve Documentation\n- Fix typos or unclear explanations\n- Add examples or use cases\n- Translate docs to other languages\n- Create tutorials or guides\n\n### 🌟 Support the Project\n- Star the repo on GitHub\n- Share with others who might benefit\n- Write blog posts or create videos\n- Join discussions and help other users\n\nSee [CONTRIBUTING.md](CONTRIBUTING.md) for detailed guidelines.\n\n## License\n\nMIT License - see [LICENSE](LICENSE)\n\n## Acknowledgments\n\n- [claude-code-router](https://github.com/musistudio/claude-code-router) - Original TypeScript implementation inspiration\n- [Anthropic](https://anthropic.com) - Claude API\n- Rust community for amazing tools and libraries\n\n---\n\n**Made with ⚡️ in Rust**\n","funding_links":[],"categories":["\u003ca name=\"Rust\"\u003e\u003c/a\u003eRust"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2F9j%2Fclaude-code-mux","html_url":"https://awesome.ecosyste.ms/projects/github.com%2F9j%2Fclaude-code-mux","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2F9j%2Fclaude-code-mux/lists"}