# CJ2API

A Cloudflare Worker that converts [ChatJimmy](https://chatjimmy.ai) into an OpenAI-compatible API.

Deploy it to Cloudflare Workers in one step and you get a standard `/v1/chat/completions` endpoint, compatible with every client and framework that speaks the OpenAI API. No API key required.

## Features

- **OpenAI-compatible**: standard Chat Completions API format, with streaming (SSE) and non-streaming responses
- **Zero-cost deployment**: runs on the Cloudflare Workers free tier
- **Built-in test page**: open the root path to test in the browser, with cURL / Python / Node.js examples
- **Token statistics**: responses include a `usage` field; the test page shows output speed in real time
- **Minimal code**: pure TypeScript, with no third-party runtime dependencies

## Quick Start

### Prerequisites

- [Node.js](https://nodejs.org/) 18+
- A [Cloudflare account](https://dash.cloudflare.com/sign-up) (the free plan is enough)

### Option 1: Clone from GitHub (recommended)

```bash
git clone https://github.com/qingchencloud/cj2api.git
cd cj2api
npm install
npx wrangler login    # first-time Cloudflare login
npm run deploy
```

### Option 2: Install from npm

```bash
npm install @qingchencloud/cj2api
cd node_modules/@qingchencloud/cj2api
npx wrangler login    # first-time Cloudflare login
npm run deploy
```

After deployment, Wrangler prints your Worker URL, of the form `https://cj2api.<your-subdomain>.workers.dev`.

> **Access from mainland China:** `*.workers.dev` domains require a proxy to reach from mainland China. If you have a domain hosted on Cloudflare, you can bind a custom domain under Dashboard → Workers → cj2api → Settings → Domains & Routes; traffic then goes through the CDN and is directly reachable.

> **Tip:** If a client insists on an API key, enter any arbitrary string.

## API Reference

### POST `/v1/chat/completions`

Standard OpenAI Chat Completions endpoint, supporting streaming and non-streaming responses.

**Request body:**

```json
{
  "model": "llama3.1-8B",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant" },
    { "role": "user", "content": "Hello" }
  ],
  "stream": false,
  "top_k": 8
}
```

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `model` | string | No | Model name, defaults to `llama3.1-8B` |
| `messages` | array | Yes | Message list; supports the `system` / `user` / `assistant` roles |
| `stream` | boolean | No | Enable streaming output, defaults to `false` |
| `top_k` | number | No | Top-K sampling parameter, defaults to `8` |

**Non-streaming response:**

```json
{
  "id": "chatcmpl-xxxx",
  "object": "chat.completion",
  "created": 1740000000,
  "model": "llama3.1-8B",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Hello! How can I help you?" },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 85,
    "total_tokens": 97
  }
}
```

**Streaming response (SSE):**

When `stream: true`, the endpoint returns `text/event-stream`:

```
data: {"id":"chatcmpl-xxxx","object":"chat.completion.chunk","created":1740000000,"model":"llama3.1-8B","choices":[{"index":0,"delta":{"role":"assistant","content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-xxxx","object":"chat.completion.chunk","created":1740000000,"model":"llama3.1-8B","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: {"id":"chatcmpl-xxxx","object":"chat.completion.chunk","created":1740000000,"model":"llama3.1-8B","choices":[{"index":0,"delta":{},"finish_reason":"stop"}],"usage":{"prompt_tokens":12,"completion_tokens":85,"total_tokens":97}}

data: [DONE]
```

### GET `/v1/models`

Returns the list of available models.

```json
{
  "object": "list",
  "data": [
    { "id": "llama3.1-8B", "object": "model", "owned_by": "system" }
  ]
}
```

## Usage Examples

### cURL

```bash
curl -X POST https://your-domain/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
  "model": "llama3.1-8B",
  "messages": [{"role": "user", "content": "Hello"}],
  "stream": false
}'
```

### Python

```python
import requests

resp = requests.post(
    "https://your-domain/v1/chat/completions",
    json={
        "model": "llama3.1-8B",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False
    }
)
print(resp.json()["choices"][0]["message"]["content"])
```

### Node.js

```javascript
const resp = await fetch("https://your-domain/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.1-8B",
    messages: [{ role: "user", content: "Hello" }],
    stream: false
  })
});
const data = await resp.json();
console.log(data.choices[0].message.content);
```

### OpenAI SDK (Python)

The endpoint is fully compatible with the OpenAI API format, so the official SDK works directly:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-domain/v1",
    api_key="any-string"  # no real API key needed
)

response = client.chat.completions.create(
    model="llama3.1-8B",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)
```

## Local Development

```bash
git clone https://github.com/qingchencloud/cj2api.git
cd cj2api
npm install
npm run dev
# listens on http://localhost:8787 by default
```

During local development you can pair it with [cftunnel](https://github.com/qingchencloud/cftunnel) to expose the local server to the public internet, which is handy for remote debugging or sharing with testers:

```bash
# in another terminal, create a temporary public URL
cftunnel quick 8787
# prints something like: https://xxx-xxx-xxx.trycloudflare.com
```

To bind your own domain:

```bash
cftunnel init
cftunnel create my-api
cftunnel add api 8787 --domain api.example.com
cftunnel up
# stable access via https://api.example.com/v1/chat/completions
```

## How It Works

```
Client (OpenAI SDK / curl / any HTTP)
  │
  │  POST /v1/chat/completions
  │  standard OpenAI request format
  ▼
┌─────────────────────────┐
│   Cloudflare Worker     │
│                         │
│  1. Parse request body  │
│  2. Extract system msgs │
│  3. Convert to upstream │
│  4. Forward to ChatJimmy│
│  5. Parse reply + stats │
│  6. Wrap as OpenAI fmt  │
└─────────────────────────┘
  │
  │  ChatJimmy private protocol
  ▼
┌─────────────────────────┐
│   chatjimmy.ai/api/chat │
│   plain text + stats    │
└─────────────────────────┘
```

### Protocol Conversion Details

**Request conversion:** The client sends standard OpenAI format, and the Worker converts it to ChatJimmy's private format:

- `system` messages in `messages` are extracted into `chatOptions.systemPrompt`
- `model` maps to `chatOptions.selectedModel`
- `top_k` maps to `chatOptions.topK`

**Response parsing:** ChatJimmy returns plain text with a statistics block appended at the end:

```
This is the reply content...<|stats|>{"prefill_tokens":12,"decode_tokens":85,"total_tokens":97}<|/stats|>
```

The Worker extracts the content and the statistics and wraps them in the standard OpenAI response format.

**Simulated streaming:** The upstream does not support real SSE streaming, so the Worker uses a "pseudo-streaming" strategy: it fetches the complete response first, splits the content into small chunks at natural break points (spaces, punctuation, newlines), and pushes the chunks to the client one by one as SSE `data:` events.

## Project Structure

```
cj2api/
├── src/
│   ├── index.ts        # entry point, request routing
│   ├── chat.ts         # Chat Completions handler (streaming/non-streaming)
│   ├── models.ts       # model list endpoint
│   ├── upstream.ts     # upstream ChatJimmy request conversion
│   ├── page.ts         # built-in test page
│   ├── types.ts        # TypeScript type definitions
│   └── utils.ts        # utilities (ID generation, response parsing, etc.)
├── .claude/
│   └── skills/             # Claude Code maintenance Skills
│       ├── release/SKILL.md    # release process
│       ├── deploy/SKILL.md     # deployment process
│       └── update-page/SKILL.md # test page maintenance
├── wrangler.toml       # Cloudflare Workers configuration
├── tsconfig.json       # TypeScript configuration
├── package.json
└── README.md
```

## Claude Code Skills

The project ships with [Claude Code](https://docs.anthropic.com/en/docs/claude-code) maintenance Skills so developers can maintain it with AI assistance:

| Skill | Description |
|-------|-------------|
| `/release` | Release a version: bump the version number → npm publish → git tag → push |
| `/deploy` | Deploy to Cloudflare Workers |
| `/update-page` | Maintain the built-in test page (page.ts) |

Open the project directory in Claude Code and type the corresponding Skill name to use it.

## Disclaimer

This project is intended for **study, research, and technical testing** only; do not use it for any commercial purpose.

- The project is a protocol-conversion wrapper around the public [ChatJimmy](https://chatjimmy.ai) endpoint and provides no model capabilities of its own
- Users must comply with ChatJimmy's terms of service and usage policies
- Do not use the project for high-volume requests, automated scraping, or any other behavior that could burden the upstream service
- Upstream availability, response quality, and model capability are provided by ChatJimmy and are unrelated to this project
- The author accepts no liability for any direct or indirect losses arising from use of this project
- If changes to the upstream terms of service render the project unusable, the author assumes no obligations

## License

[MIT](LICENSE) © QingChen Cloud