# LLM Services For Conversational AI CloudBase

This repository implements several CloudBase Cloud Run (Functions 2.0) services dedicated to handling LLM (Large Language Model) requests, making it easy to integrate TRTC-AI conversations. Each service has its own specific role.

## Service Overview

### 1. LLM Context Manager

The basic LLM conversation service (`cloudrunfunctions/llm-context-manager/index.mjs`) provides the following core features:

- Basic conversation context management
- Streaming responses (SSE)
- Storage and cleanup of conversation history
- A configurable limit on the number of context messages
- Custom system prompts

### 2. LLM Fast Reply

A fast-response service (`cloudrunfunctions/llm-fast-reply/index.mjs`) that adds progressive responses on top of the standard LLM service:

- Quick preliminary replies from a small model
- Uses a smaller model to generate an initial answer
- Starts the large model in parallel to generate the complete answer
- Progressive responses can be enabled or disabled
- Custom small-model parameters (temperature, max tokens, etc.)

### 3. LLM Tools

An LLM service with integrated tool calling (`cloudrunfunctions/llm-tools/index.mjs`):

- Weather-lookup tool
- Automatic handling of tool calls and their responses
- Tool use across multi-turn conversations
- Integration of tool-call results into the conversation context

### 4. LLM RAG

A service implementing retrieval-augmented generation (RAG) (`cloudrunfunctions/llm-rag/index.mjs`):

- Document embedding and storage
- Similarity search
- Context augmentation
- Document citation and source tracing
- Configurable similarity threshold and maximum document count

## Shared Features

All services provide the following base functionality:

- Streaming output (SSE)
- Conversation context management
- Configurable model parameters
- Error handling and logging
- Custom API endpoints and keys
- Session isolation by task ID

## Configuration

The main configuration parameters are:

- `LLM_BASE_URL`: base URL of the API
- `LLM_API_KEY`: API access key
- `LLM_MODEL`: default model to use
- `MAX_CONTEXT_MESSAGES`: maximum number of context messages
- `CONTEXT_EXPIRY_HOURS`: context expiry time in hours

## Usage

Each service is invoked via HTTP request and accepts the following parameters:

- `messages`: array of conversation messages
- `model`: optional model override
- `taskId`: session identifier
- Service-specific extras (e.g. the similarity threshold for RAG)

Responses are streamed back via Server-Sent Events (SSE), so clients need to handle the streamed data accordingly.

Note: before running the services, make sure the environment variables are configured correctly, especially `LLM_API_KEY` and `LLM_BASE_URL`.
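As a rough illustration of the request shape described above, the helper below assembles a request body with `messages`, `taskId`, and an optional `model` override. The helper name, the commented endpoint URL, and the `LLM_SERVICE_URL` variable are illustrative assumptions, not part of the repository's API.

```javascript
// Hypothetical helper that assembles the request body described in the
// Usage section. The endpoint path below is an assumption for illustration.
function buildChatRequest({ messages, taskId, model, extra = {} }) {
  if (!Array.isArray(messages) || messages.length === 0) {
    throw new Error("messages must be a non-empty array");
  }
  const body = { messages, taskId, ...extra };
  if (model) body.model = model; // optional model override
  return body;
}

const body = buildChatRequest({
  messages: [{ role: "user", content: "What's the weather in Shenzhen?" }],
  taskId: "session-42",
});

// POST it to the deployed cloud function, e.g. with fetch
// (URL is illustrative; use your actual CloudBase function address):
// await fetch(`${process.env.LLM_SERVICE_URL}/llm-tools`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
```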
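Since responses arrive as an SSE stream, the client must buffer chunks and split them into events. The sketch below assumes (the README does not specify this) that the services emit standard `data:` lines carrying OpenAI-style JSON deltas and that the stream ends with `data: [DONE]`; adjust the payload handling to the actual event format.

```javascript
// Minimal SSE event parser, assuming OpenAI-style `data:` JSON deltas
// terminated by `data: [DONE]` (an assumption, not confirmed by the README).
function parseSSE(buffer) {
  // SSE events are separated by a blank line ("\n\n").
  const parts = buffer.split("\n\n");
  const rest = parts.pop(); // possibly incomplete trailing event, keep buffered
  const chunks = [];
  let done = false;
  for (const part of parts) {
    for (const line of part.split("\n")) {
      if (!line.startsWith("data:")) continue; // skip comments and other fields
      const data = line.slice(5).trim();
      if (data === "[DONE]") { done = true; continue; }
      chunks.push(JSON.parse(data));
    }
  }
  return { chunks, rest, done };
}

// Example: two complete events plus one partial event still in the buffer.
const sample =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n\n' +
  'data: {"choices"';
const { chunks, rest, done } = parseSSE(sample);
const text = chunks.map(c => c.choices[0].delta.content).join("");
// text is "Hello"; `rest` holds the incomplete event; `done` is false
```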
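The effect of `MAX_CONTEXT_MESSAGES` can be sketched as a trimming step: keep any system prompt and only the most recent N conversation messages. This is a guess at the likely behavior, not the service's actual implementation.

```javascript
// Sketch of the context trimming implied by MAX_CONTEXT_MESSAGES
// (assumed behavior: system prompts survive, older turns are dropped).
function trimContext(messages, maxContextMessages) {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");
  // Keep only the most recent maxContextMessages non-system messages.
  return [...system, ...rest.slice(-maxContextMessages)];
}

const history = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "turn 1" },
  { role: "assistant", content: "reply 1" },
  { role: "user", content: "turn 2" },
];
const trimmed = trimContext(history, 2);
// trimmed keeps the system prompt plus the last two messages
```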