# AI SDK CPP

The AI SDK CPP is a modern C++ toolkit designed to help you build AI-powered applications with popular model providers like OpenAI and Anthropic.
It provides a unified, easy-to-use API that abstracts away the complexity of different provider implementations.

## Motivation

C++ developers have long lacked a first-class, convenient way to interact with modern AI services like OpenAI, Anthropic, and others. AI SDK CPP bridges this gap by providing:

- **Unified API**: Work with multiple AI providers through a single, consistent interface
- **Modern C++**: Built with C++20 features for clean, expressive code
- **Minimal Dependencies**: Few external dependencies, for easy integration

## Installation

You will need a C++20-compatible compiler and CMake 3.16 or newer installed on your development machine.

## Usage

### Core API

The AI SDK CPP Core module provides a unified API to interact with model providers like OpenAI and Anthropic.

#### OpenAI Integration

```cpp
#include <ai/openai.h>
#include <ai/generate.h>
#include <iostream>

int main() {
    // Ensure the OPENAI_API_KEY environment variable is set
    auto client = ai::openai::create_client();

    auto result = client.generate_text({
        .model = ai::openai::models::kGpt4o,  // this can also be a string like "gpt-4o"
        .system = "You are a friendly assistant!",
        .prompt = "Why is the sky blue?"
    });

    if (result) {
        std::cout << result->text << std::endl;
    }

    return 0;
}
```

#### Anthropic Integration

```cpp
#include <ai/anthropic.h>
#include <ai/generate.h>
#include <iostream>

int main() {
    // Ensure the ANTHROPIC_API_KEY environment variable is set
    auto client = ai::anthropic::create_client();
    auto result = client.generate_text({
        .model = ai::anthropic::models::kClaude35Sonnet,
        .system = "You are a helpful assistant.",
        .prompt = "Explain quantum computing in simple terms."
    });

    if (result) {
        std::cout << result->text << std::endl;
    }

    return 0;
}
```

#### Streaming Responses

```cpp
#include <ai/openai.h>
#include <ai/stream.h>
#include <iostream>

int main() {
    auto client = ai::openai::create_client();

    auto stream = client.stream_text({
        .model = ai::openai::models::kGpt4o,  // this can also be a string like "gpt-4o"
        .system = "You are a helpful assistant.",
        .prompt = "Write a short story about a robot."
    });

    for (const auto& chunk : stream) {
        if (chunk.text) {
            std::cout << chunk.text.value() << std::flush;
        }
    }

    return 0;
}
```

#### Multi-turn Conversations

```cpp
#include <ai/openai.h>
#include <ai/generate.h>
#include <iostream>

int main() {
    auto client = ai::openai::create_client();

    ai::Messages messages = {
        {"system", "You are a helpful math tutor."},
        {"user", "What is 2 + 2?"},
        {"assistant", "2 + 2 equals 4."},
        {"user", "Now what is 4 + 4?"}
    };

    auto result = client.generate_text({
        .model = ai::openai::models::kGpt4o,  // this can also be a string like "gpt-4o"
        .messages = messages
    });

    if (result) {
        std::cout << result->text << std::endl;
    }

    return 0;
}
```

#### Tool Calling

The AI SDK CPP supports function calling, allowing models to interact with external systems and APIs.

```cpp
#include <ai/openai.h>
#include <ai/generate.h>
#include <ai/tools.h>
#include <iostream>

// Define a tool function
ai::JsonValue get_weather(const ai::JsonValue& args, const ai::ToolExecutionContext& context) {
    std::string location = args["location"].get<std::string>();

    // Your weather API logic here
    return ai::JsonValue{
        {"location", location},
        {"temperature", 72},
        {"condition", "Sunny"}
    };
}

int main() {
    auto client = ai::openai::create_client();

    // Create tools
    ai::ToolSet tools = {
        {"weather", ai::create_simple_tool(
            "weather",
            "Get current weather for a location",
            {{"location", "string"}},
            get_weather
        )}
    };

    auto result = client.generate_text({
        .model = ai::openai::models::kGpt4o,
        .prompt = "What's the weather like in San Francisco?",
        .tools = tools,
        .max_steps = 3  // Enable multi-step tool calling
    });

    if (result) {
        std::cout << result->text << std::endl;

        // Inspect tool calls and results
        for (const auto& call : result->tool_calls) {
            std::cout << "Tool: " << call.tool_name
                      << ", Args: " << call.arguments.dump() << std::endl;
        }
    }

    return 0;
}
```

#### Async Tool Calling

For long-running operations, you can define asynchronous tools:

```cpp
#include <ai/openai.h>
#include <ai/generate.h>
#include <ai/tools.h>
#include <chrono>
#include <ctime>
#include <future>
#include <thread>

// Async tool that returns a future
std::future<ai::JsonValue> fetch_data_async(const ai::JsonValue& args, const ai::ToolExecutionContext& context) {
    return std::async(std::launch::async, [args]() {
        // Simulate async operation
        std::this_thread::sleep_for(std::chrono::seconds(1));

        return ai::JsonValue{
            {"data", "Fetched from API"},
            {"timestamp", std::time(nullptr)}
        };
    });
}

int main() {
    auto client = ai::openai::create_client();

    ai::ToolSet tools = {
        {"fetch_data", ai::create_simple_async_tool(
            "fetch_data",
            "Fetch data from external API",
            {{"endpoint", "string"}},
            fetch_data_async
        )}
    };

    // Multiple async tools will execute in parallel
    auto result = client.generate_text({
        .model = ai::openai::models::kGpt4o,
        .prompt = "Fetch data from the user and product APIs",
        .tools = tools
    });

    return 0;
}
```

#### Custom Retry Configuration

Configure retry behavior for handling transient failures:

```cpp
#include <ai/openai.h>
#include <ai/retry/retry_policy.h>

int main() {
    // Configure custom retry behavior
    ai::retry::RetryConfig retry_config;
    retry_config.max_retries = 5;        // More retries for unreliable networks
    retry_config.initial_delay = std::chrono::milliseconds(1000);
    retry_config.backoff_factor = 1.5;   // Gentler backoff

    // Create client with custom retry configuration
    auto client = ai::openai::create_client(
        "your-api-key",
        "https://api.openai.com",
        retry_config
    );

    // The client will automatically retry on transient failures:
    // - Network errors
    // - HTTP 408, 409, 429 (rate limits), and 5xx errors
    auto result = client.generate_text({
        .model = ai::openai::models::kGpt4o,
        .prompt = "Hello, world!"
    });

    return 0;
}
```

#### Using OpenAI-Compatible APIs (OpenRouter, etc.)

The OpenAI client can be used with any OpenAI-compatible API by specifying a custom base URL.
This allows you to use alternative providers like OpenRouter, which offers access to multiple models through a unified API.

```cpp
#include <ai/openai.h>
#include <ai/generate.h>
#include <iostream>
#include <cstdlib>

int main() {
    // Get API key from environment variable
    const char* api_key = std::getenv("OPENROUTER_API_KEY");
    if (!api_key) {
        std::cerr << "Please set OPENROUTER_API_KEY environment variable\n";
        return 1;
    }

    // Create client with OpenRouter's base URL
    auto client = ai::openai::create_client(
        api_key,
        "https://openrouter.ai/api"  // OpenRouter's OpenAI-compatible endpoint
    );

    // Use any model available on OpenRouter
    auto result = client.generate_text({
        .model = "anthropic/claude-3.5-sonnet",  // or "meta-llama/llama-3.1-8b-instruct", etc.
        .system = "You are a helpful assistant.",
        .prompt = "What are the benefits of using OpenRouter?"
    });

    if (result) {
        std::cout << result->text << std::endl;
    }

    return 0;
}
```

This approach works with any OpenAI-compatible API provider. Simply provide:

1. Your provider's API key
2. The provider's base URL endpoint
3. Model names as specified by your provider

See the [OpenRouter example](examples/openrouter_example.cpp) for a complete demonstration.

## Features

### Currently Supported

- ✅ **Text Generation**: Generate text completions with OpenAI and Anthropic models
- ✅ **Streaming**: Real-time streaming of generated content
- ✅ **Multi-turn Conversations**: Support for conversation history
- ✅ **Error Handling**: Comprehensive error handling with optional types

### Recently Added

- ✅ **Tool Calling**: Function calling and tool integration with multi-step support
- ✅ **Async Tools**: Asynchronous tool execution with parallel processing
- ✅ **Configurable Retries**: Customizable retry behavior with exponential backoff

### Coming Soon

- 🚧 **Additional Providers**: Google, Cohere, and other providers
- 🚧 **Embeddings**: Text embedding support
- 🚧 **Image Generation**: Support for image generation models

## Examples

Check out our [examples directory](examples/) for more comprehensive usage examples:

- [Basic Chat Application](examples/basic_chat.cpp)
- [Streaming Chat](examples/streaming_chat.cpp)
- [Multi-provider Comparison](examples/multi_provider.cpp)
- [Error Handling](examples/error_handling.cpp)
- [Retry Configuration](examples/retry_config_example.cpp)
- [Basic Tool Calling](examples/tool_calling_basic.cpp)
- [Multi-Step Tool Workflows](examples/tool_calling_multistep.cpp)
- [Async Tool Execution](examples/tool_calling_async.cpp)
- [OpenRouter Integration](examples/openrouter_example.cpp) - Using OpenAI-compatible APIs

## Requirements

- **C++ Standard**: C++20 or higher
- **CMake**: 3.16 or higher

## Dependencies and Modifications

### nlohmann/json (Patched)

This project uses a patched version of nlohmann/json to remove the dependency on `localeconv()`, which is not thread-safe.
The patch ensures:

- **Thread Safety**: Eliminates calls to the non-thread-safe `localeconv()` function, allowing downstream users to safely use the library in multi-threaded environments without worrying about locale-related race conditions
- **Consistent Behavior**: Always uses '.' as the decimal point separator regardless of system locale
- **Simplified Integration**: Downstream users don't need to implement locale synchronization or worry about thread safety issues

This modification improves both the safety and the portability of the JSON library in concurrent applications.

## Acknowledgments

Inspired by the excellent [Vercel AI SDK](https://github.com/vercel/ai) for TypeScript/JavaScript developers.
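Since the Installation section above lists only the toolchain prerequisites (a C++20 compiler and CMake 3.16+), here is a minimal sketch of how a CMake-based project could pull the library in via `FetchContent`. This is an assumption-laden illustration, not a documented integration path: the linkable target name `ai-sdk-cpp` is a guess, so check the repository's own CMakeLists.txt for the actual target and any additional options.

```cmake
cmake_minimum_required(VERSION 3.16)
project(my_ai_app CXX)

# Match the SDK's requirement of C++20 or higher
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

include(FetchContent)
FetchContent_Declare(
  ai-sdk-cpp
  GIT_REPOSITORY https://github.com/ClickHouse/ai-sdk-cpp.git
  GIT_TAG main  # prefer pinning a specific commit or tag for reproducible builds
)
FetchContent_MakeAvailable(ai-sdk-cpp)

add_executable(my_ai_app main.cpp)
# Hypothetical target name -- consult the project's CMakeLists.txt
target_link_libraries(my_ai_app PRIVATE ai-sdk-cpp)
```

With this in place, `main.cpp` can use the `#include <ai/openai.h>` style headers shown in the examples above.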