{"id":13675990,"url":"https://github.com/olrea/openai-cpp","last_synced_at":"2025-04-28T23:31:32.192Z","repository":{"id":65366917,"uuid":"590851637","full_name":"olrea/openai-cpp","owner":"olrea","description":"OpenAI C++ is a community-maintained library for the Open AI API","archived":false,"fork":false,"pushed_at":"2024-04-14T12:59:47.000Z","size":359,"stargazers_count":187,"open_issues_count":10,"forks_count":65,"subscribers_count":13,"default_branch":"main","last_synced_at":"2024-08-03T12:16:11.887Z","etag":null,"topics":["artificial-intelligence","chatgpt","chatgpt-api","chatgpt3","cpp","cpp11","machine-learning","openai","openai-api"],"latest_commit_sha":null,"homepage":"https://openai.com/api/","language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/olrea.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.md","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null}},"created_at":"2023-01-19T11:12:43.000Z","updated_at":"2024-08-02T07:39:13.000Z","dependencies_parsed_at":"2024-01-14T14:31:19.841Z","dependency_job_id":"b6ff50a1-6ff6-46ce-97d1-51c77354da05","html_url":"https://github.com/olrea/openai-cpp","commit_stats":null,"previous_names":[],"tags_count":4,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/olrea%2Fopenai-cpp","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/olrea%2Fopenai-cpp/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/olrea%2Fopenai-cpp/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/olrea%2Fopenai-cpp/manifests","owner_url":"https://repos.ec
osyste.ms/api/v1/hosts/GitHub/owners/olrea","download_url":"https://codeload.github.com/olrea/openai-cpp/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":224137308,"owners_count":17261993,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["artificial-intelligence","chatgpt","chatgpt-api","chatgpt3","cpp","cpp11","machine-learning","openai","openai-api"],"created_at":"2024-08-02T12:01:07.474Z","updated_at":"2024-11-11T16:31:08.849Z","avatar_url":"https://github.com/olrea.png","language":"C++","readme":"# OpenAI C++ library\n\n[![Language](https://img.shields.io/badge/language-C++-blue.svg)](https://isocpp.org/)  [![Standard](https://img.shields.io/badge/c%2B%2B-11-blue.svg)](https://en.wikipedia.org/wiki/C%2B%2B#Standardization) [![License](https://img.shields.io/github/license/mashape/apistatus.svg)](https://opensource.org/licenses/MIT) ![GitHub workflow](https://github.com/olrea/openai-cpp/actions/workflows/cmake.yml/badge.svg)\n [![GitHub version](https://badge.fury.io/gh/olrea%2Fopenai-cpp.svg)](https://github.com/olrea/openai-cpp/releases)\n\n## A lightweight header-only modern C++ library\n\nThe OpenAI-C++ library is **community-maintained** and provides convenient access to the [OpenAI API](https://openai.com/api/) from applications written in C++.  \nThe library is small, with two header files (only one if you already use Nlohmann Json).\n\n## Requirements\n\nNo special requirements; you should already have these:\n\n+ C++11/C++14/C++17/C++20 compatible compiler\n+ [libcurl](https://curl.se/libcurl/) (check [Install curl](https://everything.curl.dev/get) to make sure you have the development package)\n\n## OpenAI C++ current implementation\n\nThe library aims to implement all requests listed in the [OpenAI API reference](https://platform.openai.com/docs/api-reference). If any are missing (due to an update), feel free to open an issue.\n\n| API reference | Method | Example file |\n| --- | --- | --- |\n| API models | [List models](https://platform.openai.com/docs/api-reference/models/list) ✅ | [1-model.cpp](examples/01-model.cpp) |\n| API models | [Retrieve model](https://platform.openai.com/docs/api-reference/models/retrieve) ✅ | [1-model.cpp](examples/01-model.cpp) |\n| API completions | [Create completion](https://platform.openai.com/docs/api-reference/completions/create) ✅ | [2-completion.cpp](examples/02-completion.cpp) |\n| API edits | [Create edit](https://platform.openai.com/docs/api-reference/edits/create) ✅ | [3-edit.cpp](examples/03-edit.cpp) |\n| API images | [Create image](https://platform.openai.com/docs/api-reference/images) ✅ | [4-image.cpp](examples/04-image.cpp) |\n| API images | [Create image edit](https://platform.openai.com/docs/api-reference/images/create-edit) ✅ | [4-image.cpp](examples/04-image.cpp) |\n| API images | [Create image variation](https://platform.openai.com/docs/api-reference/images/create-variation) ✅ | [4-image.cpp](examples/04-image.cpp) |\n| API embeddings | [Create embeddings](https://platform.openai.com/docs/api-reference/embeddings/create) ✅ | [5-embedding.cpp](examples/05-embedding.cpp) |\n| API files | [List files](https://platform.openai.com/docs/api-reference/files/list) ✅ | [6-file.cpp](examples/06-file.cpp) |\n| API files | [Upload file](https://platform.openai.com/docs/api-reference/files/upload) ✅ | [6-file.cpp](examples/06-file.cpp) |\n| API files | [Delete 
file](https://platform.openai.com/docs/api-reference/files/delete) ✅ | [6-file.cpp](examples/06-file.cpp) |\n| API files | [Retrieve file](https://platform.openai.com/docs/api-reference/files/retrieve) ✅ | [6-file.cpp](examples/06-file.cpp) |\n| API files | [Retrieve file content](https://platform.openai.com/docs/api-reference/files/retrieve-content) ✅ | [6-file.cpp](examples/06-file.cpp) |\n| API fine-tunes | [Create fine-tune](https://platform.openai.com/docs/api-reference/fine-tunes/create) ✅ | [7-fine-tune.cpp](examples/07-fine-tune.cpp) |\n| API fine-tunes | [List fine-tunes](https://platform.openai.com/docs/api-reference/fine-tunes/list) ✅ | [7-fine-tune.cpp](examples/07-fine-tune.cpp) |\n| API fine-tunes | [Retrieve fine-tune](https://platform.openai.com/docs/api-reference/fine-tunes/retrieve) ✅ | [7-fine-tune.cpp](examples/07-fine-tune.cpp) |\n| API fine-tunes | [Cancel fine-tune](https://platform.openai.com/docs/api-reference/fine-tunes/cancel) ✅ | [7-fine-tune.cpp](examples/07-fine-tune.cpp) |\n| API fine-tunes | [List fine-tune events](https://platform.openai.com/docs/api-reference/fine-tunes/events) ✅ | [7-fine-tune.cpp](examples/07-fine-tune.cpp) |\n| API fine-tunes | [Delete fine-tune model](https://platform.openai.com/docs/api-reference/fine-tunes/delete-model) ✅ | [7-fine-tune.cpp](examples/07-fine-tune.cpp) |\n| API chat | [Create chat completion](https://platform.openai.com/docs/api-reference/chat/create) ✅ | [10-chat.cpp](examples/10-chat.cpp) |\n| API audio | [Create transcription](https://platform.openai.com/docs/api-reference/audio/create) ✅ | [11-audio.cpp](examples/11-audio.cpp) |\n| API audio | [Create translation](https://platform.openai.com/docs/api-reference/audio/create) ✅ | [11-audio.cpp](examples/11-audio.cpp) |\n| API moderation | [Create moderation](https://platform.openai.com/docs/api-reference/moderations/create) ✅ | [12-moderation.cpp](examples/12-moderation.cpp) |\n\n## Installation\n\nThe library consists of two files: 
[include/openai/openai.hpp](https://github.com/olrea/openai-cpp/blob/main/include/openai/openai.hpp) and [include/openai/nlohmann/json.hpp](https://github.com/olrea/openai-cpp/blob/main/include/openai/nlohmann/json.hpp).  \nJust copy the [include/openai](https://github.com/olrea/openai-cpp/tree/main/include/openai) folder into your project and you can use `#include \"openai.hpp\"` in your code. That is all.  \n\n\u003e Note: **OpenAI-CPP** uses [Nlohmann Json](https://github.com/nlohmann/json), which is available in `include/openai/nlohmann/json.hpp`. Feel free to use your own copy to reduce compile times.\n\n## Usage\n\n### Simple showcase\n\nThe library needs to be configured with your account's secret key, which is available on the [website](https://platform.openai.com/account/api-keys). It is recommended to set the `OPENAI_API_KEY` environment variable before using the library (or you can set the API key directly in the code):\n\n```bash\nexport OPENAI_API_KEY='sk-...'\n```\n\nThe following code is available at [examples/00-showcase.cpp](examples/00-showcase.cpp).\n\n```cpp\n#include \"openai.hpp\"\n#include \u003ciostream\u003e\n\nint main() {\n    openai::start(); // Will use the API key provided by the `OPENAI_API_KEY` environment variable\n    // openai::start(\"your_API_key\", \"optional_organization\"); // Or you can handle it yourself\n\n    auto completion = openai::completion().create(R\"({\n        \"model\": \"text-davinci-003\",\n        \"prompt\": \"Say this is a test\",\n        \"max_tokens\": 7,\n        \"temperature\": 0\n    })\"_json); // Using user-defined (raw) string literals\n    std::cout \u003c\u003c \"Response is:\\n\" \u003c\u003c completion.dump(2) \u003c\u003c '\\n';\n\n    auto image = openai::image().create({\n        { \"prompt\", \"A cute koala playing the violin\" },\n        { \"n\", 1 },\n        { \"size\", \"512x512\" }\n    }); // Using initializer lists\n    std::cout \u003c\u003c \"Image URL is: \" \u003c\u003c 
image[\"data\"][0][\"url\"] \u003c\u003c '\\n';\n}\n```\n\nThe output received looks like:\n\n```bash\n\u003e\u003e request: https://api.openai.com/v1/completions  {\"max_tokens\":7,\"model\":\"text-davinci-003\",\"prompt\":\"Say this is a test\",\"temperature\":0}\nResponse is:\n{\n  \"choices\": [\n    {\n      \"finish_reason\": \"length\",\n      \"index\": 0,\n      \"logprobs\": null,\n      \"text\": \"\\n\\nThis is indeed a test\"\n    }\n  ],\n  \"created\": 1674121840,\n  \"id\": \"cmpl-6aLr6jPhtxpLyu9rNsJFKDHU3SHpe\",\n  \"model\": \"text-davinci-003\",\n  \"object\": \"text_completion\",\n  \"usage\": {\n    \"completion_tokens\": 7,\n    \"prompt_tokens\": 5,\n    \"total_tokens\": 12\n  }\n}\n\u003e\u003e request: https://api.openai.com/v1/images/generations  {\"n\":1,\"prompt\":\"A cute koala playing the violin\",\"size\":\"512x512\"}\nImage URL is: \"https://oaidalleapiprodscus.blob.core.windows.net/private/org-WaIMDdGHNwJiXAmjegDHE6AM/user-bCrYDjR21ly46316ZbdgqvKf/img-sysAePXF2c8yu28AIoZLLmEG.png?st=2023-01-19T20%3A35%3A19Z\u0026se=2023-01-19T22%3A35%3A19Z\u0026sp=r\u0026sv=2021-08-06\u0026sr=b\u0026rscd=inline\u0026rsct=image/png\u0026skoid=6aaadede-4fb3-4698-a8f6-684d7786b067\u0026sktid=a48cca56-e6da-484e-a814-9c849652bcb3\u0026skt=2023-01-19T18%3A10%3A41Z\u0026ske=2023-01-20T18%3A10%3A41Z\u0026sks=b\u0026skv=2021-08-06\u0026sig=nWkcGTTCsWigHHocYP%2BsyiV5FJL6izpAe3OVvX1GLuI%3D\"\n```\n\n![OpenAI-CPP attachments](doc/koala_violin.png?raw=true \"OpenAI-CPP attachments\")\n\nSince `openai::Json` is a typedef for [nlohmann::json](https://github.com/nlohmann/json), you get all the features provided by the latter (conversions, STL-like access, ...).\n\n### Build the examples\n\n```bash\nmkdir build \u0026\u0026 cd build\ncmake .. 
\u0026\u0026 make\nexamples/[whatever]\n```\n\nIn your project, if you want verbose output like when running the examples, you can add `#define OPENAI_VERBOSE_OUTPUT`.\n\n### Advanced usage\n\n#### A word about error handling\n\nBy default, **OpenAI-CPP** will throw a runtime error exception if the curl request does not succeed. You are free to handle these exceptions the way you like.\nYou can prevent exceptions from being thrown by calling `setThrowException(false)` (see the example in [examples/09-instances.cpp](examples/09-instances.cpp)). If you do that, a warning will be displayed instead.\n\n#### More control\n\nYou can use the `openai::post()` or `openai::get()` methods to fully control what you are sending (e.g. useful when the OpenAI API introduces a new method that is not yet provided by `OpenAI-CPP`).\n\n#### Manage OpenAI-CPP instance\n\nHere are two approaches to keep the **OpenAI-CPP** session alive in your program so you can use it anytime, anywhere.\n\n##### Use the default instance()\n\nThis is the default behavior. **OpenAI-CPP** provides the convenient free functions `openai::start(const std::string\u0026 token)` and `openai::instance()`.\nInitialize and configure the **OpenAI-CPP** instance with:\n\n```cpp\nauto\u0026 openai = openai::start();\n```\n\nWhen you are in another scope and have lost the `openai` reference, you can grab it again with:\n\n```cpp\nauto\u0026 openai = openai::instance();\n```\n\nThis might not be the recommended way, but since you generally want to handle only one OpenAI instance (one token), this approach is highly convenient. 
\n\n##### Pass by reference if you want to manage multiple secret keys\n\nAn other approach is to pass the *OpenAI* instance by reference, store it, and call the appropriate methods when needed.\n\n```cpp\nvoid bar(openai::OpenAI\u0026 openai) {\n    openai.completion.create({\n        {\"model\", \"text-davinci-003\"},\n        {\"prompt\", \"Say bar() function called\"}\n    });\n}\n\nint main() {\n    openai::OpenAI openai_instance{\"your_api_key\"};\n    bar(openai_instance);\n}\n```\n\nYou can use a [std::reference_wrapper](http://en.cppreference.com/w/cpp/utility/functional/reference_wrapper) as shown in [examples/09-instances.cpp](examples/09-instances.cpp). \n\nThis strategy is useful if you have to manage several OpenAI-CPP instances with different secret keys.\n\n## Troubleshooting\n\n### Libcurl with Windows\n\n\u003e Note: If you are using [WSL](https://learn.microsoft.com/windows/wsl/) then you are not concerned by the following. \n\nAccording to [Install Curl on Windows](https://everything.curl.dev/get/windows),\n\u003e Windows 10 comes with the curl tool bundled with the operating system since version 1804\n\nHowever, you still might have difficulties handling libcurl where CMake throws `Could NOT find CURL (missing: CURL_LIBRARY CURL_INCLUDE_DIR)`.  \nYou can try to follow one the 2 ways proposed by the the Curl [Install Curl on Windows](https://everything.curl.dev/get/windows).\n\nAnother way to solve this is to grab the curl version for Windows [here](https://curl.se/windows/), copy the content of `include`\nin appropriate folders available visible in your PATH (e.g. if in your Git installation `[...]/Git/mingw64/include/`).\nYou also need to grab the `curl.lib` and the `libcurl.dll` files from [here](https://dl.dropboxusercontent.com/s/jxwohqax4e2avyt/libcurl-7.48.0-WinSSL-zlib-x86-x64.zip?dl=0) and copy them in appropriate folders (e.g. if in your Git installation `[...]/Git/mingw64/lib/`).\n\n```bash\nmkdir build \u0026\u0026 cd build\ncmake .. 
-DCMAKE_GENERATOR_PLATFORM=x64\ncmake --build .\ncmake --build . --target 00-showcase # For a specific target\n```\n\nOr if you prefer using GNU GCC on Windows\n\n```bash\ncmake -G \"MSYS Makefiles\" -D CMAKE_CXX_COMPILER=g++ ..\nmake\n```\n\n\n## License\n\n[MIT](LICENSE.md)\n\n\n## Acknowledgment\n\nThis work has been mainly inspired by [slacking](https://github.com/coin-au-carre/slacking) and the curl wrapper code from [cpr](https://github.com/libcpr/cpr).\n\n## Sponsor\n\n[OLREA](https://www.olrea.fr/)\n","funding_links":[],"categories":["twitter","Libraries \u0026 SDKs"],"sub_categories":["Community SDKs"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Folrea%2Fopenai-cpp","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Folrea%2Fopenai-cpp","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Folrea%2Fopenai-cpp/lists"}