{"id":21261449,"url":"https://github.com/openai/openai-java","last_synced_at":"2026-03-13T21:02:54.192Z","repository":{"id":263660707,"uuid":"881501535","full_name":"openai/openai-java","owner":"openai","description":"The official Java library for the OpenAI API","archived":false,"fork":false,"pushed_at":"2026-02-28T05:13:50.000Z","size":24001,"stargazers_count":1360,"open_issues_count":52,"forks_count":203,"subscribers_count":25,"default_branch":"main","last_synced_at":"2026-02-28T06:53:42.246Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Kotlin","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/openai.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":".github/CODEOWNERS","security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2024-10-31T17:42:57.000Z","updated_at":"2026-02-27T06:53:09.000Z","dependencies_parsed_at":"2026-01-05T19:09:56.531Z","dependency_job_id":null,"html_url":"https://github.com/openai/openai-java","commit_stats":null,"previous_names":["openai/openai-java"],"tags_count":169,"template":false,"template_full_name":null,"purl":"pkg:github/openai/openai-java","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Fopenai-java","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Fopenai-java/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Fopenai-java/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/
GitHub/repositories/openai%2Fopenai-java/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/openai","download_url":"https://codeload.github.com/openai/openai-java/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/openai%2Fopenai-java/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":30091761,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-04T19:41:02.502Z","status":"ssl_error","status_checked_at":"2026-03-04T19:40:05.550Z","response_time":59,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-21T04:43:21.540Z","updated_at":"2026-03-13T21:02:54.184Z","avatar_url":"https://github.com/openai.png","language":"Kotlin","funding_links":[],"categories":["Openai","Artificial Intelligence","Libraries \u0026 Frameworks","Libraries \u0026 SDKs"],"sub_categories":["Java","Official SDKs"],"readme":"# OpenAI Java API Library\n\n\u003c!-- x-release-please-start-version --\u003e\n\n[![Maven Central](https://img.shields.io/maven-central/v/com.openai/openai-java)](https://central.sonatype.com/artifact/com.openai/openai-java/4.27.0)\n[![javadoc](https://javadoc.io/badge2/com.openai/openai-java/4.27.0/javadoc.svg)](https://javadoc.io/doc/com.openai/openai-java/4.27.0)\n\n\u003c!-- x-release-please-end --\u003e\n\nThe OpenAI Java SDK provides convenient access to the [OpenAI REST API](https://platform.openai.com/docs) from applications written in Java.\n\n\u003c!-- x-release-please-start-version --\u003e\n\nThe REST API documentation can be found on [platform.openai.com](https://platform.openai.com/docs). Javadocs are available on [javadoc.io](https://javadoc.io/doc/com.openai/openai-java/4.27.0).\n\n\u003c!-- x-release-please-end --\u003e\n\n## Installation\n\n\u003c!-- x-release-please-start-version --\u003e\n\n[_Try `openai-java-spring-boot-starter` if you're using Spring Boot!_](#spring-boot)\n\n### Gradle\n\n```kotlin\nimplementation(\"com.openai:openai-java:4.27.0\")\n```\n\n### Maven\n\n```xml\n\u003cdependency\u003e\n  \u003cgroupId\u003ecom.openai\u003c/groupId\u003e\n  \u003cartifactId\u003eopenai-java\u003c/artifactId\u003e\n  \u003cversion\u003e4.27.0\u003c/version\u003e\n\u003c/dependency\u003e\n```\n\n\u003c!-- x-release-please-end --\u003e\n\n## Requirements\n\nThis library requires Java 8 or later.\n\n## Usage\n\n\u003e [!TIP]\n\u003e See the [`openai-java-example`](openai-java-example/src/main/java/com/openai/example) directory for complete and runnable examples!\n\nThe primary API for interacting with OpenAI models is the [Responses API](https://platform.openai.com/docs/api-reference/responses). 
You can generate text from the model with the code below.\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\n// Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID` and `OPENAI_PROJECT_ID` environment variables\nOpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\nResponseCreateParams params = ResponseCreateParams.builder()\n        .input(\"Say this is a test\")\n        .model(ChatModel.GPT_5_2)\n        .build();\nResponse response = client.responses().create(params);\n```\n\nThe previous standard (supported indefinitely) for generating text is the [Chat Completions API](https://platform.openai.com/docs/api-reference/chat). You can use that API to generate text from the model with the code below.\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\n\n// Configures using the `openai.apiKey`, `openai.orgId`, `openai.projectId`, `openai.webhookSecret` and `openai.baseUrl` system properties\n// Or configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID`, `OPENAI_WEBHOOK_SECRET` and `OPENAI_BASE_URL` environment variables\nOpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .addUserMessage(\"Say this is a test\")\n    .model(ChatModel.GPT_5_2)\n    .build();\nChatCompletion chatCompletion = client.chat().completions().create(params);\n```\n\n## Client configuration\n\nConfigure the client using system properties or environment variables:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\n\n// Configures 
using the `openai.apiKey`, `openai.orgId`, `openai.projectId`, `openai.webhookSecret` and `openai.baseUrl` system properties\n// Or configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID`, `OPENAI_WEBHOOK_SECRET` and `OPENAI_BASE_URL` environment variables\nOpenAIClient client = OpenAIOkHttpClient.fromEnv();\n```\n\nOr manually:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .apiKey(\"My API Key\")\n    .build();\n```\n\nOr using a combination of the two approaches:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    // Configures using the `openai.apiKey`, `openai.orgId`, `openai.projectId`, `openai.webhookSecret` and `openai.baseUrl` system properties\n    // Or configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID`, `OPENAI_WEBHOOK_SECRET` and `OPENAI_BASE_URL` environment variables\n    .fromEnv()\n    .apiKey(\"My API Key\")\n    .build();\n```\n\nSee this table for the available options:\n\n| Setter          | System property        | Environment variable    | Required | Default value                 |\n| --------------- | ---------------------- | ----------------------- | -------- | ----------------------------- |\n| `apiKey`        | `openai.apiKey`        | `OPENAI_API_KEY`        | true     | -                             |\n| `organization`  | `openai.orgId`         | `OPENAI_ORG_ID`         | false    | -                             |\n| `project`       | `openai.projectId`     | `OPENAI_PROJECT_ID`     | false    | -                             |\n| `webhookSecret` | `openai.webhookSecret` | `OPENAI_WEBHOOK_SECRET` | false    | -                             |\n| `baseUrl`       | `openai.baseUrl`       | `OPENAI_BASE_URL`       | true     | `\"https://api.openai.com/v1\"` |\n\nSystem 
properties take precedence over environment variables.\n\n\u003e [!TIP]\n\u003e Don't create more than one client in the same application. Each client has a connection pool and\n\u003e thread pools, which are more efficient to share between requests.\n\n### Modifying configuration\n\nTo temporarily use a modified client configuration, while reusing the same connection and thread pools, call `withOptions()` on any client or service:\n\n```java\nimport com.openai.client.OpenAIClient;\n\nOpenAIClient clientWithOptions = client.withOptions(optionsBuilder -\u003e {\n    optionsBuilder.baseUrl(\"https://example.com\");\n    optionsBuilder.maxRetries(42);\n});\n```\n\nThe `withOptions()` method does not affect the original client or service.\n\n## Requests and responses\n\nTo send a request to the OpenAI API, build an instance of some `Params` class and pass it to the corresponding client method. When the response is received, it will be deserialized into an instance of a Java class.\n\nFor example, `client.chat().completions().create(...)` should be called with an instance of `ChatCompletionCreateParams`, and it will return an instance of `ChatCompletion`.\n\n## Immutability\n\nEach class in the SDK has an associated [builder](https://blogs.oracle.com/javamagazine/post/exploring-joshua-blochs-builder-design-pattern-in-java) or factory method for constructing it.\n\nEach class is [immutable](https://docs.oracle.com/javase/tutorial/essential/concurrency/immutable.html) once constructed. If the class has an associated builder, then it has a `toBuilder()` method, which can be used to convert it back to a builder for making a modified copy.\n\nBecause each class is immutable, builder modification will _never_ affect already built class instances.\n\n## Asynchronous execution\n\nThe default client is synchronous. 
To switch to asynchronous execution, call the `async()` method:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\nimport java.util.concurrent.CompletableFuture;\n\n// Configures using the `openai.apiKey`, `openai.orgId`, `openai.projectId`, `openai.webhookSecret` and `openai.baseUrl` system properties\n// Or configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID`, `OPENAI_WEBHOOK_SECRET` and `OPENAI_BASE_URL` environment variables\nOpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .addUserMessage(\"Say this is a test\")\n    .model(ChatModel.GPT_5_2)\n    .build();\nCompletableFuture\u003cChatCompletion\u003e chatCompletion = client.async().chat().completions().create(params);\n```\n\nOr create an asynchronous client from the beginning:\n\n```java\nimport com.openai.client.OpenAIClientAsync;\nimport com.openai.client.okhttp.OpenAIOkHttpClientAsync;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\nimport java.util.concurrent.CompletableFuture;\n\n// Configures using the `openai.apiKey`, `openai.orgId`, `openai.projectId`, `openai.webhookSecret` and `openai.baseUrl` system properties\n// Or configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID`, `OPENAI_WEBHOOK_SECRET` and `OPENAI_BASE_URL` environment variables\nOpenAIClientAsync client = OpenAIOkHttpClientAsync.fromEnv();\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .addUserMessage(\"Say this is a test\")\n    .model(ChatModel.GPT_5_2)\n    .build();\nCompletableFuture\u003cChatCompletion\u003e chatCompletion = 
client.chat().completions().create(params);\n```\n\nThe asynchronous client supports the same options as the synchronous one, except most methods return `CompletableFuture`s.\n\n## Streaming\n\nThe SDK defines methods that return response \"chunk\" streams, where each chunk can be individually processed as soon as it arrives instead of waiting on the full response. Streaming methods generally correspond to [SSE](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events) or [JSONL](https://jsonlines.org) responses.\n\nSome of these methods may have streaming and non-streaming variants, but a streaming method will always have a `Streaming` suffix in its name, even if it doesn't have a non-streaming variant.\n\nThese streaming methods return [`StreamResponse`](openai-java-core/src/main/kotlin/com/openai/core/http/StreamResponse.kt) for synchronous clients:\n\n```java\nimport com.openai.core.http.StreamResponse;\nimport com.openai.models.chat.completions.ChatCompletionChunk;\n\ntry (StreamResponse\u003cChatCompletionChunk\u003e streamResponse = client.chat().completions().createStreaming(params)) {\n    streamResponse.stream().forEach(chunk -\u003e {\n        System.out.println(chunk);\n    });\n    System.out.println(\"No more chunks!\");\n}\n```\n\nOr [`AsyncStreamResponse`](openai-java-core/src/main/kotlin/com/openai/core/http/AsyncStreamResponse.kt) for asynchronous clients:\n\n```java\nimport com.openai.core.http.AsyncStreamResponse;\nimport com.openai.models.chat.completions.ChatCompletionChunk;\nimport java.util.Optional;\n\nclient.async().chat().completions().createStreaming(params).subscribe(chunk -\u003e {\n    System.out.println(chunk);\n});\n\n// If you need to handle errors or completion of the stream\nclient.async().chat().completions().createStreaming(params).subscribe(new AsyncStreamResponse.Handler\u003c\u003e() {\n    @Override\n    public void onNext(ChatCompletionChunk chunk) {\n        System.out.println(chunk);\n    }\n\n    @Override\n  
  public void onComplete(Optional\u003cThrowable\u003e error) {\n        if (error.isPresent()) {\n            System.out.println(\"Something went wrong!\");\n            throw new RuntimeException(error.get());\n        } else {\n            System.out.println(\"No more chunks!\");\n        }\n    }\n});\n\n// Or use futures\nclient.async().chat().completions().createStreaming(params)\n    .subscribe(chunk -\u003e {\n        System.out.println(chunk);\n    })\n    .onCompleteFuture()\n    .whenComplete((unused, error) -\u003e {\n        if (error != null) {\n            System.out.println(\"Something went wrong!\");\n            throw new RuntimeException(error);\n        } else {\n            System.out.println(\"No more chunks!\");\n        }\n    });\n```\n\nAsync streaming uses a dedicated per-client cached thread pool [`Executor`](https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/Executor.html) to stream without blocking the current thread. This default is suitable for most purposes.\n\nTo use a different `Executor`, configure the subscription using the `executor` parameter:\n\n```java\nimport java.util.concurrent.Executor;\nimport java.util.concurrent.Executors;\n\nExecutor executor = Executors.newFixedThreadPool(4);\nclient.async().chat().completions().createStreaming(params).subscribe(\n    chunk -\u003e System.out.println(chunk), executor\n);\n```\n\nOr configure the client globally using the `streamHandlerExecutor` method:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport java.util.concurrent.Executors;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    .streamHandlerExecutor(Executors.newFixedThreadPool(4))\n    .build();\n```\n\n### Streaming helpers\n\nThe SDK provides conveniences for streamed chat completions. 
A\n[`ChatCompletionAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ChatCompletionAccumulator.kt)\ncan record the stream of chat completion chunks in the response as they are processed and accumulate\na [`ChatCompletion`](openai-java-core/src/main/kotlin/com/openai/models/chat/completions/ChatCompletion.kt)\nobject similar to that which would have been returned by the non-streaming API.\n\nFor a synchronous response add a\n[`Stream.peek()`](https://docs.oracle.com/javase/8/docs/api/java/util/stream/Stream.html#peek-java.util.function.Consumer-)\ncall to the stream pipeline to accumulate each chunk:\n\n```java\nimport com.openai.core.http.StreamResponse;\nimport com.openai.helpers.ChatCompletionAccumulator;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport com.openai.models.chat.completions.ChatCompletionChunk;\n\nChatCompletionAccumulator chatCompletionAccumulator = ChatCompletionAccumulator.create();\n\ntry (StreamResponse\u003cChatCompletionChunk\u003e streamResponse =\n        client.chat().completions().createStreaming(createParams)) {\n    streamResponse.stream()\n            .peek(chatCompletionAccumulator::accumulate)\n            .flatMap(completion -\u003e completion.choices().stream())\n            .flatMap(choice -\u003e choice.delta().content().stream())\n            .forEach(System.out::print);\n}\n\nChatCompletion chatCompletion = chatCompletionAccumulator.chatCompletion();\n```\n\nFor an asynchronous response, add the `ChatCompletionAccumulator` to the `subscribe()` call:\n\n```java\nimport com.openai.helpers.ChatCompletionAccumulator;\nimport com.openai.models.chat.completions.ChatCompletion;\n\nChatCompletionAccumulator chatCompletionAccumulator = ChatCompletionAccumulator.create();\n\nclient.chat()\n        .completions()\n        .createStreaming(createParams)\n        .subscribe(chunk -\u003e chatCompletionAccumulator.accumulate(chunk).choices().stream()\n                .flatMap(choice -\u003e 
choice.delta().content().stream())\n                .forEach(System.out::print))\n        .onCompleteFuture()\n        .join();\n\nChatCompletion chatCompletion = chatCompletionAccumulator.chatCompletion();\n```\n\nThe SDK provides conveniences for streamed responses. A\n[`ResponseAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt)\ncan record the stream of response events as they are processed and accumulate a\n[`Response`](openai-java-core/src/main/kotlin/com/openai/models/responses/Response.kt)\nobject similar to that which would have been returned by the non-streaming API.\n\nFor a synchronous response add a\n[`Stream.peek()`](https://docs.oracle.com/javase/8/docs/api/java/util/stream/Stream.html#peek-java.util.function.Consumer-)\ncall to the stream pipeline to accumulate each event:\n\n```java\nimport com.openai.core.http.StreamResponse;\nimport com.openai.helpers.ResponseAccumulator;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseStreamEvent;\n\nResponseAccumulator responseAccumulator = ResponseAccumulator.create();\n\ntry (StreamResponse\u003cResponseStreamEvent\u003e streamResponse =\n        client.responses().createStreaming(createParams)) {\n    streamResponse.stream()\n            .peek(responseAccumulator::accumulate)\n            .flatMap(event -\u003e event.outputTextDelta().stream())\n            .forEach(textEvent -\u003e System.out.print(textEvent.delta()));\n}\n\nResponse response = responseAccumulator.response();\n```\n\nFor an asynchronous response, add the `ResponseAccumulator` to the `subscribe()` call:\n\n```java\nimport com.openai.helpers.ResponseAccumulator;\nimport com.openai.models.responses.Response;\n\nResponseAccumulator responseAccumulator = ResponseAccumulator.create();\n\nclient.responses()\n        .createStreaming(createParams)\n        .subscribe(event -\u003e responseAccumulator.accumulate(event)\n                
.outputTextDelta().ifPresent(textEvent -\u003e System.out.print(textEvent.delta())))\n        .onCompleteFuture()\n        .join();\n\nResponse response = responseAccumulator.response();\n```\n\n## Structured outputs with JSON schemas\n\nOpenAI [Structured Outputs](https://platform.openai.com/docs/guides/structured-outputs?api-mode=chat)\nis a feature that ensures that the model will always generate responses that adhere to a supplied\n[JSON schema](https://json-schema.org/overview/what-is-jsonschema).\n\nA JSON schema can be defined by creating a\n[`ResponseFormatJsonSchema`](openai-java-core/src/main/kotlin/com/openai/models/ResponseFormatJsonSchema.kt)\nand setting it on the input parameters. However, for greater convenience, a JSON schema can instead\nbe derived automatically from the structure of an arbitrary Java class. The JSON content from the\nresponse will then be converted automatically to an instance of that Java class. A full, working\nexample of the use of Structured Outputs with arbitrary Java classes can be seen in\n[`StructuredOutputsExample`](openai-java-example/src/main/java/com/openai/example/StructuredOutputsExample.java).\n\nJava classes can contain fields declared to be instances of other classes and can use collections\n(see [Defining JSON schema properties](#defining-json-schema-properties) for more details):\n\n```java\nclass Person {\n    public String name;\n    public int birthYear;\n}\n\nclass Book {\n    public String title;\n    public Person author;\n    public int publicationYear;\n}\n\nclass BookList {\n    public List\u003cBook\u003e books;\n}\n```\n\nPass the top-level class—`BookList` in this example—to `responseFormat(Class\u003cT\u003e)` when building the\nparameters and then access an instance of `BookList` from the generated message content in the\nresponse:\n\n```java\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\nimport com.openai.models.chat.completions.StructuredChatCompletionCreateParams;\n\nStructuredChatCompletionCreateParams\u003cBookList\u003e params = ChatCompletionCreateParams.builder()\n        .addUserMessage(\"List some famous late twentieth century novels.\")\n        .model(ChatModel.GPT_5_2)\n        .responseFormat(BookList.class)\n        .build();\n\nclient.chat().completions().create(params).choices().stream()\n        .flatMap(choice -\u003e choice.message().content().stream())\n        .flatMap(bookList -\u003e bookList.books.stream())\n        .forEach(book -\u003e System.out.println(book.title + \" by \" + book.author.name));\n```\n\nYou can start building the parameters with an instance of\n[`ChatCompletionCreateParams.Builder`](openai-java-core/src/main/kotlin/com/openai/models/chat/completions/ChatCompletionCreateParams.kt)\nor\n[`StructuredChatCompletionCreateParams.Builder`](openai-java-core/src/main/kotlin/com/openai/models/chat/completions/StructuredChatCompletionCreateParams.kt).\nIf you start with the former (which allows for more compact code) the builder type will change to\nthe latter when `ChatCompletionCreateParams.Builder.responseFormat(Class\u003cT\u003e)` is called.\n\nIf a field in a class is optional and does not require a defined value, you can represent this using\nthe [`java.util.Optional`](https://docs.oracle.com/javase/8/docs/api/java/util/Optional.html) class.\nIt is up to the AI model to decide whether to provide a value for that field or leave it empty.\n\n```java\nimport java.util.Optional;\n\nclass Book {\n    public String title;\n    public Person author;\n    public int publicationYear;\n    public Optional\u003cString\u003e isbn;\n}\n```\n\nGeneric type information for fields is retained in the class's metadata, but _generic type erasure_\napplies in other scopes. 
While, for example, a JSON schema defining an array of books can be derived\nfrom the `BookList.books` field with type `List\u003cBook\u003e`, a valid JSON schema cannot be derived from a\nlocal variable of that same type, so the following will _not_ work:\n\n```java\nList\u003cBook\u003e books = new ArrayList\u003c\u003e();\n\nStructuredChatCompletionCreateParams\u003cList\u003cBook\u003e\u003e params = ChatCompletionCreateParams.builder()\n        .responseFormat(books.getClass())\n        // ...\n        .build();\n```\n\nIf an error occurs while converting a JSON response to an instance of a Java class, the error\nmessage will include the JSON response to assist in diagnosis. For instance, if the response is\ntruncated, the JSON data will be incomplete and cannot be converted to a class instance. If your\nJSON response may contain sensitive information, avoid logging it directly, or ensure that you\nredact any sensitive details from the error message.\n\n### Local JSON schema validation\n\nStructured Outputs supports a\n[subset](https://platform.openai.com/docs/guides/structured-outputs#supported-schemas) of the JSON\nSchema language. Schemas are generated automatically from classes to align with this subset.\nHowever, due to the inherent structure of the classes, the generated schema may still violate\ncertain OpenAI schema restrictions, such as exceeding the maximum nesting depth or utilizing\nunsupported data types.\n\nTo facilitate compliance, the method `responseFormat(Class\u003cT\u003e)` performs a validation check on the\nschema derived from the specified class. This validation ensures that all restrictions are adhered\nto. If any issues are detected, an exception will be thrown, providing a detailed message outlining\nthe reasons for the validation failure.\n\n- **Local Validation**: The validation process occurs locally, meaning no requests are sent to the\n  remote AI model. 
If the schema passes local validation, it is likely to pass remote validation as\n  well.\n- **Remote Validation**: The remote AI model will conduct its own validation upon receiving the JSON\n  schema in the request.\n- **Version Compatibility**: There may be instances where local validation fails while remote\n  validation succeeds. This can occur if the SDK version is outdated compared to the restrictions\n  enforced by the remote AI model.\n- **Disabling Local Validation**: If you encounter compatibility issues and wish to bypass local\n  validation, you can disable it by passing\n  [`JsonSchemaLocalValidation.NO`](openai-java-core/src/main/kotlin/com/openai/core/JsonSchemaLocalValidation.kt)\n  to the `responseFormat(Class\u003cT\u003e, JsonSchemaLocalValidation)` method when building the parameters.\n  (The default value for this parameter is `JsonSchemaLocalValidation.YES`.)\n\n```java\nimport com.openai.core.JsonSchemaLocalValidation;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\nimport com.openai.models.chat.completions.StructuredChatCompletionCreateParams;\n\nStructuredChatCompletionCreateParams\u003cBookList\u003e params = ChatCompletionCreateParams.builder()\n        .addUserMessage(\"List some famous late twentieth century novels.\")\n        .model(ChatModel.GPT_5_2)\n        .responseFormat(BookList.class, JsonSchemaLocalValidation.NO)\n        .build();\n```\n\nBy following these guidelines, you can ensure that your structured outputs conform to the necessary\nschema requirements and minimize the risk of remote validation errors.\n\n### Usage with the Responses API\n\n_Structured Outputs_ are also supported for the Responses API. The usage is the same as described\nexcept where the Responses API differs slightly from the Chat Completions API. 
Pass the top-level\nclass to `text(Class\u003cT\u003e)` when building the parameters and then access an instance of the class from\nthe generated message content in the response.\n\nYou can start building the parameters with an instance of\n[`ResponseCreateParams.Builder`](openai-java-core/src/main/kotlin/com/openai/models/responses/ResponseCreateParams.kt)\nor\n[`StructuredResponseCreateParams.Builder`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponseCreateParams.kt).\nIf you start with the former (which allows for more compact code) the builder type will change to\nthe latter when `ResponseCreateParams.Builder.text(Class\u003cT\u003e)` is called.\n\nFor a full example of the usage of _Structured Outputs_ with the Responses API, see\n[`ResponsesStructuredOutputsExample`](openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsExample.java).\n\nInstead of using `ResponseCreateParams.text(Class\u003cT\u003e)`, you can build a\n[`StructuredResponseTextConfig`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponseTextConfig.kt)\nand set it on the `ResponseCreateParams` using the `text(StructuredResponseTextConfig)` method.\nSimilar to using `ResponseCreateParams`, you can start with a `ResponseTextConfig.Builder` and its\n`format(Class\u003cT\u003e)` method will change it to a `StructuredResponseTextConfig.Builder`. This also\nallows you to set the `verbosity` configuration parameter on the text configuration before adding it\nto the `ResponseCreateParams`.\n\nFor a full example of the usage of _Structured Outputs_ with the `ResponseTextConfig` and its\n`verbosity` parameter, see\n[`ResponsesStructuredOutputsVerbosityExample`](openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsVerbosityExample.java).\n\n### Usage with streaming\n\n_Structured Outputs_ can also be used with [Streaming](#streaming) and the Chat Completions API. 
As\nresponses are returned in \"chunks\", the full response must first be accumulated to concatenate the\nJSON strings that can then be converted into instances of the arbitrary Java class. Normal streaming\noperations can be performed while accumulating the JSON strings.\n\nUse the [`ChatCompletionAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ChatCompletionAccumulator.kt)\nas described in the section on [Streaming helpers](#streaming-helpers) to accumulate the JSON\nstrings. Once accumulated, use `ChatCompletionAccumulator.chatCompletion(Class\u003cT\u003e)` to convert the\naccumulated `ChatCompletion` into a\n[`StructuredChatCompletion`](openai-java-core/src/main/kotlin/com/openai/models/chat/completions/StructuredChatCompletion.kt).\nThe `StructuredChatCompletion` can then automatically deserialize the JSON strings into instances of\nyour Java class.\n\nFor a full example of the usage of _Structured Outputs_ with Streaming and the Chat Completions API,\nsee\n[`StructuredOutputsStreamingExample`](openai-java-example/src/main/java/com/openai/example/StructuredOutputsStreamingExample.java).\n\nWith the Responses API, accumulate events while streaming using the\n[`ResponseAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt).\nOnce accumulated, use `ResponseAccumulator.response(Class\u003cT\u003e)` to convert the accumulated `Response`\ninto a\n[`StructuredResponse`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponse.kt).\nThe [`StructuredResponse`] can then automatically deserialize the JSON strings into instances of\nyour Java class.\n\nFor a full example of the usage of _Structured Outputs_ with Streaming and the Responses API, see\n[`ResponsesStructuredOutputsStreamingExample`](openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java).\n\n### Defining JSON schema properties\n\nWhen a JSON schema is derived from your Java classes, all 
properties represented by `public` fields
or `public` getter methods are included in the schema by default. Non-`public` fields and getter
methods are _not_ included by default. You can exclude `public` fields or getter methods, or include
non-`public` ones, using the `@JsonIgnore` and `@JsonProperty` annotations respectively (see
[Annotating classes and JSON schemas](#annotating-classes-and-json-schemas) for details).

If you do not want to define `public` fields, you can define `private` fields and corresponding
`public` getter methods. For example, a `private` field `myValue` with a `public` getter method
`getMyValue()` will result in a `"myValue"` property being included in the JSON schema. If you
prefer not to use the conventional Java "get" prefix for the name of the getter method, then you
_must_ annotate the getter method with the `@JsonProperty` annotation, and the full method name will
be used as the property name. You do not have to define any corresponding setter methods if you do
not need them.

Each of your classes _must_ define at least one property to be included in the JSON schema. A
validation error will occur if any class contains no fields or getter methods from which schema
properties can be derived. This may occur if, for example:

- There are no fields or getter methods in the class.
- All fields and getter methods are `public`, but all are annotated with `@JsonIgnore`.
- All fields and getter methods are non-`public`, but none are annotated with `@JsonProperty`.
- A field or getter method is declared with a `Map` type. A `Map` is treated like a separate class
  with no named properties, so it will result in an empty `"properties"` field in the JSON schema.

### Annotating classes and JSON schemas

You can use annotations to add further information to the JSON schema derived from your Java
classes, or to control which fields or getter methods will be included in the schema.
Details from\nannotations captured in the JSON schema may be used by the AI model to improve its response. The SDK\nsupports the use of [Jackson Databind](https://github.com/FasterXML/jackson-databind) annotations.\n\n```java\nimport com.fasterxml.jackson.annotation.JsonClassDescription;\nimport com.fasterxml.jackson.annotation.JsonIgnore;\nimport com.fasterxml.jackson.annotation.JsonPropertyDescription;\n\nclass Person {\n    @JsonPropertyDescription(\"The first name and surname of the person\")\n    public String name;\n    public int birthYear;\n    @JsonPropertyDescription(\"The year the person died, or 'present' if the person is living.\")\n    public String deathYear;\n}\n\n@JsonClassDescription(\"The details of one published book\")\nclass Book {\n    public String title;\n    public Person author;\n    @JsonPropertyDescription(\"The year in which the book was first published.\")\n    public int publicationYear;\n    @JsonIgnore public String genre;\n}\n\nclass BookList {\n    public List\u003cBook\u003e books;\n}\n```\n\n- Use `@JsonClassDescription` to add a detailed description to a class.\n- Use `@JsonPropertyDescription` to add a detailed description to a field or getter method of a\n  class.\n- Use `@JsonIgnore` to exclude a `public` field or getter method of a class from the generated JSON\n  schema.\n- Use `@JsonProperty` to include a non-`public` field or getter method of a class in the generated\n  JSON schema.\n\nIf you use `@JsonProperty(required = false)`, the `false` value will be ignored. 
OpenAI JSON schemas\nmust mark all properties as _required_, so the schema generated from your Java classes will respect\nthat restriction and ignore any annotation that would violate it.\n\nYou can also use [OpenAPI Swagger 2](https://swagger.io/specification/v2/)\n[`@Schema`](https://github.com/swagger-api/swagger-core/wiki/Swagger-2.X---Annotations#schema) and\n[`@ArraySchema`](https://github.com/swagger-api/swagger-core/wiki/Swagger-2.X---Annotations#arrayschema)\nannotations. These allow type-specific constraints to be added to your schema properties. You can\nlearn more about the supported constraints in the OpenAI documentation on\n[Supported properties](https://platform.openai.com/docs/guides/structured-outputs#supported-properties).\n\n```java\nimport io.swagger.v3.oas.annotations.media.Schema;\nimport io.swagger.v3.oas.annotations.media.ArraySchema;\n\nclass Article {\n    @ArraySchema(minItems = 1, maxItems = 10)\n    public List\u003cString\u003e authors;\n\n    @Schema(pattern = \"^[A-Za-z ]+$\")\n    public String title;\n\n    @Schema(format = \"date\")\n    public String publicationDate;\n\n    @Schema(minimum = \"1\")\n    public int pageCount;\n}\n```\n\nLocal validation will check that you have not used any unsupported constraint keywords. However, the\nvalues of the constraints are _not_ validated locally. For example, if you use a value for the\n`\"format\"` constraint of a string property that is not in the list of\n[supported format names](https://platform.openai.com/docs/guides/structured-outputs#supported-properties),\nthen local validation will pass, but the AI model may report an error.\n\nIf you use both Jackson and Swagger annotations to set the same schema field, the Jackson annotation\nwill take precedence. 
In the following example, the description of `myProperty` will be set to\n\"Jackson description\"; \"Swagger description\" will be ignored:\n\n```java\nimport com.fasterxml.jackson.annotation.JsonPropertyDescription;\nimport io.swagger.v3.oas.annotations.media.Schema;\n\nclass MyObject {\n    @Schema(description = \"Swagger description\")\n    @JsonPropertyDescription(\"Jackson description\")\n    public String myProperty;\n}\n```\n\n## Function calling with JSON schemas\n\nOpenAI [Function Calling](https://platform.openai.com/docs/guides/function-calling?api-mode=chat)\nlets you integrate external functions directly into the language model's responses. Instead of\nproducing plain text, the model can output instructions (with parameters) for calling a function\nwhen appropriate. You define a [JSON schema](https://json-schema.org/overview/what-is-jsonschema)\nfor functions, and the model uses it to decide when and how to trigger these calls, enabling more\ninteractive, data-driven applications.\n\nA JSON schema describing a function's parameters can be defined via the API by building a\n[`ChatCompletionTool`](openai-java-core/src/main/kotlin/com/openai/models/chat/completions/ChatCompletionTool.kt)\ncontaining a\n[`FunctionDefinition`](openai-java-core/src/main/kotlin/com/openai/models/FunctionDefinition.kt)\nand then using `addTool` to set it on the input parameters. The response from the AI model may then\ncontain requests to call your functions, detailing the functions' names and their parameter values\nas JSON data that conforms to the JSON schema from the function definition. You can then parse the\nparameter values from this JSON, invoke your functions, and pass your functions' results back to the\nAI model. 
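For example, a function with a single string parameter might be defined like this. This is a minimal sketch: the `get_weather` function name, its description, and the schema contents are illustrative assumptions, not part of the SDK; see the full example linked below for complete, working code.

```java
import com.openai.core.JsonValue;
import com.openai.models.FunctionDefinition;
import com.openai.models.FunctionParameters;
import com.openai.models.chat.completions.ChatCompletionCreateParams;
import com.openai.models.chat.completions.ChatCompletionTool;
import java.util.List;
import java.util.Map;

// A hand-written JSON schema describing a single required "location" string parameter.
FunctionParameters parameters = FunctionParameters.builder()
        .putAdditionalProperty("type", JsonValue.from("object"))
        .putAdditionalProperty("properties", JsonValue.from(
                Map.of("location", Map.of("type", "string"))))
        .putAdditionalProperty("required", JsonValue.from(List.of("location")))
        .putAdditionalProperty("additionalProperties", JsonValue.from(false))
        .build();

// Wrap the schema in a function definition and add it as a tool.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addTool(ChatCompletionTool.builder()
                .function(FunctionDefinition.builder()
                        .name("get_weather")
                        .description("Gets the current weather for a location.")
                        .parameters(parameters)
                        .build())
                .build())
        .addUserMessage("What is the weather in Oslo?")
        .build();
```

Writing the schema by hand gives you full control over its contents, at the cost of boilerplate that the class-based approach described below avoids.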
A full, working example of _Function Calling_ using the low-level API can be seen in
[`FunctionCallingRawExample`](openai-java-example/src/main/java/com/openai/example/FunctionCallingRawExample.java).

However, for greater convenience, the SDK can derive a function and its parameters automatically
from the structure of an arbitrary Java class: the class's name provides the function name, and the
class's fields define the function's parameters. When the AI model responds with the parameter
values in JSON form, you can then easily convert that JSON to an instance of your Java class and
use the parameter values to invoke your custom function. A full, working example of the use of
_Function Calling_ with Java classes to define function parameters can be seen in
[`FunctionCallingExample`](openai-java-example/src/main/java/com/openai/example/FunctionCallingExample.java).

As with [Structured Outputs](#structured-outputs-with-json-schemas), Java classes can contain
fields declared to be instances of other classes and can use collections (see
[Defining JSON schema properties](#defining-json-schema-properties) for more details). Optionally,
annotations can be used to set the descriptions of the function (class) and its parameters (fields)
to assist the AI model in understanding the purpose of the function and the possible values of its
parameters.

```java
import com.fasterxml.jackson.annotation.JsonClassDescription;
import com.fasterxml.jackson.annotation.JsonPropertyDescription;

@JsonClassDescription("Gets the quality of the given SDK.")
static class GetSdkQuality {
    @JsonPropertyDescription("The name of the SDK.")
    public String name;

    public SdkQuality execute() {
        return new SdkQuality(
                name, name.contains("OpenAI") ? "It's robust and polished!" : "*shrug*");
    }
}

static class SdkQuality {
    public String quality;

    public SdkQuality(String name, String evaluation) {
        quality = name + ": " + evaluation;
    }
}

@JsonClassDescription("Gets the review score (out of 10) for the named SDK.")
static class GetSdkScore {
    public String name;

    public int execute() {
        return name.contains("OpenAI") ? 10 : 3;
    }
}
```

When your functions are defined, add them to the input parameters using `addTool(Class<T>)` and then
call them if requested to do so in the AI model's response. `Function.arguments(Class<T>)` can be
used to parse a function's parameters in JSON form to an instance of your function-defining class.
The fields of that instance will be set to the values of the parameters to the function call.

After calling the function, use `ChatCompletionToolMessageParam.Builder.contentAsJson(Object)` to
pass the function's result back to the AI model. The method will convert the result to JSON form
for consumption by the model.
The `Object` can be any object, including simple `String` instances\nand boxed primitive types.\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.*;\nimport java.util.Collection;\n\nOpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\nChatCompletionCreateParams.Builder createParamsBuilder = ChatCompletionCreateParams.builder()\n        .model(ChatModel.GPT_3_5_TURBO)\n        .maxCompletionTokens(2048)\n        .addTool(GetSdkQuality.class)\n        .addTool(GetSdkScore.class)\n        .addUserMessage(\"How good are the following SDKs and what do reviewers say: \"\n                + \"OpenAI Java SDK, Unknown Company SDK.\");\n\nclient.chat().completions().create(createParamsBuilder.build()).choices().stream()\n        .map(ChatCompletion.Choice::message)\n        // Add each assistant message onto the builder so that we keep track of the\n        // conversation for asking a follow-up question later.\n        .peek(createParamsBuilder::addMessage)\n        .flatMap(message -\u003e {\n            message.content().ifPresent(System.out::println);\n            return message.toolCalls().stream().flatMap(Collection::stream);\n        })\n        .forEach(toolCall -\u003e {\n            Object result = callFunction(toolCall.function());\n            // Add the tool call result to the conversation.\n            createParamsBuilder.addMessage(ChatCompletionToolMessageParam.builder()\n                    .toolCallId(toolCall.id())\n                    .contentAsJson(result)\n                    .build());\n        });\n\n// Ask a follow-up question about the function call result.\ncreateParamsBuilder.addUserMessage(\"Why do you say that?\");\nclient.chat().completions().create(createParamsBuilder.build()).choices().stream()\n        .flatMap(choice -\u003e choice.message().content().stream())\n        
.forEach(System.out::println);

static Object callFunction(ChatCompletionMessageToolCall.Function function) {
    switch (function.name()) {
        case "GetSdkQuality":
            return function.arguments(GetSdkQuality.class).execute();
        case "GetSdkScore":
            return function.arguments(GetSdkScore.class).execute();
        default:
            throw new IllegalArgumentException("Unknown function: " + function.name());
    }
}
```

In the code above, an `execute()` method encapsulates each function's logic. However, there is no
requirement to follow that pattern. You are free to implement your function's logic in any way that
best suits your use case; the pattern above is only intended to _suggest_ that a consistent pattern
can make function calling simpler to understand and implement.

### Usage with the Responses API

_Function Calling_ is also supported for the Responses API. The usage is the same as described
above, except where the Responses API differs slightly from the Chat Completions API. Pass the
top-level class to `addTool(Class<T>)` when building the parameters. In the response, look for
[`ResponseOutputItem`](openai-java-core/src/main/kotlin/com/openai/models/responses/ResponseOutputItem.kt)
instances that are function calls.
Parse the parameters of each function call into an instance of the
class using
[`ResponseFunctionToolCall.arguments(Class<T>)`](openai-java-core/src/main/kotlin/com/openai/models/responses/ResponseFunctionToolCall.kt).
Finally, pass the result of each call back to the model.

For a full example of the usage of _Function Calling_ with the Responses API using the low-level
API to define and parse function parameters, see
[`ResponsesFunctionCallingRawExample`](openai-java-example/src/main/java/com/openai/example/ResponsesFunctionCallingRawExample.java).

For a full example of the usage of _Function Calling_ with the Responses API using Java classes to
define and parse function parameters, see
[`ResponsesFunctionCallingExample`](openai-java-example/src/main/java/com/openai/example/ResponsesFunctionCallingExample.java).

### Local function JSON schema validation

As with _Structured Outputs_, you can perform local validation to check that the JSON schema
derived from your function class respects the restrictions imposed by OpenAI on such schemas.
Local\nvalidation is enabled by default, but it can be disabled by adding `JsonSchemaLocalValidation.NO` to\nthe call to `addTool`.\n\n```java\nChatCompletionCreateParams.Builder createParamsBuilder = ChatCompletionCreateParams.builder()\n        .model(ChatModel.GPT_3_5_TURBO)\n        .maxCompletionTokens(2048)\n        .addTool(GetSdkQuality.class, JsonSchemaLocalValidation.NO)\n        .addTool(GetSdkScore.class, JsonSchemaLocalValidation.NO)\n        .addUserMessage(\"How good are the following SDKs and what do reviewers say: \"\n                + \"OpenAI Java SDK, Unknown Company SDK.\");\n```\n\nSee [Local JSON schema validation](#local-json-schema-validation) for more details on local schema\nvalidation and under what circumstances you might want to disable it.\n\n### Annotating function classes\n\nYou can use annotations to add further information about functions to the JSON schemas that are\nderived from your function classes, or to control which fields or getter methods will be used as\nparameters to the function. Details from annotations captured in the JSON schema may be used by the\nAI model to improve its response. 
The SDK supports the use of
[Jackson Databind](https://github.com/FasterXML/jackson-databind) annotations.

- Use `@JsonClassDescription` to add a description to a function class detailing when and how to use
  that function.
- Use `@JsonTypeName` to set the function name to something other than the simple name of the class,
  which is used by default.
- Use `@JsonPropertyDescription` to add a detailed description to a function parameter (a field or
  getter method of a function class).
- Use `@JsonIgnore` to exclude a `public` field or getter method of a class from the generated JSON
  schema for a function's parameters.
- Use `@JsonProperty` to include a non-`public` field or getter method of a class in the generated
  JSON schema for a function's parameters.

OpenAI provides some
[Best practices for defining functions](https://platform.openai.com/docs/guides/function-calling#best-practices-for-defining-functions)
that may help you to understand how to use the above annotations effectively for your functions.

See also [Defining JSON schema properties](#defining-json-schema-properties) for more details on how
to use fields and getter methods and combine access modifiers and annotations to define the
parameters of your functions.
The same rules apply to function classes and to the structured output
classes described in that section.

## File uploads

The SDK defines methods that accept files.

To upload a file, pass a [`Path`](https://docs.oracle.com/javase/8/docs/api/java/nio/file/Path.html):

```java
import com.openai.models.files.FileCreateParams;
import com.openai.models.files.FileObject;
import com.openai.models.files.FilePurpose;
import java.nio.file.Paths;

FileCreateParams params = FileCreateParams.builder()
    .purpose(FilePurpose.FINE_TUNE)
    .file(Paths.get("input.jsonl"))
    .build();
FileObject fileObject = client.files().create(params);
```

Or an arbitrary [`InputStream`](https://docs.oracle.com/javase/8/docs/api/java/io/InputStream.html):

```java
import com.openai.models.files.FileCreateParams;
import com.openai.models.files.FileObject;
import com.openai.models.files.FilePurpose;
import java.net.URL;

FileCreateParams params = FileCreateParams.builder()
    .purpose(FilePurpose.FINE_TUNE)
    .file(new URL("https://example.com/input.jsonl").openStream())
    .build();
FileObject fileObject = client.files().create(params);
```

Or a `byte[]` array:

```java
import com.openai.models.files.FileCreateParams;
import com.openai.models.files.FileObject;
import com.openai.models.files.FilePurpose;

FileCreateParams params = FileCreateParams.builder()
    .purpose(FilePurpose.FINE_TUNE)
    .file("content".getBytes())
    .build();
FileObject fileObject = client.files().create(params);
```

Note that when passing anything other than a `Path`, the filename is unknown, so it will not be
included in the request.
To manually set a filename, pass a [`MultipartField`](openai-java-core/src/main/kotlin/com/openai/core/Values.kt):\n\n```java\nimport com.openai.core.MultipartField;\nimport com.openai.models.files.FileCreateParams;\nimport com.openai.models.files.FileObject;\nimport com.openai.models.files.FilePurpose;\nimport java.io.InputStream;\nimport java.net.URL;\n\nFileCreateParams params = FileCreateParams.builder()\n    .purpose(FilePurpose.FINE_TUNE)\n    .file(MultipartField.\u003cInputStream\u003ebuilder()\n        .value(new URL(\"https://example.com/input.jsonl\").openStream())\n        .filename(\"input.jsonl\")\n        .build())\n    .build();\nFileObject fileObject = client.files().create(params);\n```\n\n## Webhook Verification\n\nVerifying webhook signatures is _optional but encouraged_.\n\nFor more information about webhooks, see [the API docs](https://platform.openai.com/docs/guides/webhooks).\n\n### Parsing webhook payloads\n\nFor most use cases, you will likely want to verify the webhook and parse the payload at the same time. To achieve this, we provide the method `client.webhooks().unwrap()`, which parses a webhook request and verifies that it was sent by OpenAI. This method will throw an exception if the signature is invalid.\n\nNote that the `body` parameter must be the raw JSON string sent from the server (do not parse it first). 
The `.unwrap()` method will parse this JSON for you into an event object after verifying that the webhook was sent by OpenAI.

```java
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.core.http.Headers;
import com.openai.models.webhooks.UnwrapWebhookEvent;
import java.util.Map;
import java.util.Optional;

OpenAIClient client = OpenAIOkHttpClient.fromEnv(); // OPENAI_WEBHOOK_SECRET env var used by default

public void handleWebhook(String body, Map<String, String> headers) {
    try {
        Headers headersList = Headers.builder()
                .putAll(headers)
                .build();

        UnwrapWebhookEvent event = client.webhooks().unwrap(body, headersList, Optional.empty());

        if (event.isResponseCompletedWebhookEvent()) {
            System.out.println("Response completed: " + event.asResponseCompletedWebhookEvent().data());
        } else if (event.isResponseFailedWebhookEvent()) {
            System.out.println("Response failed: " + event.asResponseFailedWebhookEvent().data());
        } else {
            System.out.println("Unhandled event type: " + event.getClass().getSimpleName());
        }
    } catch (Exception e) {
        System.err.println("Invalid webhook signature: " + e.getMessage());
        // Handle invalid signature
    }
}
```

### Verifying webhook payloads directly

In some cases, you may want to verify the webhook separately from parsing the payload. If you prefer to handle these steps separately, we provide the method `client.webhooks().verifySignature()` to _only verify_ the signature of a webhook request. Like `.unwrap()`, this method will throw an exception if the signature is invalid.

Note that the `body` parameter must be the raw JSON string sent from the server (do not parse it first).
You will then need to parse the body after verifying the signature.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.core.http.Headers;
import com.openai.models.webhooks.WebhookVerificationParams;
import java.util.Map;

OpenAIClient client = OpenAIOkHttpClient.fromEnv(); // OPENAI_WEBHOOK_SECRET env var used by default
ObjectMapper objectMapper = new ObjectMapper();

public void handleWebhook(String body, Map<String, String> headers) {
    try {
        Headers headersList = Headers.builder()
                .putAll(headers)
                .build();

        client.webhooks().verifySignature(
            WebhookVerificationParams.builder()
                .payload(body)
                .headers(headersList)
                .build()
        );

        // Parse the body after verification
        Map<String, Object> event = objectMapper.readValue(body, Map.class);
        System.out.println("Verified event: " + event);
    } catch (Exception e) {
        System.err.println("Invalid webhook signature: " + e.getMessage());
        // Handle invalid signature
    }
}
```

## Binary responses

The SDK defines methods that return binary responses, which are used for API responses that shouldn't necessarily be parsed, like non-JSON data.

These methods return [`HttpResponse`](openai-java-core/src/main/kotlin/com/openai/core/http/HttpResponse.kt):

```java
import com.openai.core.http.HttpResponse;
import com.openai.models.files.FileContentParams;

FileContentParams params = FileContentParams.builder().fileId("file_id").build();
HttpResponse response = client.files().content(params);
```

To save the response content to a file, use the [`Files.copy(...)`](https://docs.oracle.com/javase/8/docs/api/java/nio/file/Files.html#copy-java.io.InputStream-java.nio.file.Path-java.nio.file.CopyOption...-) method:

```java
import com.openai.core.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

try (HttpResponse response = client.files().content(params)) {
    Files.copy(
        response.body(),
        Paths.get(path),
        StandardCopyOption.REPLACE_EXISTING
    );
} catch (Exception e) {
    System.out.println("Something went wrong!");
    throw new RuntimeException(e);
}
```

Or transfer the response content to any [`OutputStream`](https://docs.oracle.com/javase/8/docs/api/java/io/OutputStream.html):

```java
import com.openai.core.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Paths;

try (HttpResponse response = client.files().content(params)) {
    response.body().transferTo(Files.newOutputStream(Paths.get(path)));
} catch (Exception e) {
    System.out.println("Something went wrong!");
    throw new RuntimeException(e);
}
```

## Raw responses

The SDK defines methods that deserialize responses into instances of Java classes.
However, these methods don't provide access to the response headers, status code, or the raw response body.\n\nTo access this data, prefix any HTTP method call on a client or service with `withRawResponse()`:\n\n```java\nimport com.openai.core.http.Headers;\nimport com.openai.core.http.HttpResponseFor;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .addUserMessage(\"Say this is a test\")\n    .model(ChatModel.GPT_5_2)\n    .build();\nHttpResponseFor\u003cChatCompletion\u003e chatCompletion = client.chat().completions().withRawResponse().create(params);\n\nint statusCode = chatCompletion.statusCode();\nHeaders headers = chatCompletion.headers();\n```\n\nYou can still deserialize the response into an instance of a Java class if needed:\n\n```java\nimport com.openai.models.chat.completions.ChatCompletion;\n\nChatCompletion parsedChatCompletion = chatCompletion.parse();\n```\n\n### Request IDs\n\n\u003e For more information on debugging requests, see [the API docs](https://platform.openai.com/docs/api-reference/debugging-requests).\n\nWhen using raw responses, you can access the `x-request-id` response header using the `requestId()` method:\n\n```java\nimport com.openai.core.http.HttpResponseFor;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport java.util.Optional;\n\nHttpResponseFor\u003cChatCompletion\u003e chatCompletion = client.chat().completions().withRawResponse().create(params);\nOptional\u003cString\u003e requestId = chatCompletion.requestId();\n```\n\nThis can be used to quickly log failing requests and report them back to OpenAI.\n\n## Error handling\n\nThe SDK throws custom unchecked exception types:\n\n- [`OpenAIServiceException`](openai-java-core/src/main/kotlin/com/openai/errors/OpenAIServiceException.kt): Base class for HTTP errors. 
See this table for which exception subclass is thrown for each HTTP status code:\n\n  | Status | Exception                                                                                                              |\n  | ------ | ---------------------------------------------------------------------------------------------------------------------- |\n  | 400    | [`BadRequestException`](openai-java-core/src/main/kotlin/com/openai/errors/BadRequestException.kt)                     |\n  | 401    | [`UnauthorizedException`](openai-java-core/src/main/kotlin/com/openai/errors/UnauthorizedException.kt)                 |\n  | 403    | [`PermissionDeniedException`](openai-java-core/src/main/kotlin/com/openai/errors/PermissionDeniedException.kt)         |\n  | 404    | [`NotFoundException`](openai-java-core/src/main/kotlin/com/openai/errors/NotFoundException.kt)                         |\n  | 422    | [`UnprocessableEntityException`](openai-java-core/src/main/kotlin/com/openai/errors/UnprocessableEntityException.kt)   |\n  | 429    | [`RateLimitException`](openai-java-core/src/main/kotlin/com/openai/errors/RateLimitException.kt)                       |\n  | 5xx    | [`InternalServerException`](openai-java-core/src/main/kotlin/com/openai/errors/InternalServerException.kt)             |\n  | others | [`UnexpectedStatusCodeException`](openai-java-core/src/main/kotlin/com/openai/errors/UnexpectedStatusCodeException.kt) |\n\n  [`SseException`](openai-java-core/src/main/kotlin/com/openai/errors/SseException.kt) is thrown for errors encountered during [SSE streaming](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events) after a successful initial HTTP response.\n\n- [`OpenAIIoException`](openai-java-core/src/main/kotlin/com/openai/errors/OpenAIIoException.kt): I/O networking errors.\n\n- [`OpenAIRetryableException`](openai-java-core/src/main/kotlin/com/openai/errors/OpenAIRetryableException.kt): Generic error indicating a failure that could be retried by the 
client.

- [`OpenAIInvalidDataException`](openai-java-core/src/main/kotlin/com/openai/errors/OpenAIInvalidDataException.kt): Failure to interpret successfully parsed data. For example, when accessing a property that's supposed to be required, but the API unexpectedly omitted it from the response.

- [`OpenAIException`](openai-java-core/src/main/kotlin/com/openai/errors/OpenAIException.kt): Base class for all exceptions. Most errors will result in one of the previously mentioned ones, but completely generic errors may be thrown using the base class.

## Pagination

The SDK defines methods that return paginated lists of results. It provides convenient ways to access the results either one page at a time or item-by-item across all pages.

### Auto-pagination

To iterate through all results across all pages, use the `autoPager()` method, which automatically fetches more pages as needed.

When using the synchronous client, the method returns an [`Iterable`](https://docs.oracle.com/javase/8/docs/api/java/lang/Iterable.html):

```java
import com.openai.models.finetuning.jobs.FineTuningJob;
import com.openai.models.finetuning.jobs.JobListPage;

JobListPage page = client.fineTuning().jobs().list();

// Process as an Iterable
for (FineTuningJob job : page.autoPager()) {
    System.out.println(job);
}

// Process as a Stream
page.autoPager()
    .stream()
    .limit(50)
    .forEach(job -> System.out.println(job));
```

When using the asynchronous client, the method returns an [`AsyncStreamResponse`](openai-java-core/src/main/kotlin/com/openai/core/http/AsyncStreamResponse.kt):

```java
import com.openai.core.http.AsyncStreamResponse;
import com.openai.models.finetuning.jobs.FineTuningJob;
import com.openai.models.finetuning.jobs.JobListPageAsync;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;

CompletableFuture<JobListPageAsync> pageFuture = client.async().fineTuning().jobs().list();

pageFuture.thenAccept(page -> page.autoPager().subscribe(job -> {
    System.out.println(job);
}));

// If you need to handle errors or completion of the stream
pageFuture.thenAccept(page -> page.autoPager().subscribe(new AsyncStreamResponse.Handler<>() {
    @Override
    public void onNext(FineTuningJob job) {
        System.out.println(job);
    }

    @Override
    public void onComplete(Optional<Throwable> error) {
        if (error.isPresent()) {
            System.out.println("Something went wrong!");
            throw new RuntimeException(error.get());
        } else {
            System.out.println("No more!");
        }
    }
}));

// Or use futures
pageFuture.thenAccept(page -> page.autoPager()
    .subscribe(job -> {
        System.out.println(job);
    })
    .onCompleteFuture()
    .whenComplete((unused, error) -> {
        if (error != null) {
            System.out.println("Something went wrong!");
            throw new RuntimeException(error);
        } else {
            System.out.println("No more!");
        }
    }));
```

### Manual pagination

To access individual page items and manually request the next page, use the `items()`,
`hasNextPage()`, and `nextPage()` methods:

```java
import com.openai.models.finetuning.jobs.FineTuningJob;
import com.openai.models.finetuning.jobs.JobListPage;

JobListPage page = client.fineTuning().jobs().list();
while (true) {
    for (FineTuningJob job : page.items()) {
        System.out.println(job);
    }

    if (!page.hasNextPage()) {
        break;
    }

    page = page.nextPage();
}
```

## Logging

The SDK uses the standard [OkHttp logging interceptor](https://github.com/square/okhttp/tree/master/okhttp-logging-interceptor).

Enable logging by setting the `OPENAI_LOG` environment variable to `info`:

```sh
export OPENAI_LOG=info
```

Or to
`debug` for more verbose logging:\n\n```sh\nexport OPENAI_LOG=debug\n```\n\n## ProGuard and R8\n\nAlthough the SDK uses reflection, it is still usable with [ProGuard](https://github.com/Guardsquare/proguard) and [R8](https://developer.android.com/topic/performance/app-optimization/enable-app-optimization) because `openai-java-core` is published with a [configuration file](openai-java-core/src/main/resources/META-INF/proguard/openai-java-core.pro) containing [keep rules](https://www.guardsquare.com/manual/configuration/usage).\n\nProGuard and R8 should automatically detect and use the published rules, but you can also manually copy the keep rules if necessary.\n\n## GraalVM\n\nAlthough the SDK uses reflection, it is still usable in [GraalVM](https://www.graalvm.org) because `openai-java-core` is published with [reachability metadata](https://www.graalvm.org/latest/reference-manual/native-image/metadata/).\n\nGraalVM should automatically detect and use the published metadata, but [manual configuration](https://www.graalvm.org/jdk24/reference-manual/native-image/overview/BuildConfiguration/) is also available.\n\n## Spring Boot\n\nIf you're using Spring Boot, then you can use the SDK's [Spring Boot starter](https://docs.spring.io/spring-boot/docs/2.7.18/reference/htmlsingle/#using.build-systems.starters) to simplify configuration and get set up quickly.\n\n### Installation\n\n\u003c!-- x-release-please-start-version --\u003e\n\n#### Gradle\n\n```kotlin\nimplementation(\"com.openai:openai-java-spring-boot-starter:4.27.0\")\n```\n\n#### Maven\n\n```xml\n\u003cdependency\u003e\n  \u003cgroupId\u003ecom.openai\u003c/groupId\u003e\n  \u003cartifactId\u003eopenai-java-spring-boot-starter\u003c/artifactId\u003e\n  \u003cversion\u003e4.27.0\u003c/version\u003e\n\u003c/dependency\u003e\n```\n\n\u003c!-- x-release-please-end --\u003e\n\n### Configuration\n\nThe [client's environment variable options](#client-configuration) can be configured in [`application.properties` or 
`application.yml`](https://docs.spring.io/spring-boot/how-to/properties-and-configuration.html).\n\n#### `application.properties`\n\n```properties\nopenai.base-url=https://api.openai.com/v1\nopenai.api-key=My API Key\nopenai.org-id=My Organization\nopenai.project-id=My Project\nopenai.webhook-secret=My Webhook Secret\n```\n\n#### `application.yml`\n\n```yaml\nopenai:\n  base-url: https://api.openai.com/v1\n  api-key: My API Key\n  org-id: My Organization\n  project-id: My Project\n  webhook-secret: My Webhook Secret\n```\n\n#### Other configuration\n\nConfigure any other client option by providing one or more instances of [`OpenAIClientCustomizer`](openai-java-spring-boot-starter/src/main/kotlin/com/openai/springboot/OpenAIClientCustomizer.kt). For example, here's how you'd set [`maxRetries`](#retries):\n\n```java\nimport com.openai.springboot.OpenAIClientCustomizer;\nimport org.springframework.context.annotation.Bean;\nimport org.springframework.context.annotation.Configuration;\n\n@Configuration\npublic class OpenAIConfig {\n    @Bean\n    public OpenAIClientCustomizer customizer() {\n        return builder -\u003e builder.maxRetries(3);\n    }\n}\n```\n\n### Usage\n\n[Inject](https://docs.spring.io/spring-framework/reference/core/beans/dependencies/factory-collaborators.html) [`OpenAIClient`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClient.kt) anywhere and start using it!\n\n## Jackson\n\nThe SDK depends on [Jackson](https://github.com/FasterXML/jackson) for JSON serialization/deserialization. It is compatible with version 2.13.4 or higher, but depends on version 2.18.2 by default.\n\nThe SDK throws an exception if it detects an incompatible Jackson version at runtime (e.g. 
if the default version was overridden in your Maven or Gradle config).\n\nIf the SDK throws an exception but you're _certain_ the version is compatible, disable the version check using the `checkJacksonVersionCompatibility` method on [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) or [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt).\n\n\u003e [!CAUTION]\n\u003e We make no guarantee that the SDK works correctly when the Jackson version check is disabled.\n\nAlso note that there are bugs in older Jackson versions that can affect the SDK. We don't work around all Jackson bugs ([example](https://github.com/FasterXML/jackson-databind/issues/3240)) and expect users to upgrade Jackson for those instead.\n\n## Microsoft Azure\n\nTo use this library with [Azure OpenAI](https://learn.microsoft.com/azure/ai-services/openai/overview), use the same OpenAI client builder but with Azure-specific configuration.\n\n```java\nOpenAIClient client = OpenAIOkHttpClient.builder()\n        // Gets the API key and endpoint from the `AZURE_OPENAI_KEY` and `OPENAI_BASE_URL` environment variables, respectively\n        .fromEnv()\n        // Set the Azure Entra ID credential\n        .credential(BearerTokenCredential.create(AuthenticationUtil.getBearerTokenSupplier(\n                new DefaultAzureCredentialBuilder().build(), \"https://cognitiveservices.azure.com/.default\")))\n        .build();\n```\n\nSee the complete Azure OpenAI example in the [`openai-java-example`](openai-java-example/src/main/java/com/openai/example/AzureEntraIdExample.java) directory. 
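If you authenticate with an API key instead of Entra ID, the same values can be set explicitly on the builder rather than via environment variables. The following is a sketch, not taken from this README: the `baseUrl` and `apiKey` builder methods are assumed from the [client configuration](#client-configuration) options, and the endpoint URL is a placeholder for your Azure resource.

```java
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;

// Sketch (assumed builder methods; placeholder endpoint): configure the
// Azure endpoint and API key explicitly instead of reading them from the
// `OPENAI_BASE_URL` and `AZURE_OPENAI_KEY` environment variables via `fromEnv()`.
OpenAIClient client = OpenAIOkHttpClient.builder()
        .baseUrl("https://YOUR-RESOURCE.openai.azure.com/openai/v1")
        .apiKey(System.getenv("AZURE_OPENAI_KEY"))
        .build();
```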
The other examples in the directory also work with Azure as long as the client is configured to use it.\n\n### Optional: URL path mode configuration\n\nThe [`ClientOptions`](openai-java-core/src/main/kotlin/com/openai/core/ClientOptions.kt) can be configured to treat Azure OpenAI endpoint URLs differently, depending on your service setup. The default is [`AzureUrlPathMode.AUTO`](openai-java-core/src/main/kotlin/com/openai/azure/AzureUrlPathMode.kt). The values behave as follows:\n- `AzureUrlPathMode.LEGACY`: forces the deployment or model name into the URL path.\n- `AzureUrlPathMode.UNIFIED`: for newer endpoints ending in `/openai/v1`, the service behavior matches OpenAI's, so [`AzureOpenAIServiceVersion`](openai-java-core/src/main/kotlin/com/openai/azure/AzureOpenAIServiceVersion.kt) becomes optional and the model is passed in the request object.\n- `AzureUrlPathMode.AUTO` (the default): automatically detects the path mode based on the base URL.\n\n## Network options\n\n### Retries\n\nThe SDK automatically retries 2 times by default, with a short exponential backoff between requests.\n\nOnly the following error types are retried:\n\n- Connection errors (for example, due to a network connectivity problem)\n- 408 Request Timeout\n- 409 Conflict\n- 429 Rate Limit\n- 5xx Internal\n\nThe API may also explicitly instruct the SDK to retry or not retry a request.\n\nTo set a custom number of retries, configure the client using the `maxRetries` method:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    .maxRetries(4)\n    .build();\n```\n\n### Timeouts\n\nRequests time out after 10 minutes by default.\n\nTo set a custom timeout, configure the method call using the `timeout` method:\n\n```java\nimport com.openai.core.RequestOptions;\nimport com.openai.models.chat.completions.ChatCompletion;\nimport java.time.Duration;\n\nChatCompletion chatCompletion = 
client.chat().completions().create(\n  params, RequestOptions.builder().timeout(Duration.ofSeconds(30)).build()\n);\n```\n\nOr configure the default for all method calls at the client level:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport java.time.Duration;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    .timeout(Duration.ofSeconds(30))\n    .build();\n```\n\n### Proxies\n\nTo route requests through a proxy, configure the client using the `proxy` method:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport java.net.InetSocketAddress;\nimport java.net.Proxy;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    .proxy(new Proxy(\n      Proxy.Type.HTTP, new InetSocketAddress(\n        \"example.com\", 8080\n      )\n    ))\n    .build();\n```\n\n### Connection pooling\n\nTo customize the underlying OkHttp connection pool, configure the client using the `maxIdleConnections` and `keepAliveDuration` methods:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport java.time.Duration;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    // If `maxIdleConnections` is set, then `keepAliveDuration` must be set, and vice versa.\n    .maxIdleConnections(10)\n    .keepAliveDuration(Duration.ofMinutes(2))\n    .build();\n```\n\nIf both options are unset, OkHttp's default connection pool settings are used.\n\n### HTTPS\n\n\u003e [!NOTE]\n\u003e Most applications should not call these methods, and instead use the system defaults. 
The defaults include\n\u003e special optimizations that can be lost if the implementations are modified.\n\nTo configure how HTTPS connections are secured, configure the client using the `sslSocketFactory`, `trustManager`, and `hostnameVerifier` methods:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    // If `sslSocketFactory` is set, then `trustManager` must be set, and vice versa.\n    .sslSocketFactory(yourSSLSocketFactory)\n    .trustManager(yourTrustManager)\n    .hostnameVerifier(yourHostnameVerifier)\n    .build();\n```\n\n### Custom HTTP client\n\nThe SDK consists of three artifacts:\n\n- `openai-java-core`\n  - Contains core SDK logic\n  - Does not depend on [OkHttp](https://square.github.io/okhttp)\n  - Exposes [`OpenAIClient`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClient.kt), [`OpenAIClientAsync`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsync.kt), [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt), and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), all of which can work with any HTTP client\n- `openai-java-client-okhttp`\n  - Depends on [OkHttp](https://square.github.io/okhttp)\n  - Exposes [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) and [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt), which provide a way to construct [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt) and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), respectively, using OkHttp\n- `openai-java`\n  - Depends on and exposes the APIs of both `openai-java-core` and 
`openai-java-client-okhttp`\n  - Does not have its own logic\n\nThis structure allows replacing the SDK's default HTTP client without pulling in unnecessary dependencies.\n\n#### Customized [`OkHttpClient`](https://square.github.io/okhttp/3.x/okhttp/okhttp3/OkHttpClient.html)\n\n\u003e [!TIP]\n\u003e Try the available [network options](#network-options) before replacing the default client.\n\nTo use a customized `OkHttpClient`:\n\n1. Replace your [`openai-java` dependency](#installation) with `openai-java-core`\n2. Copy `openai-java-client-okhttp`'s [`OkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OkHttpClient.kt) class into your code and customize it\n3. Construct [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt) or [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), similarly to [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) or [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt), using your customized client\n\n#### Completely custom HTTP client\n\nTo use a completely custom HTTP client:\n\n1. Replace your [`openai-java` dependency](#installation) with `openai-java-core`\n2. Write a class that implements the [`HttpClient`](openai-java-core/src/main/kotlin/com/openai/core/http/HttpClient.kt) interface\n3. 
Construct [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt) or [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), similarly to [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) or [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt), using your new client class\n\n## Undocumented API functionality\n\nThe SDK is typed for convenient usage of the documented API. However, it also supports working with undocumented or not yet supported parts of the API.\n\n### Parameters\n\nTo set undocumented parameters, call the `putAdditionalHeader`, `putAdditionalQueryParam`, or `putAdditionalBodyProperty` methods on any `Params` class:\n\n```java\nimport com.openai.core.JsonValue;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .putAdditionalHeader(\"Secret-Header\", \"42\")\n    .putAdditionalQueryParam(\"secret_query_param\", \"42\")\n    .putAdditionalBodyProperty(\"secretProperty\", JsonValue.from(\"42\"))\n    .build();\n```\n\nThese can be accessed on the built object later using the `_additionalHeaders()`, `_additionalQueryParams()`, and `_additionalBodyProperties()` methods.\n\nTo set undocumented parameters on _nested_ headers, query params, or body classes, call the `putAdditionalProperty` method on the nested class:\n\n```java\nimport com.openai.core.JsonValue;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .responseFormat(ChatCompletionCreateParams.ResponseFormat.builder()\n        .putAdditionalProperty(\"secretProperty\", JsonValue.from(\"42\"))\n        .build())\n    .build();\n```\n\nThese properties can be accessed on the nested 
built object later using the `_additionalProperties()` method.\n\nTo set a documented parameter or property to an undocumented or not yet supported _value_, pass a [`JsonValue`](openai-java-core/src/main/kotlin/com/openai/core/Values.kt) object to its setter:\n\n```java\nimport com.openai.core.JsonValue;\nimport com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .messages(JsonValue.from(42))\n    .model(ChatModel.GPT_5_2)\n    .build();\n```\n\nThe most straightforward way to create a [`JsonValue`](openai-java-core/src/main/kotlin/com/openai/core/Values.kt) is using its `from(...)` method:\n\n```java\nimport com.openai.core.JsonValue;\nimport java.util.List;\nimport java.util.Map;\n\n// Create primitive JSON values\nJsonValue nullValue = JsonValue.from(null);\nJsonValue booleanValue = JsonValue.from(true);\nJsonValue numberValue = JsonValue.from(42);\nJsonValue stringValue = JsonValue.from(\"Hello World!\");\n\n// Create a JSON array value equivalent to `[\"Hello\", \"World\"]`\nJsonValue arrayValue = JsonValue.from(List.of(\n  \"Hello\", \"World\"\n));\n\n// Create a JSON object value equivalent to `{ \"a\": 1, \"b\": 2 }`\nJsonValue objectValue = JsonValue.from(Map.of(\n  \"a\", 1,\n  \"b\", 2\n));\n\n// Create an arbitrarily nested JSON equivalent to:\n// {\n//   \"a\": [1, 2],\n//   \"b\": [3, 4]\n// }\nJsonValue complexValue = JsonValue.from(Map.of(\n  \"a\", List.of(\n    1, 2\n  ),\n  \"b\", List.of(\n    3, 4\n  )\n));\n```\n\nNormally a `Builder` class's `build` method will throw [`IllegalStateException`](https://docs.oracle.com/javase/8/docs/api/java/lang/IllegalStateException.html) if any required parameter or property is unset.\n\nTo forcibly omit a required parameter or property, pass [`JsonMissing`](openai-java-core/src/main/kotlin/com/openai/core/Values.kt):\n\n```java\nimport com.openai.core.JsonMissing;\nimport 
com.openai.models.ChatModel;\nimport com.openai.models.chat.completions.ChatCompletionCreateParams;\n\nChatCompletionCreateParams params = ChatCompletionCreateParams.builder()\n    .model(ChatModel.GPT_5_4)\n    .messages(JsonMissing.of())\n    .build();\n```\n\n### Response properties\n\nTo access undocumented response properties, call the `_additionalProperties()` method:\n\n```java\nimport com.openai.core.JsonValue;\nimport java.util.Map;\n\nMap\u003cString, JsonValue\u003e additionalProperties = client.chat().completions().create(params)._additionalProperties();\nJsonValue secretPropertyValue = additionalProperties.get(\"secretProperty\");\n\nString result = secretPropertyValue.accept(new JsonValue.Visitor\u003c\u003e() {\n    @Override\n    public String visitNull() {\n        return \"It's null!\";\n    }\n\n    @Override\n    public String visitBoolean(boolean value) {\n        return \"It's a boolean!\";\n    }\n\n    @Override\n    public String visitNumber(Number value) {\n        return \"It's a number!\";\n    }\n\n    // Other methods include `visitMissing`, `visitString`, `visitArray`, and `visitObject`\n    // The default implementation of each unimplemented method delegates to `visitDefault`, which throws by default, but can also be overridden\n});\n```\n\nTo access a property's raw JSON value, which may be undocumented, call its `_` prefixed method:\n\n```java\nimport com.openai.core.JsonField;\nimport com.openai.models.chat.completions.ChatCompletionMessageParam;\nimport java.util.List;\nimport java.util.Optional;\n\nJsonField\u003cList\u003cChatCompletionMessageParam\u003e\u003e messages = client.chat().completions().create(params)._messages();\n\nif (messages.isMissing()) {\n  // The property is absent from the JSON response\n} else if (messages.isNull()) {\n  // The property was set to literal null\n} else {\n  // Check if value was provided as a string\n  // Other methods include `asNumber()`, `asBoolean()`, etc.\n  Optional\u003cString\u003e jsonString = 
messages.asString();\n\n  // Try to deserialize into a custom type\n  MyClass myObject = messages.asUnknown().orElseThrow().convert(MyClass.class);\n}\n```\n\n### Response validation\n\nIn rare cases, the API may return a response that doesn't match the expected type. For example, the SDK may expect a property to contain a `String`, but the API could return something else.\n\nBy default, the SDK will not throw an exception in this case. It will throw [`OpenAIInvalidDataException`](openai-java-core/src/main/kotlin/com/openai/errors/OpenAIInvalidDataException.kt) only if you directly access the property.\n\nIf you would prefer to check that the response is completely well-typed upfront, then either call `validate()`:\n\n```java\nimport com.openai.models.chat.completions.ChatCompletion;\n\nChatCompletion chatCompletion = client.chat().completions().create(params).validate();\n```\n\nOr configure the method call to validate the response using the `responseValidation` method:\n\n```java\nimport com.openai.core.RequestOptions;\nimport com.openai.models.chat.completions.ChatCompletion;\n\nChatCompletion chatCompletion = client.chat().completions().create(\n  params, RequestOptions.builder().responseValidation(true).build()\n);\n```\n\nOr configure the default for all method calls at the client level:\n\n```java\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\n\nOpenAIClient client = OpenAIOkHttpClient.builder()\n    .fromEnv()\n    .responseValidation(true)\n    .build();\n```\n\n## FAQ\n\n### Why don't you use plain `enum` classes?\n\nJava `enum` classes are not trivially [forwards compatible](https://www.stainless.com/blog/making-java-enums-forwards-compatible). 
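As a minimal plain-Java illustration (not SDK code; the enum and its values are hypothetical), `Enum.valueOf` throws for any constant the compiled enum doesn't know about:

```java
public class EnumForwardsCompat {
    // An enum frozen at compile time cannot represent values
    // the server starts returning after the SDK shipped.
    enum JobStatus { QUEUED, RUNNING, SUCCEEDED }

    public static void main(String[] args) {
        String wireValue = "CANCELLED"; // hypothetical value added to the API later
        try {
            System.out.println(JobStatus.valueOf(wireValue));
        } catch (IllegalArgumentException e) {
            // `valueOf` throws because the compiled enum has no such constant.
            System.out.println("Unknown enum value: " + wireValue);
        }
    }
}
```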
Using them in the SDK could cause runtime exceptions if the API is updated to respond with a new enum value.\n\n### Why do you represent fields using `JsonField\u003cT\u003e` instead of just plain `T`?\n\nUsing `JsonField\u003cT\u003e` enables a few features:\n\n- Allowing usage of [undocumented API functionality](#undocumented-api-functionality)\n- Lazily [validating the API response against the expected shape](#response-validation)\n- Representing absent vs explicitly null values\n\n### Why don't you use [`data` classes](https://kotlinlang.org/docs/data-classes.html)?\n\nIt is not [backwards compatible to add new fields to a data class](https://kotlinlang.org/docs/api-guidelines-backward-compatibility.html#avoid-using-data-classes-in-your-api) and we don't want to introduce a breaking change every time we add a field to a class.\n\n### Why don't you use checked exceptions?\n\nChecked exceptions are widely considered a mistake in the Java programming language. In fact, they were omitted from Kotlin for this reason.\n\nChecked exceptions:\n\n- Are verbose to handle\n- Encourage error handling at the wrong level of abstraction, where nothing can be done about the error\n- Are tedious to propagate due to the [function coloring problem](https://journal.stuffwithstuff.com/2015/02/01/what-color-is-your-function)\n- Don't play well with lambdas (also due to the function coloring problem)\n\n## Semantic versioning\n\nThis package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions, though certain backwards-incompatible changes may be released as minor versions:\n\n1. Changes to library internals which are technically public but not intended or documented for external use. _(Please open a GitHub issue to let us know if you are relying on such internals.)_\n2. 
Changes that we do not expect to impact the vast majority of users in practice.\n\nWe take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.\n\nWe are keen for your feedback; please open an [issue](https://www.github.com/openai/openai-java/issues) with questions, bugs, or suggestions.\n