{"id":17167954,"url":"https://github.com/akr4/openai-api","last_synced_at":"2025-03-24T18:23:10.219Z","repository":{"id":152517998,"uuid":"619343065","full_name":"akr4/openai-api","owner":"akr4","description":"Rust client for OpenAI API supporting streaming mode of the chat completion endpoint.","archived":false,"fork":false,"pushed_at":"2024-05-14T10:58:36.000Z","size":16,"stargazers_count":0,"open_issues_count":0,"forks_count":1,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-01-29T23:12:57.937Z","etag":null,"topics":["openai-api","rust"],"latest_commit_sha":null,"homepage":"","language":"Rust","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/akr4.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-03-27T00:08:07.000Z","updated_at":"2024-05-14T10:58:39.000Z","dependencies_parsed_at":null,"dependency_job_id":"9fa10fdb-acc8-4db6-b1e2-a9026799c887","html_url":"https://github.com/akr4/openai-api","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/akr4%2Fopenai-api","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/akr4%2Fopenai-api/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/akr4%2Fopenai-api/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/akr4%2Fopenai-api/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/akr4","download_url":"https://codeload.github.com/akr4/openai-api
/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":245325781,"owners_count":20596914,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["openai-api","rust"],"created_at":"2024-10-14T23:10:35.331Z","updated_at":"2025-03-24T18:23:10.193Z","avatar_url":"https://github.com/akr4.png","language":"Rust","readme":"# openai-api\n\nRust client for OpenAI API supporting streaming mode of the chat completion endpoint.\n\n## Supported Endpoints\n\n- Completion: POST https://api.openai.com/v1/completions\n- Chat Completion: POST https://api.openai.com/v1/chat/completions (with streaming mode support)\n- Embeddings: POST https://api.openai.com/v1/embeddings\n\n## Getting Started\n\nAdd the following to your `Cargo.toml`:\n\n```toml:file=Cargo.toml\n[dependencies]\nopenai-api = { git = \"https://github.com/akr4/openai-api.git\" }\n```\n\nPut this in your crate root:\n\n```rust\nuse std::env;\nuse openai_api::chat;\n\n#[tokio::main]\nasync fn main() {\n    // create request\n    let request = chat::CompletionRequest {\n        model: chat::Model::Gpt35Turbo,\n        temperature: Some(1.0),\n        messages: vec![\n            chat::Message {\n                role: chat::MessageRole::User,\n                content: \"Hello\".to_string(),\n            }\n        ],\n    };\n\n    // call completion endpoint\n    let response = chat::completion(\n        env::var(\"OPENAI_API_KEY\").expect(\"environment variable OPENAI_API_KEY is not found.\"),\n        \u0026request).await.unwrap();\n\n    // show response text\n    println!(\"{}\", 
response.choices[0].message.content);\n}\n```\n\n### Streaming Mode\n\n```rust\nuse std::env;\nuse futures_util::StreamExt;\nuse openai_api::{chat, chat_stream};\n\n#[tokio::main]\nasync fn main() {\n    // create request\n    let request = chat::CompletionRequest {\n        model: chat::Model::Gpt35Turbo,\n        temperature: Some(1.0),\n        messages: vec![\n            chat::Message {\n                role: chat::MessageRole::User,\n                content: \"Hello\".to_string(),\n            }\n        ],\n    };\n\n    // call completion endpoint\n    let mut response = chat_stream::completion(\n        env::var(\"OPENAI_API_KEY\").expect(\"environment variable OPENAI_API_KEY is not found.\"),\n        \u0026request).await.unwrap();\n\n    // receive response\n    let mut response_text = String::new();\n    while let Some(response) = response.next().await {\n        match response {\n            Ok(response) =\u003e {\n                if let Some(text_chunk) = response.choices[0].delta.content.clone() {\n                    print!(\"{text_chunk}\");\n                    response_text.push_str(\u0026text_chunk);\n                }\n            }\n            Err(err) =\u003e {\n                println!(\"{:?}\", err);\n                break;\n            }\n        }\n    }\n\n    // show response text\n    println!(\"\\nResponse Text: {response_text}\");\n}\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fakr4%2Fopenai-api","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fakr4%2Fopenai-api","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fakr4%2Fopenai-api/lists"}