https://github.com/dongri/llm-api-rs
A Rust library unifying multiple LLM providers
- Host: GitHub
- URL: https://github.com/dongri/llm-api-rs
- Owner: dongri
- Created: 2025-01-29T14:08:08.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-01-30T10:34:44.000Z (3 months ago)
- Last Synced: 2025-03-28T12:58:59.588Z (about 1 month ago)
- Topics: anthropic, deepseek, gemini, ollama, openai, xai
- Language: Rust
- Homepage: https://crates.io/crates/llm-api-rs
- Size: 18.6 KB
- Stars: 2
- Watchers: 1
- Forks: 1
- Open Issues: 2
Metadata Files:
- Readme: README.md
README
# llm-api-rs
llm-api-rs is a Rust library that lets you use multiple LLM providers in a single project: OpenAI, Anthropic (Claude), DeepSeek, xAI, and Google (Gemini). You can easily create chat or text completion requests without duplicating structures and crates for each provider.
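The idea of unifying providers behind one interface can be sketched in plain Rust. The snippet below is a self-contained illustration, not the crate's actual API: the type and trait names (`LlmProvider`, `ChatCompletionRequest`, `ChatMessage`) mirror those seen in the crate's example, but the `MockProvider`, the synchronous signature, and the `Result<String, String>` error type are simplifications for demonstration.

```rust
// Shared request/response types used by every provider.
struct ChatMessage {
    role: String,
    content: String,
}

struct ChatCompletionRequest {
    model: String,
    messages: Vec<ChatMessage>,
}

// One trait that every backend (OpenAI, Anthropic, ...) would implement,
// so calling code can swap providers without changing its request logic.
trait LlmProvider {
    fn chat_completion(&self, request: ChatCompletionRequest) -> Result<String, String>;
}

// A mock provider standing in for a real HTTP-backed one.
struct MockProvider;

impl LlmProvider for MockProvider {
    fn chat_completion(&self, request: ChatCompletionRequest) -> Result<String, String> {
        // Echo the last message back, tagged with the model name.
        let last = request.messages.last().ok_or("empty message list")?;
        Ok(format!("[{}] echo: {}", request.model, last.content))
    }
}

fn main() {
    let client = MockProvider;
    let request = ChatCompletionRequest {
        model: "mock-model".to_string(),
        messages: vec![ChatMessage {
            role: "user".to_string(),
            content: "Hello!".to_string(),
        }],
    };
    match client.chat_completion(request) {
        Ok(text) => println!("{}", text),
        Err(e) => eprintln!("Error: {}", e),
    }
}
```

Because callers depend only on the trait, swapping `MockProvider` for another implementor changes nothing downstream; the real crate applies the same pattern with async methods and richer response types.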
## Installation
`Cargo.toml`:
```toml
[dependencies]
llm-api-rs = "0.1.0"
```

## Usage
See examples in the `examples` directory.
## Example
```rust
use llm_api_rs::{
    core::{ChatCompletionRequest, ChatMessage},
    providers::openai::OpenAI,
    LlmProvider,
};

#[tokio::main]
async fn main() {
    // Read the API key from the environment rather than hard-coding it.
    let api_key =
        std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY environment variable not set");
    let client = OpenAI::new(api_key);

    // Build a provider-agnostic chat completion request.
    let request = ChatCompletionRequest {
        model: "gpt-3.5-turbo".to_string(),
        messages: vec![
            ChatMessage {
                role: "system".to_string(),
                content: "You are a helpful assistant.".to_string(),
            },
            ChatMessage {
                role: "user".to_string(),
                content: "Hello!".to_string(),
            },
        ],
        temperature: Some(0.7),
        max_tokens: Some(50),
    };

    // Send the request and print each returned choice.
    match client.chat_completion(request).await {
        Ok(response) => {
            for choice in response.choices {
                println!("Response: {}", choice.message.content);
            }
        }
        Err(e) => eprintln!("Error: {}", e),
    }
}
```