https://github.com/lowpolycat1/rusty_ollama
Rusty Ollama is a Rust client library for interacting with the Ollama API, providing both synchronous and streaming interfaces for working with large language models.
- Host: GitHub
- URL: https://github.com/lowpolycat1/rusty_ollama
- Owner: LowPolyCat1
- License: MIT
- Created: 2025-02-01T20:29:11.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2025-02-10T18:15:13.000Z (12 months ago)
- Last Synced: 2025-03-09T10:07:09.168Z (11 months ago)
- Topics: ollama, ollama-api, ollama-client, ollama-rust, rust, rust-api, rust-crate, rust-lang
- Language: Rust
- Homepage:
- Size: 142 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: license.txt
## README
### Table of Contents
- About The Project
  - Built With
- Getting Started
- Usage
- Roadmap
- Contributing
- License
- Contact
- Acknowledgments
## About The Project
Rusty Ollama is a Rust client library for interacting with the Ollama API, providing both synchronous and streaming interfaces for working with large language models.
Features:
- Simple API for text generation
- Streaming responses for real-time processing
- Context management for conversation history
- Configurable request options
- Error handling for API interactions
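Under the hood, these features wrap Ollama's HTTP API. For orientation, here is a minimal sketch of the kind of call the client makes for you, using reqwest and serde_json directly against Ollama's documented `/api/generate` endpoint (the model name is a placeholder, and rusty_ollama's internals may of course differ):
```rust
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Ollama's generate endpoint takes a model name and a prompt;
    // with "stream": false it replies with a single JSON object.
    let body = json!({
        "model": "llama3", // placeholder: any model pulled into Ollama
        "prompt": "Why is the sky blue?",
        "stream": false
    });

    let reply: Value = reqwest::Client::new()
        .post("http://localhost:11434/api/generate") // Ollama's default address
        .json(&body)
        .send()
        .await?
        .json()
        .await?;

    // The generated text lives in "response"; "context" carries the
    // conversation state that can be sent back with the next request.
    println!("{}", reply["response"].as_str().unwrap_or_default());
    Ok(())
}
```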
### Built With
[![Rust][Rust-shield]][Rust-url]
[![Reqwest][Reqwest-shield]][Reqwest-url]
[![Tokio][Tokio-shield]][Tokio-url]
[![Serde][Serde-shield]][Serde-url]
## Getting Started
### Prerequisites
- Rust 1.60+
- Cargo
- Ollama server running locally (default: `http://localhost:11434`)
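To confirm the server is reachable before wiring up the client, a plain GET against the Ollama root returns a short status message (a minimal sketch using reqwest, which the library already depends on):
```rust
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // Ollama answers GET / with "Ollama is running" when it is up.
    let status = reqwest::get("http://localhost:11434").await?.text().await?;
    println!("{status}");
    Ok(())
}
```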
### Installation
Add to your `Cargo.toml`:
```toml
[dependencies]
rusty_ollama = { git = "https://github.com/lowpolycat1/rusty_ollama" }
```
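The `git` form above pulls straight from this repository; if the crate is published to crates.io, a plain version requirement such as `rusty_ollama = "0.1.1"` should work as well.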
## Usage
### Basic Generation
```rust
use rusty_ollama::{Ollama, OllamaError};

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let response = ollama.generate("Why is the sky blue?").await?;
    println!("Response: {}", response.response);
    Ok(())
}
```
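The `response.response` field carries the generated text. For orientation, Ollama's non-streaming `/api/generate` reply looks roughly like the struct below (field names follow Ollama's API; this is a sketch for debugging, not rusty_ollama's exact response type):
```rust
use serde::Deserialize;

// Rough shape of Ollama's non-streaming generate reply.
#[allow(dead_code)]
#[derive(Debug, Deserialize)]
struct GenerateReply {
    model: String,      // which model produced the output
    created_at: String, // RFC 3339 timestamp
    response: String,   // the generated text
    done: bool,         // true once generation has finished
    #[serde(default)]
    context: Vec<i64>,  // opaque state for follow-up requests
}
```
The `context` vector is what conversation history builds on: send it back with the next prompt and the model continues where it left off.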
### Streaming Responses
```rust
use rusty_ollama::{Ollama, OllamaError};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let mut stream = ollama.stream_generate("Tell me a story about").await?;
    while let Some(response) = stream.next().await {
        match response {
            Ok(chunk) => print!("{}", chunk.response),
            Err(e) => eprintln!("Error: {}", e),
        }
    }
    Ok(())
}
```
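When streaming, Ollama emits newline-delimited JSON: each chunk carries a fragment of the text in its `response` field, and the final chunk arrives with `done: true` plus the `context` for the whole exchange, which is presumably what the library's context management builds on.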
_For more examples, see the [examples](/examples/examples.md)._
## Roadmap
- [x] Basic text generation
- [x] Streaming responses
- [x] Context management
- [x] Async trait implementations
- [ ] Model management
- [ ] Advanced configuration options (Modelfile)
- [ ] Image processing
- [ ] Comprehensive documentation
- [ ] More error handling variants
See the [open issues](https://github.com/lowpolycat1/rusty_ollama/issues) for a full list of proposed features.
## Contributing
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
Don't forget to give the project a star! Thanks again!
1. Fork the Project
2. Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
3. Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the Branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## License
Distributed under the MIT License. See `LICENSE.txt` for more information.
## Contact
lowpolycat1 - @acrylic_spark (Discord)
Project Link: [https://github.com/lowpolycat1/rusty_ollama](https://github.com/lowpolycat1/rusty_ollama)
## Acknowledgments
- [Ollama](https://ollama.ai) for the AI platform
- [Reqwest](https://github.com/seanmonstar/reqwest) for the HTTP client
- [Tokio](https://tokio.rs) for the async runtime
- [Serde](https://serde.rs) for serialization
[Rust-shield]: https://img.shields.io/badge/Rust-000000?style=for-the-badge&logo=rust&logoColor=white
[Rust-url]: https://www.rust-lang.org/
[Reqwest-shield]: https://img.shields.io/badge/Reqwest-000000?style=for-the-badge&logo=reqwest&logoColor=white
[Reqwest-url]: https://docs.rs/reqwest/latest/reqwest/
[Tokio-shield]: https://img.shields.io/badge/Tokio-000000?style=for-the-badge&logo=tokio&logoColor=white
[Tokio-url]: https://tokio.rs/
[Serde-shield]: https://img.shields.io/badge/Serde-000000?style=for-the-badge&logo=serde&logoColor=white
[Serde-url]: https://serde.rs/