
# Rusty Ollama


A Rust client for the Ollama API


[Explore the docs](https://github.com/lowpolycat1/rusty_ollama) · [Report Bug](https://github.com/lowpolycat1/rusty_ollama/issues) · [Request Feature](https://github.com/lowpolycat1/rusty_ollama/issues)


## Table of Contents

1. [About The Project](#about-the-project)
   - [Built With](#built-with)
2. [Getting Started](#getting-started)
3. [Usage](#usage)
4. [Roadmap](#roadmap)
5. [Contributing](#contributing)
6. [License](#license)
7. [Contact](#contact)
8. [Acknowledgments](#acknowledgments)

## About The Project

Rusty Ollama is a Rust client library for interacting with the Ollama API, providing both synchronous and streaming interfaces for working with large language models.

Features:

- Simple API for text generation
- Streaming responses for real-time processing
- Context management for conversation history (see the sketch after this list)
- Configurable request options
- Error handling for API interactions
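
The sketch below illustrates the context-management idea. It is a minimal, assumption-laden example rather than the library's documented API: it presumes that a single `Ollama` instance carries the conversation context between successive `generate` calls, as the feature list suggests.

```rust
use rusty_ollama::{Ollama, OllamaError};

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;

    // First turn: the response (and, presumably, its context) is tracked by the client.
    let first = ollama.generate("My favourite colour is teal.").await?;
    println!("{}", first.response);

    // Second turn: if the client threads the context through, the model can refer back.
    let second = ollama.generate("What is my favourite colour?").await?;
    println!("{}", second.response);

    Ok(())
}
```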

(back to top)

### Built With

[![Rust][Rust-shield]][Rust-url]
[![Reqwest][Reqwest-shield]][Reqwest-url]
[![Tokio][Tokio-shield]][Tokio-url]
[![Serde][Serde-shield]][Serde-url]

(back to top)

## Getting Started

### Prerequisites

- Rust 1.60+
- Cargo
- Ollama server running locally (default: `http://localhost:11434`); see the optional connectivity check below
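
To confirm the local server is reachable before wiring up the client, you can hit the default address directly with `reqwest`. This is an optional sketch, assuming the standard Ollama behaviour of answering `GET /` with "Ollama is running"; it requires `reqwest` and `tokio` in your own `Cargo.toml`.

```rust
// Optional connectivity check against a locally running Ollama server.
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let body = reqwest::get("http://localhost:11434").await?.text().await?;
    println!("{body}"); // typically prints "Ollama is running"
    Ok(())
}
```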

### Installation

Add to your `Cargo.toml`:

```toml
[dependencies]
# Git dependency; pin a tag or rev matching the release you want (e.g. 0.1.1)
rusty_ollama = { git = "https://github.com/lowpolycat1/rusty_ollama" }
```

## Usage

### Basic Generation

```rust
use rusty_ollama::{Ollama, OllamaError};

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let response = ollama.generate("Why is the sky blue?").await?;
    println!("Response: {}", response.response);
    Ok(())
}
```
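
If you prefer not to propagate errors with `?`, the same call can be wrapped so failures are reported instead of aborting `main`. A small sketch; it only relies on `OllamaError` implementing `Debug`, which the example above already assumes by returning it from `main`.

```rust
use rusty_ollama::{Ollama, OllamaError};

async fn ask(prompt: &str) -> Result<String, OllamaError> {
    let mut ollama = Ollama::create_default()?;
    Ok(ollama.generate(prompt).await?.response)
}

#[tokio::main]
async fn main() {
    // Handle the error explicitly instead of propagating it out of main.
    match ask("Why is the sky blue?").await {
        Ok(text) => println!("Response: {text}"),
        Err(e) => eprintln!("Ollama request failed: {e:?}"),
    }
}
```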

### Streaming Responses

```rust
use rusty_ollama::{Ollama, OllamaError};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let mut stream = ollama.stream_generate("Tell me a story about").await?;

    while let Some(response) = stream.next().await {
        match response {
            Ok(chunk) => print!("{}", chunk.response),
            Err(e) => eprintln!("Error: {}", e),
        }
    }

    Ok(())
}
```
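
Because each streamed item carries only a fragment of the text, it is often useful to accumulate the chunks into the complete response. A sketch under the same assumptions as the example above (the stream yields `Result`s whose `Ok` value exposes a `response` field):

```rust
use rusty_ollama::{Ollama, OllamaError};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let mut stream = ollama.stream_generate("Tell me a story about").await?;

    // Collect the streamed fragments into one String.
    let mut full_response = String::new();
    while let Some(chunk) = stream.next().await {
        full_response.push_str(&chunk?.response);
    }
    println!("{full_response}");

    Ok(())
}
```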

_For more examples, see the [examples](/examples/examples.md)._

(back to top)

## Roadmap

- [x] Basic text generation
- [x] Streaming responses
- [x] Context management
- [x] Async trait implementations
- [ ] Model management
- [ ] Advanced configuration options (Modelfile)
- [ ] Image processing
- [ ] Comprehensive documentation
- [ ] More error handling variants

See the [open issues](https://github.com/lowpolycat1/rusty_ollama/issues) for a full list of proposed features.

(back to top)

## Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
Don't forget to give the project a star! Thanks again!

1. Fork the Project
2. Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
3. Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the Branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request

(back to top)

### Top contributors


[![Contributors](https://contrib.rocks/image?repo=lowpolycat1/rusty_ollama)](https://github.com/lowpolycat1/rusty_ollama/graphs/contributors)

(back to top)

## License

Distributed under the MIT License. See `LICENSE.txt` for more information.

(back to top)

## Contact

lowpolycat1 - @acrylic_spark (Discord)

Project Link: [https://github.com/lowpolycat1/rusty_ollama](https://github.com/lowpolycat1/rusty_ollama)

(back to top)

## Acknowledgments

- [Ollama](https://ollama.ai) for the AI platform
- [Reqwest](https://github.com/seanmonstar/reqwest) for the HTTP client
- [Tokio](https://tokio.rs) for the async runtime
- [Serde](https://serde.rs) for serialization

(back to top)

[Rust-shield]: https://img.shields.io/badge/Rust-000000?style=for-the-badge&logo=rust&logoColor=white
[Rust-url]: https://www.rust-lang.org/
[Reqwest-shield]: https://img.shields.io/badge/Reqwest-000000?style=for-the-badge&logo=reqwest&logoColor=white
[Reqwest-url]: https://docs.rs/reqwest/latest/reqwest/
[Tokio-shield]: https://img.shields.io/badge/Tokio-000000?style=for-the-badge&logo=tokio&logoColor=white
[Tokio-url]: https://tokio.rs/
[Serde-shield]: https://img.shields.io/badge/Serde-000000?style=for-the-badge&logo=serde&logoColor=white
[Serde-url]: https://serde.rs/