https://github.com/awaescher/ollamasharp
The easiest way to use the Ollama API in .NET
- Host: GitHub
- URL: https://github.com/awaescher/ollamasharp
- Owner: awaescher
- License: MIT
- Created: 2023-10-15T19:36:33.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-15T19:45:44.000Z (11 days ago)
- Last Synced: 2025-01-18T00:00:48.384Z (9 days ago)
- Topics: ai, gpt, ichatclient, llama, llamacpp, llm, localllama, microsoft-extensions-ai, ollama, ollama-api, streaming
- Language: C#
- Homepage: https://www.nuget.org/packages/OllamaSharp
- Size: 26.4 MB
- Stars: 703
- Watchers: 17
- Forks: 91
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
# OllamaSharp 🦙
OllamaSharp provides .NET bindings for the [Ollama API](https://github.com/jmorganca/ollama/blob/main/docs/api.md), simplifying interactions with Ollama both locally and remotely.
✅ Supporting [Microsoft.Extensions.AI](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/) and [Microsoft Semantic Kernel](https://github.com/microsoft/semantic-kernel/pull/7362)
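OllamaSharp ships as a [NuGet package](https://www.nuget.org/packages/OllamaSharp), so adding it to a project is typically a one-liner (this assumes the .NET SDK is installed and a project already exists):

```shell
dotnet add package OllamaSharp
```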
## Features
- Ease of use: Interact with Ollama in just a few lines of code.
- API endpoint coverage: Support for all the Ollama API endpoints, including chats, embeddings, listing models, pulling and creating new models, and more.
- Real-time streaming: Stream responses directly to your application.
- Progress reporting: Get real-time progress feedback on tasks like model pulling.
- Support for [vision models](https://ollama.com/blog/vision-models) and [tools (function calling)](https://ollama.com/blog/tool-support).

## Usage
OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.
The following list shows a few simple code examples.
ℹ **Try our full featured [demo application](./demo) that's included in this repository**
### Initializing
```csharp
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";
```

### Listing all models that are available locally
```csharp
var models = await ollama.ListLocalModelsAsync();
```

### Pulling a model and reporting progress
```csharp
await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
Console.WriteLine($"{status.Percent}% {status.Status}");
```

### Generating a completion directly into the console
```csharp
await foreach (var stream in ollama.GenerateAsync("How are you today?"))
Console.Write(stream.Response);
```

### Building interactive chats
```csharp
var chat = new Chat(ollama);
while (true)
{
var message = Console.ReadLine();
await foreach (var answerToken in chat.SendAsync(message))
Console.Write(answerToken);
}
// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property
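// tools (function calling, listed under Features above) can also take part
// in a chat; the exact signature varies between OllamaSharp versions, so
// treat the following as a hypothetical sketch and check the demo app for
// the current API:
//   await foreach (var answerToken in chat.SendAsync(message, tools))
//       Console.Write(answerToken);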
```

## Usage with Microsoft.Extensions.AI
Microsoft built an abstraction library to streamline the usage of different AI providers. This is a really interesting concept if you plan to build apps that might use different providers, like ChatGPT, Claude and local models with Ollama.
I encourage you to read their announcement [Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/).
OllamaSharp was the first full implementation of their `IChatClient` and `IEmbeddingGenerator` interfaces, making it possible to use Ollama just like any other chat provider.
To do this, simply use the `OllamaApiClient` as `IChatClient` instead of `IOllamaApiClient`.
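As a minimal sketch of that idea (assuming a local Ollama server with the `llama3.1:8b` model pulled; the `IChatClient` helper method names have shifted between Microsoft.Extensions.AI preview releases, so verify against the version you reference):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// OllamaApiClient doubles as Microsoft.Extensions.AI's IChatClient,
// so the code below only depends on the abstraction
IChatClient client = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1:8b");

var response = await client.GetResponseAsync("Why is the sky blue?");
Console.WriteLine(response.Text);
```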
```csharp
// install package Microsoft.Extensions.AI.Abstractions

private static IChatClient CreateChatClient(Arguments arguments)
{
if (arguments.Provider.Equals("ollama", StringComparison.OrdinalIgnoreCase))
return new OllamaApiClient(arguments.Uri, arguments.Model);
else
return new OpenAIChatClient(new OpenAI.OpenAIClient(arguments.ApiKey), arguments.Model); // ChatGPT or compatible
}
```

> [!NOTE]
> `IOllamaApiClient` provides many Ollama-specific methods that `IChatClient` and `IEmbeddingGenerator` lack. Because these are abstractions, `IChatClient` and `IEmbeddingGenerator` will never cover the full Ollama API specification. However, `OllamaApiClient` implements three interfaces: the native `IOllamaApiClient` plus Microsoft's `IChatClient` and `IEmbeddingGenerator`, which allows you to cast it to either abstraction whenever you need it.

## Credits
The icon and name were reused from the amazing [Ollama project](https://github.com/jmorganca/ollama).
**I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost [mili-tan](https://github.com/mili-tan), who always keeps OllamaSharp in sync with the Ollama API. ❤**