{"id":15101850,"url":"https://github.com/ginger-code/llamas","last_synced_at":"2026-02-23T04:31:54.301Z","repository":{"id":246767595,"uuid":"819622147","full_name":"ginger-code/Llamas","owner":"ginger-code","description":"Use Ollama from .NET 8 with a handwritten client, and even automatically spin up docker Ollama containers from code","archived":false,"fork":false,"pushed_at":"2024-07-13T23:49:00.000Z","size":182,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-02-25T01:05:52.070Z","etag":null,"topics":["ai","csharp","dotnet","generative","library","llm","nuget","ollama"],"latest_commit_sha":null,"homepage":"","language":"C#","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ginger-code.png","metadata":{"files":{"readme":"README.md","changelog":"ChangeNotes","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-06-24T22:09:55.000Z","updated_at":"2024-09-29T08:29:07.000Z","dependencies_parsed_at":null,"dependency_job_id":"eacb7253-bf67-46da-b8f3-aa1a59bac6bf","html_url":"https://github.com/ginger-code/Llamas","commit_stats":{"total_commits":7,"total_committers":2,"mean_commits":3.5,"dds":0.2857142857142857,"last_synced_commit":"cfc7bc2bf4b327b47af7c3148c778e1712a6fdf9"},"previous_names":["ginger-code/llamas"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ginger-code%2FLlamas","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ginger-code%2FLlamas/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/Gi
tHub/repositories/ginger-code%2FLlamas/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ginger-code%2FLlamas/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ginger-code","download_url":"https://codeload.github.com/ginger-code/Llamas/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":241970989,"owners_count":20050753,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","csharp","dotnet","generative","library","llm","nuget","ollama"],"created_at":"2024-09-25T18:41:26.530Z","updated_at":"2025-10-20T11:56:10.655Z","avatar_url":"https://github.com/ginger-code.png","language":"C#","readme":"# Llamas\n\n---\n\n![Llamas NuGet Version](https://img.shields.io/nuget/v/Llamas?style=for-the-badge\u0026logo=nuget\u0026label=Llamas)\n![Llamas.Abstractions NuGet Version](https://img.shields.io/nuget/v/Llamas.Abstractions?style=for-the-badge\u0026logo=nuget\u0026label=Llamas.Abstractions)\n![Llamas.Container NuGet Version](https://img.shields.io/nuget/v/Llamas.Container?style=for-the-badge\u0026logo=nuget\u0026label=Llamas.Container)\n\n## Table of Contents\n\n- [About](#about)\n- [Usage](#usage)\n- [Dependency Injection](#dependency-injection)\n- [Testing](#testing)\n    - [Unit Tests](#unit-tests)\n    - [Integration Tests](#integration-tests)\n- [`Llamas.Abstractions`](#llamasabstractions)\n- [`Llamas.Container`](#llamascontainer)\n    - [Container Dependency Injection](#container-dependency-injection)\n\n## About\n\n`Llamas` is a .NET client library for 
[Ollama](https://github.com/ollama/ollama), enabling .NET developers to interact\nwith and leverage large language models.\nIf using the [Llamas.Container](#llamascontainer) package, developers can also host pre-configured instances of Ollama\nin Docker from their own .NET code either directly or using the simple DI patterns they are accustomed to, with no\nconfiguration knowledge needed.\n\n`Llamas` is a handwritten client library focused on ergonomics and performance, taking full advantage\nof `IAsyncEnumerable` and `ndjson` to handle and propagate live-streaming data.\nThis client handles the functionality exposed by\nthe [Ollama API](https://github.com/ollama/ollama/blob/main/docs/api.md) and therefore requires an instance of Ollama to\nbe accessible over the local network, or hosted using the `Llamas.Container` package.\n\n## Usage\n\nThe `IOllamaClient` interface describes the functionality of the Ollama client, such as listing models installed\nlocally, pulling new models, generating chat completions, generating embeddings, pushing models, and retrieving details\nabout models.\n`IOllamaBlobClient` contains definitions for blob functionality, including checking for the existence of\na data blob and creating one.\n\nExamples of client use can be found both in the `examples` folder and in the integration test suite.\n\n## Dependency Injection\n\n`Llamas` comes with several ways to set up a client using the .NET hosting abstractions.\n\nOne can inject a client configuration and the client explicitly, or use one of the helper extension methods\non `IServiceCollection`.\n\n```csharp\nservices.AddHttpClient(); // IHttpClientFactory and HttpClient can both be injected. 
Otherwise, a new HttpClient will be created\n\n#region Manual Addition\n\n// Add the services manually\nvar clientConfig = new OllamaClientConfiguration();\nservices.AddSingleton(clientConfig);\nservices.AddSingleton\u003cIOllamaClient, OllamaClient\u003e();\n#endregion\n\n#region From Configuration\n\n// Automatically inject the configuration and a client\nservices.AddOllamaClient(new OllamaClientConfiguration());\n\n#endregion\n\n#region With Configuration Builder\n\n// Use the lambda parameter to change the default configuration values\nservices.AddOllamaClient(config =\u003e config with { Port = 8082 });\n\n#endregion\n```\n\n## Testing\n\n### Unit Tests\n\nUnit tests are defined for any functionality that is atomic and testable without mocking a server connection.\nIn practice, this applies to custom serialization, stream hashing, etc.\n\n### Integration Tests\n\nIntegration tests are defined for all core client functionality and are supported by a hosted instance of Ollama\nusing `Llamas.Container`. These tests are ordered to ensure stateful changes are accounted for.\n\n*Warning: Running integration tests will execute an LLM on your graphics device. It is not recommended to run these\ntests on a machine with integrated graphics or on battery power. The model used is small (\u003c1GB) but will still\nheat up your PC.*\n\n## Llamas.Abstractions\n\n`Llamas.Abstractions` contains interfaces for the `Llamas` Ollama client and blob support, as well as the exported types\nneeded for parameters and results.\nThis assembly is provided separately to support integration scenarios such as DI and client-side (e.g. 
Blazor) references.\n\n## Llamas.Container\n\n`Llamas.Container` is a library providing Ollama self-hosting capabilities to .NET applications.\nThis assembly provides the logic needed to automatically hook into a local Docker service, pull the Ollama container,\nconfigure the necessary devices, and run it transparently.\n\nSupport for persistence and further configuration is planned.\n\n### Container Dependency Injection\n\nLike `Llamas`, `Llamas.Container` extends `IServiceCollection` with methods for easy injection.\nThese allow for hosting with or without a client, and can be injected using the same configuration as the client for\nsimplicity.\n\n```csharp\n// Add a container based on the client configuration\nvar clientConfig = new OllamaClientConfiguration();\nservices.AddOllamaClient(clientConfig);\nservices.AddOllamaContainerService();\n```","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fginger-code%2Fllamas","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fginger-code%2Fllamas","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fginger-code%2Fllamas/lists"}