# LMStudioSharp
[![NuGet](https://img.shields.io/nuget/v/SpongeEngine.LMStudioSharp.svg)](https://www.nuget.org/packages/SpongeEngine.LMStudioSharp)
[![NuGet Downloads](https://img.shields.io/nuget/dt/SpongeEngine.LMStudioSharp.svg)](https://www.nuget.org/packages/SpongeEngine.LMStudioSharp)
[![License](https://img.shields.io/github/license/SpongeEngine/LMStudioSharp)](LICENSE)
[![.NET](https://img.shields.io/badge/.NET-6.0%20%7C%207.0%20%7C%208.0%2B-512BD4)](https://dotnet.microsoft.com/download)

C# client for LM Studio.

## Features
- Complete support for LM Studio's native API
- Text completion and chat completion
- Streaming support for both completion types
- Text embeddings generation
- Model information retrieval
- Comprehensive configuration options
- Built-in error handling and logging
- Cross-platform compatibility
- Full async/await support

📦 [View Package on NuGet](https://www.nuget.org/packages/SpongeEngine.LMStudioSharp)

## Installation
Install via NuGet:
```bash
dotnet add package SpongeEngine.LMStudioSharp
```

## Quick Start

```csharp
using SpongeEngine.LMStudioSharp;
using SpongeEngine.LMStudioSharp.Models.Completion;
using SpongeEngine.LMStudioSharp.Models.Chat;

// Configure the client
var options = new LmStudioClientOptions
{
    HttpClient = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:1234")
    }
};

// Create client instance
using var client = new LmStudioSharpClient(options);

// List available models
var models = await client.ListModelsAsync();
var modelId = models.Data[0].Id;

// Text completion
var completionRequest = new CompletionRequest
{
    Model = modelId,
    Prompt = "Write a short story about a robot:",
    MaxTokens = 200,
    Temperature = 0.7f,
    TopP = 0.9f
};

var completionResponse = await client.CompleteAsync(completionRequest);
Console.WriteLine(completionResponse.Choices[0].GetText());

// Chat completion
var chatRequest = new ChatRequest
{
    Model = modelId,
    Messages = new List<ChatMessage>
    {
        new() { Role = "system", Content = "You are a helpful assistant." },
        new() { Role = "user", Content = "Tell me a joke about programming." }
    },
    Temperature = 0.7f
};

var chatResponse = await client.ChatCompleteAsync(chatRequest);
Console.WriteLine(chatResponse.Choices[0].GetText());

// Stream completion
await foreach (var token in client.StreamCompletionAsync(completionRequest))
{
    Console.Write(token);
}
```

## Configuration Options

### Client Options
```csharp
var options = new LmStudioClientOptions
{
    // HttpClient configured with the LM Studio server's base address
    HttpClient = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:1234")
    },
    JsonSerializerOptions = new JsonSerializerOptions(), // Optional JSON options
    Logger = loggerInstance                              // Optional ILogger instance
};
```

### Completion Request Parameters
```csharp
var request = new CompletionRequest
{
    Model = "model-id",
    Prompt = "Your prompt here",
    MaxTokens = 200,        // Maximum tokens to generate
    Temperature = 0.7f,     // Randomness (0.0-1.0)
    TopP = 0.9f,            // Nucleus sampling threshold
    Stop = new[] { "\n" },  // Stop sequences
    Stream = false          // Enable streaming
};
```

## Error Handling
```csharp
try
{
    var response = await client.CompleteAsync(request);
}
catch (LlmSharpException ex)
{
    Console.WriteLine($"LM Studio error: {ex.Message}");
    if (ex.StatusCode.HasValue)
    {
        Console.WriteLine($"Status code: {ex.StatusCode}");
    }
    Console.WriteLine($"Response content: {ex.ResponseContent}");
}
catch (Exception ex)
{
    Console.WriteLine($"General error: {ex.Message}");
}
```

## Logging
The client supports Microsoft.Extensions.Logging:

```csharp
var logger = LoggerFactory
    .Create(builder => builder
        .AddConsole()
        .SetMinimumLevel(LogLevel.Debug))
    .CreateLogger<LmStudioSharpClient>();

var options = new LmStudioClientOptions
{
    HttpClient = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:1234")
    },
    Logger = logger
};
var client = new LmStudioSharpClient(options);
```

## JSON Serialization
Custom JSON options can be provided:

```csharp
var jsonOptions = new JsonSerializerOptions
{
    DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
};

var options = new LmStudioClientOptions
{
    HttpClient = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:1234")
    },
    JsonSerializerOptions = jsonOptions
};
var client = new LmStudioSharpClient(options);
```

## Testing
The library includes both unit and integration tests; the integration tests require a running LM Studio server.

To run the tests:
```bash
dotnet test
```

To configure the test environment:
```bash
# Set environment variables for testing
export LMSTUDIO_BASE_URL="http://localhost:1234"
```

## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.

## Support
For issues and feature requests, please use the [GitHub issues page](https://github.com/SpongeEngine/LMStudioSharp/issues).
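## Dependency Injection
For applications using Microsoft.Extensions.DependencyInjection, the client can be registered once and shared, since `HttpClient` instances are intended to be long-lived. This is a minimal sketch, not part of the library itself: the `LmStudioSharpClient` and `LmStudioClientOptions` names are taken from the examples above, while the DI wiring is an assumption about how you might host it.

```csharp
using Microsoft.Extensions.DependencyInjection;
using SpongeEngine.LMStudioSharp;

var services = new ServiceCollection();

// Register a single shared client for the whole application.
// The HttpClient (and its BaseAddress) lives as long as the client does.
services.AddSingleton(sp => new LmStudioSharpClient(new LmStudioClientOptions
{
    HttpClient = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:1234")
    }
}));

await using var provider = services.BuildServiceProvider();
var client = provider.GetRequiredService<LmStudioSharpClient>();
```

Resolving the client from the container anywhere in the app then reuses the same underlying connection pool.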
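## Retrying Transient Failures
Building on the error-handling pattern above, a simple retry wrapper can smooth over transient server-side errors. The `LlmSharpException` type and its nullable `StatusCode` come from the example above; the `CompletionResponse` return type name and the retry policy itself are illustrative assumptions, not a documented part of the API.

```csharp
using SpongeEngine.LMStudioSharp;
using SpongeEngine.LMStudioSharp.Models.Completion;

// Retry CompleteAsync on server-side (5xx) errors with exponential backoff.
// NOTE: CompletionResponse is assumed as the return type of CompleteAsync.
static async Task<CompletionResponse> CompleteWithRetryAsync(
    LmStudioSharpClient client, CompletionRequest request, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return await client.CompleteAsync(request);
        }
        catch (LlmSharpException ex) when (
            attempt < maxAttempts && ex.StatusCode is >= 500)
        {
            // Back off 1s, 2s, 4s, ... before the next attempt.
            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
        }
    }
}
```

Client-side errors (4xx) are deliberately not retried, since a malformed request will fail the same way every time.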
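## Smoke-Testing the Server
Before running the integration tests, it can help to verify the server is actually reachable. LM Studio exposes an OpenAI-compatible REST API on the same port; the `/v1/models` path below follows the OpenAI convention, so adjust it if your LM Studio version differs.

```shell
# Quick smoke test: list the models loaded in a running LM Studio server.
# Reuses the LMSTUDIO_BASE_URL variable from the Testing section above,
# falling back to the default address.
curl "${LMSTUDIO_BASE_URL:-http://localhost:1234}/v1/models"
```

If this returns a JSON model list, the client examples above should work against the same address.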