{"id":23914246,"url":"https://github.com/spongeengine/gpt4allsharp","last_synced_at":"2025-07-30T06:11:43.023Z","repository":{"id":270744015,"uuid":"910826637","full_name":"SpongeEngine/GPT4AllSharp","owner":"SpongeEngine","description":"C# client for interacting with GPT4All through its OpenAI-compatible endpoints.","archived":false,"fork":false,"pushed_at":"2025-01-14T18:17:24.000Z","size":131,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-01-20T21:06:31.347Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"C#","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/SpongeEngine.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2025-01-01T14:53:57.000Z","updated_at":"2025-01-14T18:17:27.000Z","dependencies_parsed_at":"2025-01-11T11:31:28.441Z","dependency_job_id":null,"html_url":"https://github.com/SpongeEngine/GPT4AllSharp","commit_stats":null,"previous_names":["spongeengine/gpt4allsharp"],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SpongeEngine%2FGPT4AllSharp","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SpongeEngine%2FGPT4AllSharp/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SpongeEngine%2FGPT4AllSharp/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SpongeEngine%2FGPT4AllSharp/manifests","owner_url":"https://repos.ecosyste.ms/api/
v1/hosts/GitHub/owners/SpongeEngine","download_url":"https://codeload.github.com/SpongeEngine/GPT4AllSharp/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":240364295,"owners_count":19789756,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-01-05T10:11:43.347Z","updated_at":"2025-02-23T18:41:10.509Z","avatar_url":"https://github.com/SpongeEngine.png","language":"C#","readme":"# GPT4AllSharp\n[![NuGet](https://img.shields.io/nuget/v/SpongeEngine.GPT4AllSharp.svg)](https://www.nuget.org/packages/SpongeEngine.GPT4AllSharp)\n[![NuGet Downloads](https://img.shields.io/nuget/dt/SpongeEngine.GPT4AllSharp.svg)](https://www.nuget.org/packages/SpongeEngine.GPT4AllSharp)\n[![Tests](https://github.com/SpongeEngine/GPT4AllSharp/actions/workflows/test.yml/badge.svg)](https://github.com/SpongeEngine/GPT4AllSharp/actions/workflows/test.yml)\n[![License](https://img.shields.io/github/license/SpongeEngine/GPT4AllSharp)](LICENSE)\n[![.NET](https://img.shields.io/badge/.NET-6.0%20%7C%207.0%20%7C%208.0%2B-512BD4)](https://dotnet.microsoft.com/download)\n\nC# client for interacting with GPT4All through its OpenAI-compatible endpoints.\n\n## Features\n- Complete support for GPT4All's native API\n- OpenAI-compatible API endpoint support\n- Streaming text generation\n- Comprehensive configuration options\n- Built-in error handling and logging\n- Cross-platform compatibility\n- Full async/await support\n\n📦 [View Package on NuGet](https://www.nuget.org/packages/SpongeEngine.GPT4AllSharp)\n\n## Installation\nInstall via 
NuGet:\n```bash\ndotnet add package SpongeEngine.GPT4AllSharp\n```\n\n## Quick Start\n\n### Using Native API\n```csharp\nusing LocalAI.NET.KoboldCpp.Client;\nusing LocalAI.NET.KoboldCpp.Models;\n\n// Configure the client\nvar options = new KoboldCppOptions\n{\n    BaseUrl = \"http://localhost:5001\",\n    UseGpu = true,\n    ContextSize = 2048\n};\n\n// Create client instance\nusing var client = new KoboldCppClient(options);\n\n// Generate completion\nvar request = new KoboldCppRequest\n{\n    Prompt = \"Write a short story about a robot:\",\n    MaxLength = 200,\n    Temperature = 0.7f,\n    TopP = 0.9f\n};\n\nvar response = await client.GenerateAsync(request);\nConsole.WriteLine(response.Results[0].Text);\n\n// Stream completion\nawait foreach (var token in client.GenerateStreamAsync(request))\n{\n    Console.Write(token);\n}\n```\n\n### Using OpenAI-Compatible API\n```csharp\nvar options = new KoboldCppOptions\n{\n    BaseUrl = \"http://localhost:5001\",\n    UseOpenAiApi = true\n};\n\nusing var client = new KoboldCppClient(options);\n\n// Simple completion\nstring response = await client.CompleteAsync(\n    \"Write a short story about:\",\n    new CompletionOptions\n    {\n        MaxTokens = 200,\n        Temperature = 0.7f,\n        TopP = 0.9f\n    });\n\n// Stream completion\nawait foreach (var token in client.StreamCompletionAsync(\n    \"Once upon a time...\",\n    new CompletionOptions { MaxTokens = 200 }))\n{\n    Console.Write(token);\n}\n```\n\n## Configuration Options\n\n### Basic Options\n```csharp\nvar options = new KoboldCppOptions\n{\n    BaseUrl = \"http://localhost:5001\",    // KoboldCpp server URL\n    ApiKey = \"optional_api_key\",          // Optional API key\n    TimeoutSeconds = 600,                 // Request timeout\n    ContextSize = 2048,                   // Maximum context size\n    UseGpu = true,                        // Enable GPU acceleration\n    UseOpenAiApi = false                  // Use OpenAI-compatible API\n};\n```\n\n### 
Advanced Generation Parameters\n```csharp\nvar request = new KoboldCppRequest\n{\n    Prompt = \"Your prompt here\",\n    MaxLength = 200,                      // Maximum tokens to generate\n    MaxContextLength = 2048,              // Maximum context length\n    Temperature = 0.7f,                   // Randomness (0.0-1.0)\n    TopP = 0.9f,                          // Nucleus sampling threshold\n    TopK = 40,                            // Top-K sampling\n    TopA = 0.0f,                          // Top-A sampling\n    Typical = 1.0f,                       // Typical sampling\n    Tfs = 1.0f,                          // Tail-free sampling\n    RepetitionPenalty = 1.1f,            // Repetition penalty\n    RepetitionPenaltyRange = 64,         // Penalty range\n    StopSequences = new List\u003cstring\u003e { \"\\n\" },  // Stop sequences\n    Stream = false,                       // Enable streaming\n    TrimStop = true,                     // Trim stop sequences\n    MirostatMode = 0,                    // Mirostat sampling mode\n    MirostatTau = 5.0f,                  // Mirostat target entropy\n    MirostatEta = 0.1f                   // Mirostat learning rate\n};\n```\n\n## Error Handling\n```csharp\ntry\n{\n    var response = await client.GenerateAsync(request);\n}\ncatch (KoboldCppException ex)\n{\n    Console.WriteLine($\"KoboldCpp error: {ex.Message}\");\n    Console.WriteLine($\"Provider: {ex.Provider}\");\n    if (ex.StatusCode.HasValue)\n    {\n        Console.WriteLine($\"Status code: {ex.StatusCode}\");\n    }\n    if (ex.ResponseContent != null)\n    {\n        Console.WriteLine($\"Response content: {ex.ResponseContent}\");\n    }\n}\ncatch (Exception ex)\n{\n    Console.WriteLine($\"General error: {ex.Message}\");\n}\n```\n\n## Logging\nThe client supports Microsoft.Extensions.Logging:\n\n```csharp\nILogger logger = LoggerFactory\n    .Create(builder =\u003e builder\n        .AddConsole()\n        .SetMinimumLevel(LogLevel.Debug))\n    
.CreateLogger\u003cKoboldCppClient\u003e();\n\nvar client = new KoboldCppClient(options, logger);\n```\n\n## JSON Serialization\nCustom JSON settings can be provided:\n\n```csharp\nvar jsonSettings = new JsonSerializerSettings\n{\n    NullValueHandling = NullValueHandling.Ignore,\n    DefaultValueHandling = DefaultValueHandling.Ignore\n};\n\nvar client = new KoboldCppClient(options, jsonSettings: jsonSettings);\n```\n\n## Testing\nThe library includes both unit and integration tests. Integration tests require a running KoboldCpp server.\n\nTo run the tests:\n```bash\ndotnet test\n```\n\nTo configure the test environment:\n```csharp\n// Set environment variables for testing\nEnvironment.SetEnvironmentVariable(\"KOBOLDCPP_BASE_URL\", \"http://localhost:5001\");\nEnvironment.SetEnvironmentVariable(\"KOBOLDCPP_OPENAI_BASE_URL\", \"http://localhost:5001/v1\");\n```\n\n## License\nThis project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.\n\n## Contributing\nContributions are welcome! Please feel free to submit a Pull Request.\n\n## Support\nFor issues and feature requests, please use the [GitHub issues page](https://github.com/SpongeEngine/GPT4AllSharp/issues).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fspongeengine%2Fgpt4allsharp","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fspongeengine%2Fgpt4allsharp","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fspongeengine%2Fgpt4allsharp/lists"}