{"id":16971560,"url":"https://github.com/thomastjdev/nim_openai","last_synced_at":"2025-08-24T22:48:37.800Z","repository":{"id":87657481,"uuid":"580073685","full_name":"ThomasTJdev/nim_openai","owner":"ThomasTJdev","description":"API for openAI","archived":false,"fork":false,"pushed_at":"2023-07-08T05:29:31.000Z","size":11,"stargazers_count":2,"open_issues_count":0,"forks_count":0,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-03-21T20:16:11.815Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Nim","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ThomasTJdev.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-12-19T16:47:34.000Z","updated_at":"2023-05-27T23:48:12.000Z","dependencies_parsed_at":"2025-01-26T14:43:39.703Z","dependency_job_id":"6b299f93-8d2e-4040-8752-d8f209f7d0f4","html_url":"https://github.com/ThomasTJdev/nim_openai","commit_stats":null,"previous_names":[],"tags_count":4,"template":false,"template_full_name":null,"purl":"pkg:github/ThomasTJdev/nim_openai","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThomasTJdev%2Fnim_openai","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThomasTJdev%2Fnim_openai/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThomasTJdev%2Fnim_openai/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThomasTJdev%2Fnim_openai/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ThomasTJdev","download_url"
:"https://codeload.github.com/ThomasTJdev/nim_openai/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThomasTJdev%2Fnim_openai/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":264647386,"owners_count":23643623,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-10-14T00:52:32.954Z","updated_at":"2025-07-10T20:07:03.242Z","avatar_url":"https://github.com/ThomasTJdev.png","language":"Nim","readme":"# OpenAI API\n\n\nBasic API handling for OpenAI.\n\nUses the specification from: [https://beta.openai.com/docs/api-reference/introduction](https://beta.openai.com/docs/api-reference/introduction)\n\n\n## Changelog\n\n### v1.0.0\nAfter GPT-4 the API has breaking changes. This package still supports the legacy\ncalls, but the calling proc has changed.\n\n**Endpoint**\nThe default OpenAI endpoint changed to support GPT-4. All call procs now include an optional `openAIendpoint` parameter.\n\n```nim\nconst\n  urlCompletionLegacy = \"https://api.openai.com/v1/completions\"\n  urlCompletion = \"https://api.openai.com/v1/chat/completions\"\n```\n\n```nim\nproc aiPrompt*(apiKey, prompt: string, maxTokens = 30, openAIendpoint = urlCompletion): JsonNode =\n```\n\n**Prompt options**\nThe main `aiCreateRequest`, which formats the API call, has breaking changes. If\nyou still use \u003c GPT-4, you need to specify `openAIendpoint = urlCompletionLegacy`.\n\nIf you are using \u003e= GPT-4 you can rely on the default settings.\n\n\n\n\n## Authorization\n\nYou need an API key to use the API. 
The API key is passed in the request headers.\n\n```nim\nlet headers = aiHeaders(apiKey)\n# Authorization: Bearer sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\n```\n\n### Environment variables\n\nOpenAI is apparently in favor of saving keys in environment variables.\n\nIf that's also your cup of tea, you can use the following:\n\n```bash\n# On Linux:\nexport openAIKey=\"sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\"\n```\n\n```nim\nimport std/os\necho getEnv(\"openAIKey\")\n# sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\n```\n\n## Results\n\nAll results are returned as a `JsonNode` object. That requires you to import\nNim's standard json library.\n\n```nim\nimport std/json\n```\n\n\n## Basic request\n\n```nim\nimport\n  std/json,\n  openai\n\nlet resp = aiPrompt(apiKey, \"Why is Nim-lang the best programming language?\", maxTokens = 50)\n\necho resp[\"choices\"][0][\"message\"][\"content\"]\n\n# Legacy:\n# echo resp[\"choices\"][0][\"text\"]\n\n# Nim-lang is the best programming language because it is a powerful, statically typed, compiled language that is designed to be fast, efficient, and expressive. 
It has a simple syntax that is easy to learn and understand, and it is\n```\n\n## Custom request\n\n### Current\n\n```nim\nimport\n  std/json,\n  std/os,       # getEnv\n  std/unittest, # check\n  openai\n\nlet\n  envKey = getEnv(\"openAIKey\")\n  question = \"Why is Nim-lang the best programming language?\"\n  max_tokens = 100\n  n = 3       # number of choices to return\n\nlet req = aiCreateRequest(prompt = question, max_tokens = max_tokens, n = n)\ncheck req == \"\"\"{\"model\":\"gpt-4\",\"messages\":[{\"role\":\"system\",\"content\":\"Why is Nim-lang the best programming language?\"}],\"temperature\":0.0,\"top_p\":1,\"n\":3,\"max_tokens\":100,\"presence_penalty\":0,\"frequency_penalty\":0}\"\"\"\n\nlet resp = aiGetSync(envKey, req)\n\ncheck resp[\"choices\"].len() == 3\n\necho resp[\"choices\"][1][\"message\"][\"content\"]\n\n# Whether a language is \\\"the best\\\" is subjective and depends largely on the task at hand, personal preference, or specific project requirements. However, Nim-lang possesses some qualities that can make it stand out for certain situations:\\n\\n1. Efficiency: Nim compiles to C, C++, and JavaScript, offering efficient performance close to what you would get from these languages.\\n\\n2. Expressiveness: Nim allows programmers to write high-level code that is both human-understandable and machine-optimized. 
This balances readability\n```\n\n### Legacy\n\n```nim\nimport\n  std/json,\n  std/os,       # getEnv\n  std/unittest, # check\n  openai\n\nlet\n  envKey = getEnv(\"openAIKey\")\n  question = \"Why is Nim-lang the best programming language?\"\n  max_tokens = 100\n  n = 3       # number of choices to return\n  best_of = 5 # number of completions (must be higher than n)\n\nlet req = aiCreateRequest(envKey, prompt = question, max_tokens = max_tokens, n = n, best_of = best_of)\ncheck req == \"\"\"{\"model\":\"text-davinci-003\",\"prompt\":\"Why is Nim-lang the best programming language?\",\"temperature\":0,\"max_tokens\":100,\"top_p\":1,\"n\":3,\"presence_penalty\":0,\"frequency_penalty\":0,\"best_of\":5}\"\"\"\n\nlet resp = aiGetSync(envKey, req)\ncheck resp[\"choices\"].len() == 3\n\necho resp[\"choices\"][1][\"text\"]\n\n# Nim-lang is the best programming language because it is a powerful, statically typed, compiled language that is designed to be fast, efficient, and expressive. It has a simple syntax, a powerful macro system, and a modern type system. Nim-lang also has a great community of developers who are constantly working to improve the language and make it even better. 
Additionally, Nim-lang is open source and free to use, making it an attractive option for developers.\n```\n\n\n# Public procedures\n\n## aiHeaders\n\n```nim\nproc aiHeaders*(apiKey: string): HttpHeaders =\n```\n\nBuilds the request headers with `Authorization: Bearer` and the API key\n\n\n____\n\n## aiGetAsync*\n\n```nim\nproc aiGetAsync*(client: AsyncHttpClient, url: string, headers: HttpHeaders, body: string): Future[JsonNode] {.async.} =\n```\n\nAPI: Async with custom client\n\n\n____\n\n## aiGetAsync*\n\n```nim\nproc aiGetAsync*(apiKey, body: string, openAIendpoint = urlCompletion): Future[JsonNode] {.async.} =\n```\n\nAPI: Async with one-time client\n\n\n____\n\n## aiGetSync*\n\n```nim\nproc aiGetSync*(client: HttpClient, url: string, headers: HttpHeaders, body: string): JsonNode =\n```\n\nAPI: Sync with custom client\n\n\n____\n\n## aiGetSync*\n\n```nim\nproc aiGetSync*(apiKey, body: string, openAIendpoint = urlCompletion): JsonNode =\n```\n\nAPI: Sync with one-time client\n\n\n____\n\n## aiCreateRequestLegacy*\n\n```nim\nproc aiCreateRequestLegacy*(\n    prompt = \"What is nim-lang\",\n    model = \"gpt-4\",\n    role = \"system\",\n    temperature = 0.0,\n    maxTokens = 30,\n    top_p = 1,\n    n = 1,\n    stop = \"\",\n    presence_penalty = 0,\n    frequency_penalty = 0,\n    user = \"\"\n): string =\n```\n\nCreate the request body\n\n\n____\n\n## aiPrompt*\n\n```nim\nproc aiPrompt*(apiKey, prompt: string, maxTokens = 30, openAIendpoint = urlCompletion): JsonNode =\n```\n\nBasic prompt\n\n\n____\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fthomastjdev%2Fnim_openai","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fthomastjdev%2Fnim_openai","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fthomastjdev%2Fnim_openai/lists"}