{"id":15175626,"url":"https://github.com/pezzolabs/unillm","last_synced_at":"2025-10-06T10:32:09.862Z","repository":{"id":202458348,"uuid":"703245100","full_name":"pezzolabs/UniLLM","owner":"pezzolabs","description":"🦄 Consume any LLM from any provider, using the OpenAI API ","archived":false,"fork":false,"pushed_at":"2023-10-23T01:21:13.000Z","size":4431,"stargazers_count":25,"open_issues_count":0,"forks_count":2,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-01-24T23:58:12.172Z","etag":null,"topics":["ai","api","gpt-3","gpt-4","hacktoberfest","javascript","langchain","llm","llmops","monitoring","nodejs","observability","sdk","typescript"],"latest_commit_sha":null,"homepage":"https://docs.unillm.ai","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pezzolabs.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2023-10-10T21:58:03.000Z","updated_at":"2024-11-05T18:27:23.000Z","dependencies_parsed_at":"2023-11-04T21:31:40.687Z","dependency_job_id":null,"html_url":"https://github.com/pezzolabs/UniLLM","commit_stats":{"total_commits":39,"total_committers":3,"mean_commits":13.0,"dds":"0.10256410256410253","last_synced_commit":"b129da5c077000baeb2c66e3f3ceafdc9ae3f22a"},"previous_names":["pezzolabs/unillm"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pezzolabs%2FUniLLM","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pezzolabs%2FUniLLM/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pezzolabs%2FUniL
LM/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pezzolabs%2FUniLLM/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/pezzolabs","download_url":"https://codeload.github.com/pezzolabs/UniLLM/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":235519884,"owners_count":19003201,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","api","gpt-3","gpt-4","hacktoberfest","javascript","langchain","llm","llmops","monitoring","nodejs","observability","sdk","typescript"],"created_at":"2024-09-27T12:39:43.403Z","updated_at":"2025-10-06T10:32:09.150Z","avatar_url":"https://github.com/pezzolabs.png","language":"TypeScript","readme":"\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://docs.unillm.ai/\" target=\"_blank\"\u003e\n    \u003cimg src=\"https://cdn.pezzo.ai/unillm/logo-light-mode.svg\" alt=\"logo\" width=\"280\"\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cstrong\u003eUniLLM allows you to call any LLM using the OpenAI API, with 100% type safety.\u003c/strong\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"https://github.com/pezzolabs/unillm/actions/workflows/ci.yaml/badge.svg\" /\u003e\n\u003ca href=\"CODE_OF_CONDUCT.md\"\u003e\n  \u003cimg src=\"https://img.shields.io/badge/Contributor%20Covenant-v2.0%20adopted-ff69b4.svg\" alt=\"Contributor Covenant\"\u003e\n\u003c/a\u003e\n\u003ca href=\"https://opensource.org/licenses/MIT\"\u003e\n  \u003cimg 
src=\"https://img.shields.io/badge/License-MIT-blue.svg\" alt=\"License\"\u003e\n\u003c/a\u003e\n\u003ca href=\"https://www.npmjs.com/package/unillm\" target=\"_blank\"\u003e\n  \u003cimg src=\"https://img.shields.io/badge/npm-unillm-green\"\u003e\n\u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://cdn.pezzo.ai/unillm/animated-demo.gif\" width=\"540\" /\u003e\n\u003c/p\u003e\n\n# Benefits\n\n- ✨ Integrate with any provider and model using the OpenAI API\n- 💬 Consistent chatCompletion responses and logs across all models and providers\n- 💯 Type safety across all providers and models\n- 🔁 Seamlessly switch between LLMs without rewriting your codebase\n- ✅ If you write tests for your service, you only need to test it once\n- 🔜 (Coming Soon) Request caching and rate limiting\n- 🔜 (Coming Soon) Cost monitoring and alerting\n\n# Usage\n\n## [✨ Check our interactive documentation ✨](https://docs.unillm.ai)\n\n## 💬 Chat Completions\n\nWith UniLLM, you can use chat completions even with providers/models that don't natively support them (e.g. Anthropic).\n\n```bash\nnpm i unillm\n```\n\n```ts\nimport { UniLLM } from 'unillm';\n\nconst unillm = new UniLLM();\n\nlet response;\n\n// OpenAI\nresponse = await unillm.createChatCompletion(\"openai/gpt-3.5-turbo\", { messages: ... });\nresponse = await unillm.createChatCompletion(\"openai/gpt-4\", { messages: ... });\n\n// Anthropic\nresponse = await unillm.createChatCompletion(\"anthropic/claude-2\", { messages: ... });\nresponse = await unillm.createChatCompletion(\"anthropic/claude-1-instant\", { messages: ... });\n\n// Azure OpenAI\nresponse = await unillm.createChatCompletion(\"azure/openai/\u003cdeployment-name\u003e\", { messages: ... });\n\n// More coming soon!\n```\n\nWant to see more examples? Check out the **[interactive docs](https://docs.unillm.ai)**.\n\n## ⚡️ Streaming\n\nTo enable streaming, simply provide `stream: true` in the options object. 
Here is an example:\n\n```ts\nconst response = await unillm.createChatCompletion(\"openai/gpt-3.5-turbo\", {\n  messages: ...,\n  stream: true\n});\n```\n\nWant to see more examples? Check out the **[interactive docs](https://docs.unillm.ai)**.\n\n# Contributing\n\nWe welcome contributions from the community! Please feel free to submit pull requests or create issues for bugs or feature suggestions.\n\nIf you want to contribute but are not sure how, join our [Discord](https://discord.gg/XcEVPePwn2) and we'll be happy to help you out!\n\nPlease check out [CONTRIBUTING.md](CONTRIBUTING.md) before contributing.\n\n# License\n\nThis repository's source code is available under the [MIT license](LICENSE).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpezzolabs%2Funillm","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpezzolabs%2Funillm","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpezzolabs%2Funillm/lists"}