Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pezzolabs/unillm
🦄 Consume any LLM from any provider, using the OpenAI API
- Host: GitHub
- URL: https://github.com/pezzolabs/unillm
- Owner: pezzolabs
- License: mit
- Created: 2023-10-10T21:58:03.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2023-10-23T01:21:13.000Z (about 1 year ago)
- Last Synced: 2024-10-06T10:07:16.529Z (about 1 month ago)
- Topics: ai, api, gpt-3, gpt-4, hacktoberfest, javascript, langchain, llm, llmops, monitoring, nodejs, observability, sdk, typescript
- Language: TypeScript
- Homepage: https://docs.unillm.ai
- Size: 4.23 MB
- Stars: 24
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
README
UniLLM allows you to call any LLM using the OpenAI API, with 100% type safety.
# Benefits
- ✨ Integrate with any provider and model using the OpenAI API
- 💬 Consistent chatCompletion responses and logs across all models and providers
- 💯 Type safety across all providers and models
- 🔁 Seamlessly switch between LLMs without rewriting your codebase
- ✅ If you write tests for your service, you only need to test it once
- 🔜 (Coming Soon) Request caching and rate limiting
- 🔜 (Coming Soon) Cost monitoring and alerting

# Usage
## [✨ Check our interactive documentation ✨](https://docs.unillm.ai)
## 💬 Chat Completions
With UniLLM, you can use chat completions even with providers/models that don't natively support them (e.g. Anthropic).
```bash
npm i unillm
```

```ts
import { UniLLM } from 'unillm';

const unillm = new UniLLM();

// OpenAI
const response = await unillm.createChatCompletion("openai/gpt-3.5-turbo", { messages: ... });
const response = await unillm.createChatCompletion("openai/gpt-4", { messages: ... });

// Anthropic
const response = await unillm.createChatCompletion("anthropic/claude-2", { messages: ... });
const response = await unillm.createChatCompletion("anthropic/claude-1-instant", { messages: ... });

// Azure OpenAI
const response = await unillm.createChatCompletion("azure/openai/", { messages: ... });

// More coming soon!
```

Want to see more examples? Check out the **[interactive docs](https://docs.unillm.ai)**.
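Because every provider is exposed through the OpenAI API shape, reading the result looks the same regardless of which model served it. Here is a minimal sketch, assuming the response mirrors the OpenAI chat completion object (the model name and messages below are illustrative, not prescribed by the docs):

```ts
import { UniLLM } from 'unillm';

const unillm = new UniLLM();

// Illustrative OpenAI-style messages array.
const response = await unillm.createChatCompletion("anthropic/claude-2", {
  messages: [{ role: "user", content: "Say hello in one word." }],
});

// Assumption: the response follows the OpenAI chat completion shape,
// so the generated text is at choices[0].message.content.
console.log(response.choices[0].message.content);
```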
## ⚡️ Streaming
To enable streaming, simply provide `stream: true` in the options object. Here is an example:
```ts
const response = await unillm.createChatCompletion("openai/gpt-3.5-turbo", {
messages: ...,
stream: true
});
```

Want to see more examples? Check out the **[interactive docs](https://docs.unillm.ai)**.
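The interactive docs cover consumption of the stream in detail. As a rough sketch, and assuming the streamed result is an async iterable of OpenAI-style chunks (an assumption, not a documented guarantee), reading the tokens might look like this:

```ts
import { UniLLM } from 'unillm';

const unillm = new UniLLM();

// Assumption: with stream: true, the result can be iterated as
// OpenAI-style chunks whose text is at choices[0].delta.content.
const stream = await unillm.createChatCompletion("openai/gpt-3.5-turbo", {
  messages: [{ role: "user", content: "Write a short haiku about APIs." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```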
# Contributing
We welcome contributions from the community! Please feel free to submit pull requests or create issues for bugs or feature suggestions.
If you want to contribute but aren't sure how, join our [Discord](https://discord.gg/XcEVPePwn2) and we'll be happy to help you out!
Please check out [CONTRIBUTING.md](CONTRIBUTING.md) before contributing.
# License
This repository's source code is available under the [MIT license](LICENSE).