https://github.com/lunary-ai/abso
TypeScript SDK to call 100+ LLM Providers in OpenAI format.
- Host: GitHub
- URL: https://github.com/lunary-ai/abso
- Owner: lunary-ai
- License: MIT
- Created: 2025-02-08T12:27:17.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2025-02-15T16:49:30.000Z (10 months ago)
- Last Synced: 2025-02-15T16:56:10.216Z (10 months ago)
- Topics: ai, anthropic, azure-openai, bedrock, gemini, grok, groq, groq-ai, llm, mistral, ollama, openai, openrouter, voyage, xai
- Language: TypeScript
- Homepage: https://abso.ai
- Size: 147 KB
- Stars: 14
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
**Drop-in replacement for OpenAI**
**Abso** provides a unified interface for calling various LLMs while maintaining full type safety.
## Features
- **OpenAI-compatible API 🔁** (drop-in replacement)
- **Call any LLM provider** (OpenAI, Anthropic, Groq, Ollama, etc.)
- **Lightweight & Fast ⚡**
- **Embeddings support 🧮**
- **Unified tool calling 🛠️**
- **Tokenizer and cost calculation (soon) 🔢**
- **Smart routing (soon)**
## Providers
| Provider | Chat | Streaming | Tool Calling | Embeddings | Tokenizer | Cost Calculation |
| ---------- | ---- | --------- | ------------ | ---------- | --------- | ---------------- |
| OpenAI | ✅ | ✅ | ✅ | ✅ | 🚧 | 🚧 |
| Anthropic | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| xAI Grok | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| Mistral | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| Groq | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| Ollama | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| OpenRouter | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| Voyage | ❌ | ❌ | ❌ | ✅ | ❌ | ❌ |
| Azure | 🚧 | 🚧 | 🚧 | 🚧 | ❌ | 🚧 |
| Bedrock | 🚧 | 🚧 | 🚧 | 🚧 | ❌ | 🚧 |
| Gemini | ✅ | ✅ | ✅ | ❌ | 🚧 | ❌ |
| DeepSeek | ✅ | ✅ | ✅ | ❌ | 🚧 | ❌ |
| Perplexity | ✅ | ✅ | ❌ | ❌ | 🚧 | ❌ |
## Installation
```bash
npm install abso-ai
```
## Usage
```ts
import { abso } from "abso-ai"

const result = await abso.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
})

console.log(result.choices[0].message.content)
```
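The feature list mentions unified tool calling but the README doesn't show an example. Since Abso follows the OpenAI request format, tools are presumably declared in the OpenAI function-calling schema; the sketch below pairs such a declaration with a small local dispatcher. The `get_weather` tool and the dispatcher are illustrative, not part of Abso's API:

```ts
// Tools declared in the OpenAI function-calling format (illustrative).
const tools = [
  {
    type: "function" as const,
    function: {
      name: "get_weather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
]

// Local implementations keyed by tool name (illustrative stubs).
const toolImpls: Record<string, (args: any) => string> = {
  get_weather: ({ city }) => `Sunny in ${city}`,
}

// Execute a tool call as it would come back from the model:
// the arguments arrive as a JSON string and must be parsed.
function runToolCall(call: { function: { name: string; arguments: string } }) {
  const impl = toolImpls[call.function.name]
  return impl(JSON.parse(call.function.arguments))
}
```

You would pass `tools` alongside `messages` in the completion request and feed any returned tool calls through a dispatcher like `runToolCall`.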
## Manually selecting a provider
Abso tries to infer the correct provider for a given model, but you can also manually select a provider.
```ts
const result = await abso.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "openai/gpt-4o",
  provider: "openrouter",
})
console.log(result.choices[0].message.content)
```
## Streaming
```ts
const stream = await abso.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
  stream: true,
})

for await (const chunk of stream) {
  console.log(chunk)
}

// Helper to get the final aggregated result
const fullResult = await stream.finalChatCompletion()
console.log(fullResult)
```
## Embeddings
```ts
const embeddings = await abso.embeddings.create({
  model: "text-embedding-3-small",
  input: ["A cat was playing with a ball on the floor"],
})
console.log(embeddings.data[0].embedding)
```
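Once you have embedding vectors, a common next step (independent of Abso) is comparing them with cosine similarity. A minimal helper:

```ts
// Cosine similarity between two equal-length embedding vectors:
// dot(a, b) / (|a| * |b|), ranging from -1 to 1.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}
```

Scores close to 1 indicate semantically similar inputs; this is the usual building block for semantic search over embeddings.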
## Tokenizers (soon)
```ts
const tokens = await abso.chat.tokenize({
  messages: [{ role: "user", content: "Hello, world!" }],
  model: "gpt-4o",
})
console.log(`${tokens.count} tokens`)
```
## Custom Providers
You can also configure built-in providers directly by passing a configuration object with provider names as keys when instantiating Abso:
```ts
import { Abso } from "abso-ai"
const abso = new Abso({
  openai: { apiKey: "your-openai-key" },
  anthropic: { apiKey: "your-anthropic-key" },
  // add other providers as needed
})

const result = await abso.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
})
console.log(result.choices[0].message.content)
```
Alternatively, you can change which providers are loaded by passing a custom `providers` array to the constructor.
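The README doesn't document the provider interface, so purely as an illustration, an entry in that array might carry a name, a model-matching predicate, and a chat handler. Every field name below is an assumption for the sketch, not Abso's actual API:

```ts
// Hypothetical custom provider shape -- field names are assumptions,
// not Abso's documented interface. Check the source for the real contract.
const myProvider = {
  name: "my-provider",
  // Claim models under a custom prefix.
  matchesModel: (model: string) => model.startsWith("my/"),
  // Return a response in the OpenAI chat-completion shape.
  chat: async (request: { model: string; messages: unknown[] }) => ({
    choices: [{ message: { role: "assistant", content: "stub" } }],
  }),
}
```

The idea is that routing stays automatic: a request for a model matching the predicate would be dispatched to this provider's handler.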
## Observability
You can use Abso with [Lunary](https://lunary.ai) to get instant observability into your LLM usage.
First, sign up to [Lunary](https://lunary.ai) and get your public key.
Then set the `LUNARY_PUBLIC_KEY` environment variable to your public key to enable observability.
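For example, in your deployment environment (the key value below is a placeholder):

```bash
# Expose your Lunary public key to the process running Abso;
# the SDK reads this variable to enable observability.
export LUNARY_PUBLIC_KEY="pk-lunary-your-public-key"
```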
## Contributing
See our [Contributing Guide](CONTRIBUTING.md).
## Roadmap
- [ ] More providers
- [ ] Built-in caching
- [ ] Tokenizers
- [ ] Cost calculation
- [ ] Smart routing