Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lunary-ai/abso
TypeScript SDK to call 100+ LLM Providers in OpenAI format.
- Host: GitHub
- URL: https://github.com/lunary-ai/abso
- Owner: lunary-ai
- License: mit
- Created: 2025-02-08T12:27:17.000Z (13 days ago)
- Default Branch: main
- Last Pushed: 2025-02-15T16:49:30.000Z (6 days ago)
- Last Synced: 2025-02-15T16:56:10.216Z (6 days ago)
- Topics: ai, anthropic, azure-openai, bedrock, gemini, grok, groq, groq-ai, llm, mistral, ollama, openai, openrouter, voyage, xai
- Language: TypeScript
- Homepage: https://abso.ai
- Size: 147 KB
- Stars: 14
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- trackawesomelist - lunary-ai/abso (⭐16)
README
**TypeScript LLM client**
**Abso** provides a unified interface for calling various LLMs while maintaining full type safety.
## Features
- **OpenAI-compatible API 🔁**
- **Lightweight & Fast ⚡**
- **Embeddings support 🧮**
- **Unified tool calling 🛠️**
- **Tokenizer and cost calculation (soon) 🔢** for accurate token counting and cost estimation
- **Smart routing (soon)** to the best model for your request

## Providers
| Provider | Chat | Streaming | Tool Calling | Embeddings | Tokenizer | Cost Calculation |
| ---------- | ---- | --------- | ------------ | ---------- | --------- | ---------------- |
| OpenAI | ✅ | ✅ | ✅ | ✅ | 🚧 | 🚧 |
| Anthropic | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| xAI Grok | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| Mistral | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| Groq | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| Ollama | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| OpenRouter | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| Voyage | ❌ | ❌ | ❌ | ✅ | ❌ | ❌ |
| Azure | 🚧 | 🚧 | 🚧 | 🚧 | ❌ | 🚧 |
| Bedrock | 🚧 | 🚧 | 🚧 | 🚧 | ❌ | 🚧 |
| Gemini | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ |
| DeepSeek | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ |

## Installation
```bash
npm install abso-ai
```

## Usage
```ts
import { abso } from "abso-ai"

const result = await abso.chat.create({
messages: [{ role: "user", content: "Say this is a test" }],
model: "gpt-4o",
})

console.log(result.choices[0].message.content)
```

## Manually selecting a provider
Abso tries to infer the best provider for a given model, but you can also manually select a provider.
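Inference of this kind can be pictured as a simple prefix match on the model string. The sketch below is purely illustrative (it is not Abso's actual routing logic), using a hypothetical `inferProvider` helper:

```ts
// Hypothetical sketch of inferring a provider from a model string.
// Not Abso's real implementation — just the general idea.
function inferProvider(model: string): string {
  // An explicit "provider/model" prefix wins, e.g. "openrouter/gpt-4o"
  const slash = model.indexOf("/")
  if (slash !== -1) return model.slice(0, slash)

  // Otherwise, match well-known model name prefixes
  if (model.startsWith("gpt-")) return "openai"
  if (model.startsWith("claude-")) return "anthropic"
  if (model.startsWith("grok-")) return "xai"
  if (model.startsWith("mistral-")) return "mistral"
  return "openai" // hypothetical fallback
}

console.log(inferProvider("openrouter/gpt-4o")) // → "openrouter"
```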
```ts
const result = await abso.chat.create({
messages: [{ role: "user", content: "Say this is a test" }],
model: "openai/gpt-4o",
provider: "openrouter",
})

console.log(result.choices[0].message.content)
```

## Streaming
```ts
const stream = await abso.chat.stream({
messages: [{ role: "user", content: "Say this is a test" }],
model: "gpt-4o",
})

for await (const chunk of stream) {
console.log(chunk)
}

// Helper to get the final result
const fullResult = await stream.finalChatCompletion()
console.log(fullResult)
```

## Embeddings
```ts
const embeddings = await abso.embed({
model: "text-embedding-3-small",
input: ["A cat was playing with a ball on the floor"],
})

console.log(embeddings.data[0].embedding)
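// A common next step (illustrative sketch, not part of Abso's API):
// comparing two embedding vectors by cosine similarity.
function cosine(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}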
```

## Tokenizers (soon)
```ts
const tokens = await abso.tokenize({
messages: [{ role: "user", content: "Hello, world!" }],
model: "gpt-4o",
})

console.log(`${tokens.count} tokens`)
```

## Custom Providers
```ts
import { Abso } from "abso-ai"
import { MyCustomProvider } from "./myCustomProvider"

const abso = new Abso([])
abso.registerProvider(new MyCustomProvider(/* config */))

const result = await abso.chat.create({
model: "my-custom-model",
messages: [{ role: "user", content: "Hello!" }],
})
```

## Observability
You can use Abso with [Lunary](https://lunary.ai) to get observability into your LLM usage.
First, sign up to [Lunary](https://lunary.ai) and get your public key.
Then set the `LUNARY_PUBLIC_KEY` environment variable to your public key to enable observability.
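For example, the variable can be exported before starting your app (the value below is a placeholder, not a real key):

```bash
# Replace with your actual Lunary public key
export LUNARY_PUBLIC_KEY="your-public-key"
```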
## Ollama
```ts
import { abso } from "abso-ai"

const result = await abso.chat.create({
messages: [{ role: "user", content: "Hi, what's up?" }],
model: "llama3.2",
provider: "ollama",
})

console.log(result.choices[0].message.content)
```

## Contributing
See our [Contributing Guide](CONTRIBUTING.md).
## Roadmap
- [ ] More providers
- [ ] Built-in caching
- [ ] Tokenizers
- [ ] Cost calculation
- [ ] Smart routing