
# Celeste AI

Celeste Logo

**The primitive layer for multi-modal AI**

All modalities. All providers. One interface.

Primitives, not frameworks.

[![Python](https://img.shields.io/badge/Python-3.12+-blue?style=for-the-badge)](https://www.python.org/)
[![License](https://img.shields.io/badge/License-MIT-yellow?style=for-the-badge)](LICENSE)
[![PyPI](https://img.shields.io/pypi/v/celeste-ai?style=for-the-badge)](https://pypi.org/project/celeste-ai/)

Follow @withceleste on LinkedIn

[Quick Start](#-quick-start) • [Request Provider](https://github.com/withceleste/celeste-python/issues/new)

> 🚀 This is the v1 Beta release. We're validating the new architecture before the stable v1.0 release. Feedback welcome!

Type-safe, modality/provider-agnostic primitives.

- **Unified Interface:** One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
- **True Multi-Modal:** Text, Image, Audio, Video, Embeddings, Search — all first-class citizens.
- **Type-Safe by Design:** Full Pydantic validation and IDE autocomplete.
- **Zero Lock-In:** Switch providers instantly by changing a single config string.
- **Primitives, Not Frameworks:** No agents, no chains, no magic. Just clean I/O.
- **Lightweight Architecture:** No vendor SDKs. Pure, fast HTTP.
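
The "single config string" switch can be pictured as a model-to-provider routing table: every model ID resolves to exactly one backend, so callers only ever change the model string. A minimal sketch of the idea (hypothetical names, not Celeste's internals):

```python
# Hypothetical routing table illustrating "zero lock-in":
# each model ID maps to one provider backend.
MODEL_PROVIDERS = {
    "claude-opus-4-5": "anthropic",
    "gemini-2.5-flash": "google",
    "gpt-4o-mini": "openai",
}

def resolve_provider(model: str) -> str:
    """Return the provider backend that serves a given model ID."""
    try:
        return MODEL_PROVIDERS[model]
    except KeyError:
        raise ValueError(f"Unknown model: {model!r}") from None
```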

## 🚀 Quick Start
```python
import celeste

# One SDK. Every modality. Any provider.
text = await celeste.text.generate("Explain quantum computing", model="claude-opus-4-5")
image = await celeste.images.generate("A serene mountain lake at dawn", model="flux-2-pro")
speech = await celeste.audio.speak("Welcome to the future", model="eleven_v3")
video = await celeste.videos.analyze(video_file, prompt="Summarize this clip", model="gemini-3-pro")
embeddings = await celeste.text.embed(["lorem ipsum", "dolor sit amet"], model="gemini-embedding-001")
```

---

## 15+ providers. Zero lock-in.

Google
OpenAI
Mistral
Anthropic
Cohere
xAI
DeepSeek
Ollama
Groq
ElevenLabs
BytePlus
Black Forest Labs

**and many more**

**Missing a provider?** [Request it](https://github.com/withceleste/celeste-python/issues/new) – ⚡ **we ship fast**.

---

## Operations by Domain

| Action | Text | Images | Audio | Video |
| :--- | :---: | :---: | :---: | :---: |
| **Generate** | ✓ | ✓ | ○ | ✓ |
| **Edit** | — | ✓ | — | — |
| **Analyze** | — | ✓ | ✓ | ✓ |
| **Upscale** | — | ○ | — | ○ |
| **Speak** | — | — | ✓ | — |
| **Transcribe** | — | — | ✓ | — |
| **Embed** | ✓ | ○ | — | ○ |

✓ Available · ○ Planned

## 🔄 Switch providers in one line

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# Model IDs
anthropic_model_id = "claude-4-5-sonnet"
google_model_id = "gemini-2.5-flash"
```

```python
# ❌ Anthropic Way
from anthropic import Anthropic
import json

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    max_tokens=1024,  # required by the Messages API
    messages=[
        {"role": "user", "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema()
    }
)
user_data = json.loads(response.content[0].text)  # manual parse, no validation
```

```python
# ❌ Google Gemini Way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User
    )
)
user = response.parsed
```

```python
# ✅ Celeste Way
import celeste

response = await celeste.text.generate(
    "Extract user info: John is 30",
    model=google_model_id,  # <--- Choose any model from any provider
    output_schema=User,     # <--- Unified parameter working across all providers
)
user = response.content  # Already parsed as User instance
```
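
Under the hood, `output_schema=User` presumably reduces to the model's JSON Schema, which each provider consumes in its own dialect. You can inspect that schema with plain Pydantic, independent of Celeste:

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# The JSON Schema a structured-output API ultimately receives.
schema = User.model_json_schema()
print(schema["required"])           # ['name', 'age']
print(schema["properties"]["age"])  # {'title': 'Age', 'type': 'integer'}
```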

## βš™οΈ Advanced: Create Client
For explicit configuration or client reuse, use `create_client` with modality + operation. This is modality-first: you choose the output type and operation explicitly.

```python
from celeste import create_client, Modality, Operation, Provider

client = create_client(
    modality=Modality.TEXT,
    operation=Operation.GENERATE,
    provider=Provider.OLLAMA,
    model="llama3.2",
)
response = await client.generate("Extract user info: John is 30", output_schema=User)
```

> `capability` is still supported but deprecated. Prefer `modality` + `operation`.

---
## 🪶 Install
```bash
uv add celeste-ai
# or
pip install celeste-ai
```

---
## 🔧 Type-Safe by Design

```python
# Full IDE autocomplete
import celeste

response = await celeste.text.generate(
    "Explain AI",
    model="gpt-4o-mini",
    temperature=0.7,  # ✅ Validated (0.0-2.0)
    max_tokens=100,   # ✅ Validated (int)
)

# Typed response
print(response.content)             # str (IDE knows the type)
print(response.usage.input_tokens)  # int
print(response.metadata["model"])   # str
```

Catch errors **before** production.
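
The validation style is plain Pydantic, so out-of-range arguments fail at call time rather than at the provider. A minimal sketch of the mechanism (`GenerationConfig` is a hypothetical illustration, not Celeste's actual model):

```python
from pydantic import BaseModel, Field, ValidationError

# Hypothetical config model illustrating the validation style described above.
class GenerationConfig(BaseModel):
    temperature: float = Field(0.7, ge=0.0, le=2.0)
    max_tokens: int = Field(100, gt=0)

try:
    GenerationConfig(temperature=5.0)  # out of range: rejected before any HTTP call
except ValidationError as err:
    print(err)
```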

---

## 🤝 Contributing

We welcome contributions! See [CONTRIBUTING.md](CONTRIBUTING.md).

**Request a provider:** [GitHub Issues](https://github.com/withceleste/celeste-python/issues/new)
**Report bugs:** [GitHub Issues](https://github.com/withceleste/celeste-python/issues)

---

## 📄 License

MIT license – see [LICENSE](LICENSE) for details.

**[Get Started](https://withceleste.ai/docs/quickstart)** • **[Documentation](https://withceleste.ai/docs)** • **[GitHub](https://github.com/withceleste/celeste-python)**

Made with ❤️ by developers tired of framework lock-in