https://github.com/withceleste/celeste-python
Open source, type-safe primitives for multi-modal AI. All capabilities, all providers, one interface.
- Host: GitHub
- URL: https://github.com/withceleste/celeste-python
- Owner: withceleste
- License: mit
- Created: 2025-10-28T16:58:00.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2026-01-23T17:34:20.000Z (4 months ago)
- Last Synced: 2026-01-24T07:47:39.143Z (4 months ago)
- Topics: ai, async, developer-tools, multimodal, python, sdk, streaming, type-safe
- Language: Python
- Homepage: https://withceleste.ai/
- Size: 794 KB
- Stars: 134
- Watchers: 3
- Forks: 12
- Open Issues: 8
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG_V1.md
- License: LICENSE
# Celeste AI

**The primitive layer for multi-modal AI**
All modalities. All providers. One interface.
Primitives, not frameworks.
[Python](https://www.python.org/)
[License](LICENSE)
[PyPI](https://pypi.org/project/celeste-ai/)

[Quick Start](#-quick-start) • [Request Provider](https://github.com/withceleste/celeste-python/issues/new)
> This is the v1 Beta release. We're validating the new architecture before the stable v1.0 release. Feedback welcome!
Type-safe, modality- and provider-agnostic primitives.
- **Unified Interface:** One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
- **True Multi-Modal:** Text, Image, Audio, Video, Embeddings, Search – all first-class citizens.
- **Type-Safe by Design:** Full Pydantic validation and IDE autocomplete.
- **Zero Lock-In:** Switch providers instantly by changing a single config string.
- **Primitives, Not Frameworks:** No agents, no chains, no magic. Just clean I/O.
- **Lightweight Architecture:** No vendor SDKs. Pure, fast HTTP.
## 🚀 Quick Start
```python
import celeste
# One SDK. Every modality. Any provider.
# (run these awaits inside an async function / event loop)
text = await celeste.text.generate("Explain quantum computing", model="claude-opus-4-5")
image = await celeste.images.generate("A serene mountain lake at dawn", model="flux-2-pro")
speech = await celeste.audio.speak("Welcome to the future", model="eleven_v3")
video = await celeste.videos.analyze(video_file, prompt="Summarize this clip", model="gemini-3-pro")
embeddings = await celeste.text.embed(["lorem ipsum", "dolor sit amet"], model="gemini-embedding-001")
```
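Since every call above is `await`ed, the snippet needs an event loop to run. A minimal harness (the body of `main` is where the Celeste calls from the snippet would go):

```python
import asyncio

async def main() -> str:
    # Place the awaited Celeste calls here, e.g.:
    # text = await celeste.text.generate("Explain quantum computing", model="claude-opus-4-5")
    return "done"

result = asyncio.run(main())  # drives the coroutine to completion
```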
---
## 15+ providers. Zero lock-in.

**and many more**
**Missing a provider?** [Request it](https://github.com/withceleste/celeste-python/issues/new) · ⚡ **we ship fast**.
---
## Operations by Domain
| Action | Text | Images | Audio | Video |
| :--- | :---: | :---: | :---: | :---: |
| **Generate** | ✅ | ✅ | ✅ | ✅ |
| **Edit** | ✅ | ✅ | ⏳ | ⏳ |
| **Analyze** | ✅ | ✅ | ✅ | ✅ |
| **Upscale** | ⏳ | ✅ | ⏳ | ⏳ |
| **Speak** | ⏳ | ⏳ | ✅ | ⏳ |
| **Transcribe** | ⏳ | ⏳ | ✅ | ✅ |
| **Embed** | ✅ | ⏳ | ⏳ | ⏳ |

✅ Available · ⏳ Planned
## Switch providers in one line
```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# Model IDs
anthropic_model_id = "claude-4-5-sonnet"
google_model_id = "gemini-2.5-flash"
```
```python
# ❌ The Anthropic way
import json

from anthropic import Anthropic

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    max_tokens=1024,  # required by the Messages API
    messages=[
        {"role": "user", "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema(),
    },
)
user_data = json.loads(response.content[0].text)
```
```python
# ❌ The Google Gemini way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,  # defined above
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User,
    ),
)
user = response.parsed
```
```python
# ✅ The Celeste way
import celeste

response = await celeste.text.generate(
    "Extract user info: John is 30",
    model=google_model_id,  # <-- choose any model from any provider
    output_schema=User,     # <-- unified parameter working across all providers
)
user = response.content  # already parsed as a User instance
```
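Conceptually, the unified `output_schema` flow boils down to: ask the provider for JSON matching the schema, then parse the response into your model so `response.content` is already typed. A minimal stdlib sketch of that last step (the `parse_structured` helper and dataclass are illustrative, not Celeste's real internals, which use Pydantic):

```python
import json
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

def parse_structured(raw_json: str, schema: type):
    # The provider returns JSON text; the SDK parses it and
    # constructs the caller's model so the result is typed.
    data = json.loads(raw_json)
    return schema(**data)

user = parse_structured('{"name": "John", "age": 30}', User)
print(user.name, user.age)  # John 30
```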
## ⚙️ Advanced: Create Client
For explicit configuration or client reuse, use `create_client` with modality + operation. This is modality-first: you choose the output type and operation explicitly.
```python
from celeste import create_client, Modality, Operation, Provider

client = create_client(
    modality=Modality.TEXT,
    operation=Operation.GENERATE,
    provider=Provider.OLLAMA,
    model="llama3.2",
)
response = await client.generate("Extract user info: John is 30", output_schema=User)
```
> `capability` is still supported but deprecated. Prefer `modality` + `operation`.
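A factory like this is commonly backed by a registry keyed on `(modality, operation, provider)`. A hypothetical sketch of that dispatch pattern (class and enum names here are invented for illustration, not Celeste's actual implementation):

```python
from enum import Enum

class Modality(Enum):
    TEXT = "text"

class Operation(Enum):
    GENERATE = "generate"

class Provider(Enum):
    OLLAMA = "ollama"

_REGISTRY: dict = {}

def register(modality, operation, provider):
    """Class decorator recording a client under its (modality, operation, provider) key."""
    def decorator(cls):
        _REGISTRY[(modality, operation, provider)] = cls
        return cls
    return decorator

@register(Modality.TEXT, Operation.GENERATE, Provider.OLLAMA)
class OllamaTextGenerateClient:
    def __init__(self, model: str):
        self.model = model

def create_client(*, modality, operation, provider, model):
    # Look up the concrete client class for this combination and instantiate it.
    return _REGISTRY[(modality, operation, provider)](model)

client = create_client(
    modality=Modality.TEXT,
    operation=Operation.GENERATE,
    provider=Provider.OLLAMA,
    model="llama3.2",
)
```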
---
## 🪶 Install
```bash
uv add celeste-ai
# or
pip install celeste-ai
```
---
## Type-Safe by Design
```python
# Full IDE autocomplete
import celeste

response = await celeste.text.generate(
    "Explain AI",
    model="gpt-4o-mini",
    temperature=0.7,  # ✅ validated (0.0–2.0)
    max_tokens=100,   # ✅ validated (int)
)

# Typed response
print(response.content)             # str (IDE knows the type)
print(response.usage.input_tokens)  # int
print(response.metadata["model"])   # str
```
Catch errors **before** production.
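The kind of range check Pydantic performs can be sketched with a plain function (a hypothetical stdlib-only helper, not Celeste's API):

```python
def validate_temperature(value: float) -> float:
    # Reject out-of-range values locally, before any network request is made.
    if not 0.0 <= value <= 2.0:
        raise ValueError(f"temperature must be within [0.0, 2.0], got {value}")
    return value

validate_temperature(0.7)  # passes
try:
    validate_temperature(3.5)
except ValueError as err:
    print(err)  # caught at call time, not in production
```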
---
## 🤝 Contributing
We welcome contributions! See [CONTRIBUTING.md](CONTRIBUTING.md).
**Request a provider:** [GitHub Issues](https://github.com/withceleste/celeste-python/issues/new)
**Report bugs:** [GitHub Issues](https://github.com/withceleste/celeste-python/issues)
---
## License
MIT license – see [LICENSE](LICENSE) for details.
**[Get Started](https://withceleste.ai/docs/quickstart)** • **[Documentation](https://withceleste.ai/docs)** • **[GitHub](https://github.com/withceleste/celeste-python)**
Made with ❤️ by developers tired of framework lock-in