https://github.com/lzjever/lexilux
Unified LLM API client library for Python. Simple API for Chat, Embedding, Rerank, and Tokenizer. OpenAI-compatible with streaming support and unified usage tracking.
- Host: GitHub
- URL: https://github.com/lzjever/lexilux
- Owner: lzjever
- License: apache-2.0
- Created: 2025-12-29T05:30:20.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2026-01-24T19:01:10.000Z (about 1 month ago)
- Last Synced: 2026-01-24T21:35:17.506Z (about 1 month ago)
- Topics: api-client, chat-api, document-ranking, embedding, function-api, llm, openai-api, openai-compatible, python, rerank, reranker, semantic-search, streaming, tokenizer
- Language: Python
- Homepage: https://lexilux.readthedocs.io
- Size: 1.5 MB
- Stars: 41
- Watchers: 23
- Forks: 18
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
# Lexilux
[PyPI](https://pypi.org/project/lexilux/) · [Python](https://www.python.org/downloads/) · [License](LICENSE) · [Docs](https://lexilux.readthedocs.io) · [CI](https://github.com/lzjever/lexilux/actions) · [Coverage](https://codecov.io/gh/lzjever/lexilux)
**Lexilux** is a unified LLM API client library that makes calling Chat, Embedding, Rerank, and Tokenizer APIs as simple as calling a function.
## Features
- **Function-like API**: Call APIs like functions (`chat("hi")`, `embed(["text"])`)
- **Streaming Support**: Built-in streaming for Chat with usage tracking
- **Unified Usage**: Consistent usage statistics across all APIs
- **Flexible Input**: Support multiple input formats (string, list, dict)
- **OpenAI-Compatible**: Works with OpenAI-compatible APIs
- **Automatic Retry**: Built-in retry logic with exponential backoff
- **Connection Pooling**: HTTP connection pooling for better performance
- **Exception Hierarchy**: Comprehensive exception system with error codes
- **Function Calling**: OpenAI-compatible function/tool calling support
- **Multimodal Support**: Vision capabilities with image inputs
- **Async Support**: Full async/await API for concurrent operations
## Installation
### Quick Install
```bash
pip install lexilux
```
### With Tokenizer Support
```bash
pip install "lexilux[tokenizer]"
```
### Development Setup with uv (Recommended)
This project uses [uv](https://github.com/astral-sh/uv) for fast dependency management.
```bash
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# For active development
make dev-install
# Or manually with uv
uv sync --group docs --all-extras
```
### Legacy pip Setup
```bash
pip install -e ".[dev]"
```
## Quick Start
### Basic Chat
```python
from lexilux import Chat
chat = Chat(base_url="https://api.example.com/v1", api_key="your-key", model="gpt-4")
result = chat("Hello, world!")
print(result.text)
print(result.usage.total_tokens)
```
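The features above advertise the same function-style call for embeddings (`embed(["text"])`). The exact client class name and return shape are assumptions here, so the API call is sketched in comments; the cosine-similarity helper is plain Python you can pair with any embedding client:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical usage -- the Embed class name and result shape are
# assumptions, mirroring the Chat constructor shown above:
# from lexilux import Embed
# embed = Embed(base_url="https://api.example.com/v1", api_key="your-key",
#               model="your-embedding-model")
# vectors = embed(["first text", "second text"])
# print(cosine_similarity(vectors[0], vectors[1]))
```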
### Streaming
```python
for chunk in chat.stream("Tell me a joke"):
    print(chunk.delta, end="", flush=True)
    if chunk.done:
        print(f"\nTokens: {chunk.usage.total_tokens}")
```
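If you also want the complete response after streaming (for logging or conversation history), joining the deltas is enough. This helper assumes only the `chunk.delta` attribute shown above:

```python
def collect_stream(chunks):
    """Join streamed deltas into the complete response text."""
    return "".join(chunk.delta for chunk in chunks)

# Hypothetical usage with the stream shown above:
# full_text = collect_stream(chat.stream("Tell me a joke"))
```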
### Error Handling
```python
from lexilux import LexiluxError, AuthenticationError, RateLimitError
try:
    result = chat("Hello, world!")
except AuthenticationError as e:
    print(f"Authentication failed: {e.message}")
except RateLimitError as e:
    if e.retryable:
        print(f"Rate limited: {e.message}")
except LexiluxError as e:
    print(f"Error: {e.code} - {e.message}")
```
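Lexilux retries automatically, but the `retryable` flag also supports application-level backoff, for example around a batch job. This is only a sketch of the pattern: the `retryable` attribute comes from the snippet above, everything else is plain Python:

```python
import time

def call_with_retry(fn, prompt, attempts=3, base_delay=1.0):
    """Retry a call with exponential backoff while errors are marked retryable."""
    for attempt in range(attempts):
        try:
            return fn(prompt)
        except Exception as exc:
            # Re-raise if out of attempts or the error is not retryable.
            if attempt == attempts - 1 or not getattr(exc, "retryable", False):
                raise
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage: result = call_with_retry(chat, "Hello, world!")
```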
### Function Calling
```python
from lexilux import Chat, FunctionTool
get_weather = FunctionTool(
    name="get_weather",
    description="Get weather for a location",
    parameters={
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City name"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
        },
        "required": ["location"]
    }
)

result = chat("What's the weather in Paris?", tools=[get_weather])
if result.has_tool_calls:
    for tool_call in result.tool_calls:
        print(f"Calling: {tool_call.name}")
        print(f"Arguments: {tool_call.get_arguments()}")
```
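After the model requests a tool call, your code still has to run the function and, typically, send the result back for a final answer. The dispatch below is plain Python; the `tool_call` attribute names come from the snippet above, and the weather function is a stand-in:

```python
import json

def get_weather_impl(location, unit="celsius"):
    # Stand-in implementation; a real one would query a weather service.
    return f"22 degrees {unit} and sunny in {location}"

# Map tool names the model may request to local implementations.
TOOL_REGISTRY = {"get_weather": get_weather_impl}

def dispatch_tool_call(name, arguments):
    """Invoke the matching local function; accepts a dict or a JSON string."""
    if isinstance(arguments, str):
        arguments = json.loads(arguments)
    return TOOL_REGISTRY[name](**arguments)

# Hypothetical follow-up, using the loop from the snippet above:
# for tool_call in result.tool_calls:
#     output = dispatch_tool_call(tool_call.name, tool_call.get_arguments())
```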
### Async
```python
import asyncio
from lexilux import Chat
async def main():
    chat = Chat(base_url="...", api_key="...", model="gpt-4")
    result = await chat.a("Hello, async world!")
    print(result.text)

asyncio.run(main())
```
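Async mode pays off when fanning out many prompts at once; `asyncio.gather` runs them concurrently and preserves result order. The helper is plain asyncio, and passing `chat.a` as `chat_fn` is the assumed usage:

```python
import asyncio

async def ask_all(chat_fn, prompts):
    """Run one coroutine per prompt concurrently; results keep prompt order."""
    return await asyncio.gather(*(chat_fn(p) for p in prompts))

# Hypothetical usage with the client shown above:
# results = asyncio.run(ask_all(chat.a, ["What is 2+2?", "Name a color."]))
```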
## Documentation
Full documentation available at: [lexilux.readthedocs.io](https://lexilux.readthedocs.io)
## Examples
Check out the `examples/` directory for practical examples:
- `examples/01_hello_world.py` - Basic chat completion
- `examples/02_system_message.py` - Using system messages
- `examples/10_streaming.py` - Streaming chat
- `examples/11_conversation.py` - Multi-turn conversations
- `examples/12_chat_params.py` - Custom chat parameters
- `examples/20_embedding.py` - Text embedding
- `examples/21_rerank.py` - Document reranking
- `examples/22_tokenizer.py` - Tokenization
- `examples/30_function_calling.py` - Function calling
- `examples/31_multimodal.py` - Vision capabilities
- `examples/32_async.py` - Async operations
- `examples/40_chat_history.py` - History management
- `examples/41_auto_continue.py` - Continue cut-off responses
- `examples/42_error_handling.py` - Error handling patterns
- `examples/43_custom_formatting.py` - Custom response formatting
Run examples:
```bash
python examples/01_hello_world.py
```
## Testing
```bash
# Run unit tests
make test
# Run integration tests
make test-integration
# Run with coverage
make test-cov
# Run linting
make lint
# Format code
make format
```
Build documentation locally:
```bash
cd docs && make html
```
## About Agentsmith
**Lexilux** is part of the **Agentsmith** open-source ecosystem. Agentsmith is a B2B platform for developing AI agents and algorithms, currently deployed at highway management companies, securities firms, and regulatory agencies in China. The Agentsmith team is open-sourcing the platform incrementally: proprietary code, algorithm modules, and enterprise-specific customizations are being removed, and the system is being decoupled so the open-source community can use its components independently.
### Agentsmith Open-Source Projects
- **[Varlord](https://github.com/lzjever/varlord)** - Configuration management library
- **[Routilux](https://github.com/lzjever/routilux)** - Event-driven workflow orchestration
- **[Serilux](https://github.com/lzjever/serilux)** - Flexible serialization framework
- **[Lexilux](https://github.com/lzjever/lexilux)** - Unified LLM API client library
## License
Lexilux is licensed under the **Apache License 2.0**. See [LICENSE](LICENSE) for details.
## Links
- **PyPI**: [pypi.org/project/lexilux](https://pypi.org/project/lexilux)
- **Documentation**: [lexilux.readthedocs.io](https://lexilux.readthedocs.io)
- **GitHub**: [github.com/lzjever/lexilux](https://github.com/lzjever/lexilux)