Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/igorbenav/clientai
A unified client for seamless interaction with multiple AI providers.
- Host: GitHub
- URL: https://github.com/igorbenav/clientai
- Owner: igorbenav
- License: MIT
- Created: 2024-10-02T00:27:34.000Z (about 1 month ago)
- Default Branch: main
- Last Pushed: 2024-10-28T06:27:55.000Z (19 days ago)
- Last Synced: 2024-10-28T08:52:15.521Z (19 days ago)
- Topics: ai, api, api-rest, artificial-intelligence, language-model, llm, nlp, ollama, ollama-client, openai, openai-api, python, replicate, replicate-api
- Language: Python
- Homepage: https://igorbenav.github.io/clientai/
- Size: 1.06 MB
- Stars: 17
- Watchers: 3
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - Contributing: CONTRIBUTING.md
  - License: LICENSE
  - Code of conduct: CODE_OF_CONDUCT.md
# ClientAI
A unified client for seamless interaction with multiple AI providers.

---
**ClientAI** is a Python package that provides a unified interface for interacting with multiple AI providers, including OpenAI, Replicate, and Ollama. It offers seamless integration and consistent methods for text generation and chat functionality across different AI platforms.
**Documentation**: [igorbenav.github.io/clientai/](https://igorbenav.github.io/clientai/)
---
## Features
- 🔄 **Unified Interface**: Consistent methods for text generation and chat across multiple AI providers.
- 🔌 **Multiple Providers**: Support for OpenAI, Replicate, and Ollama, with easy extensibility for future providers.
- 🌊 **Streaming Support**: Efficient streaming of responses for real-time applications.
- 🎛️ **Flexible Configuration**: Easy setup with provider-specific configurations.
- 🔧 **Customizable**: Extensible design for adding new providers or customizing existing ones.
- 🧠 **Type Hinting**: Comprehensive type annotations for better development experience.
- 🔒 **Provider Isolation**: Optional installation of provider-specific dependencies to keep your environment lean.

## Requirements
Before installing ClientAI, ensure you have the following:
- **Python**: Version 3.9 or newer.
- **Dependencies**: The core ClientAI package has minimal dependencies. Provider-specific packages (e.g., `openai`, `replicate`, `ollama`) are optional and can be installed separately.

## Installing
To install ClientAI with all providers, run:
```sh
pip install clientai[all]
```

Or, if you prefer to install only specific providers:
```sh
pip install clientai[openai] # For OpenAI support
pip install clientai[replicate] # For Replicate support
pip install clientai[ollama] # For Ollama support
```
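Since these are standard pip extras, you can also combine several providers in a single install; pip accepts a comma-separated list inside the brackets:

```sh
pip install clientai[openai,ollama]  # OpenAI and Ollama support, without Replicate
```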
## Usage

ClientAI provides a simple and consistent way to interact with different AI providers. Here are some examples:
### Initializing the Client
```python
from clientai import ClientAI

# Initialize with OpenAI
openai_client = ClientAI('openai', api_key="your-openai-key")

# Initialize with Replicate
replicate_client = ClientAI('replicate', api_key="your-replicate-key")

# Initialize with Ollama
ollama_client = ClientAI('ollama', host="your-ollama-host")
```
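To keep keys out of source code, you can pull them from environment variables with the standard library. A minimal sketch using the same constructors as above; the `OPENAI_API_KEY` and `REPLICATE_API_KEY` variable names are conventions assumed here, not something ClientAI requires:

```python
import os

from clientai import ClientAI

# Read credentials from the environment instead of hardcoding them.
# The variable names are assumed conventions, not required by ClientAI.
openai_client = ClientAI('openai', api_key=os.environ["OPENAI_API_KEY"])
replicate_client = ClientAI('replicate', api_key=os.environ["REPLICATE_API_KEY"])
```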
### Generating Text

```python
# Using OpenAI
response = openai_client.generate_text(
    "Tell me a joke",
    model="gpt-3.5-turbo",
)

# Using Replicate
response = replicate_client.generate_text(
    "Explain quantum computing",
    model="meta/llama-2-70b-chat:latest",
)

# Using Ollama
response = ollama_client.generate_text(
    "What is the capital of France?",
    model="llama2",
)
```
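Because every provider exposes the same `generate_text` signature, a caller can fall back from one provider to another without changing the request. A hedged sketch of such a helper (`generate_with_fallback` is hypothetical, not part of ClientAI):

```python
from clientai import ClientAI

# Hypothetical helper: try each (client, model) pair in order and
# return the first response that succeeds. Assumes at least one pair.
def generate_with_fallback(clients_and_models, prompt):
    last_error = None
    for client, model in clients_and_models:
        try:
            return client.generate_text(prompt, model=model)
        except Exception as exc:  # broad catch, for illustration only
            last_error = exc
    raise last_error

openai_client = ClientAI('openai', api_key="your-openai-key")
ollama_client = ClientAI('ollama', host="your-ollama-host")

response = generate_with_fallback(
    [(openai_client, "gpt-3.5-turbo"), (ollama_client, "llama2")],
    "Tell me a joke",
)
```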
### Chat Functionality

```python
messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "What is its population?"}
]

# Using OpenAI
response = openai_client.chat(
    messages,
    model="gpt-3.5-turbo",
)

# Using Replicate
response = replicate_client.chat(
    messages,
    model="meta/llama-2-70b-chat:latest",
)

# Using Ollama
response = ollama_client.chat(
    messages,
    model="llama2",
)
```
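The message-list format also makes multi-turn conversations easy to build up: append each reply to the history before the next call. A minimal sketch, assuming `chat` returns the reply text as the examples above suggest:

```python
from clientai import ClientAI

client = ClientAI('openai', api_key="your-openai-key")
messages = []

# Illustrative loop: feed each assistant reply back in as history.
for question in ["What is the capital of France?", "What is its population?"]:
    messages.append({"role": "user", "content": question})
    reply = client.chat(messages, model="gpt-3.5-turbo")  # assumed to return text
    messages.append({"role": "assistant", "content": reply})
    print(reply)
```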
### Streaming Responses

```python
for chunk in openai_client.generate_text(
    "Tell me a long story",
    model="gpt-3.5-turbo",
    stream=True
):
    print(chunk, end="", flush=True)
```
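If you need the complete text as well as live output, you can accumulate the chunks while printing them; a short sketch under the same assumption that each streamed chunk is a string:

```python
from clientai import ClientAI

openai_client = ClientAI('openai', api_key="your-openai-key")

# Collect streamed chunks into the full response while printing them.
chunks = []
for chunk in openai_client.generate_text(
    "Tell me a long story",
    model="gpt-3.5-turbo",
    stream=True,
):
    print(chunk, end="", flush=True)
    chunks.append(chunk)

full_response = "".join(chunks)
```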
## Contributing

Contributions to ClientAI are welcome! Please refer to our [Contributing Guidelines](CONTRIBUTING.md) for more information.
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Contact
Igor Magalhaes – [@igormagalhaesr](https://twitter.com/igormagalhaesr) – [email protected]
[github.com/igorbenav](https://github.com/igorbenav/)