https://github.com/sertrafurr/duckduckai
Python API Wrapper to interact with DuckDuckAI
- Host: GitHub
- URL: https://github.com/sertrafurr/duckduckai
- Owner: SertraFurr
- License: apache-2.0
- Created: 2025-01-24T09:20:25.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2025-04-03T13:33:29.000Z (6 months ago)
- Last Synced: 2025-06-29T21:44:14.004Z (3 months ago)
- Topics: ai, ai-wrapper, api, claude, claude-ai, duckduckgo, free, gpt, gpt-4o, llama, meta-llama, mistral, mistral-7b, package, pypi, python
- Language: Python
- Homepage:
- Size: 30.3 KB
- Stars: 5
- Watchers: 1
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# DuckDuckAI
DuckDuckAI is a Python package for interacting with DuckDuckGo's chat API. It allows you to fetch responses from DuckDuckGo's AI models and print them in a streamed format or as a complete response.
## Installation
To install the DuckDuckAI package, you can use pip:
```bash
pip install DuckDuckAI
```

## Usage
You can interact with DuckDuckAI by calling the `ask` function. It supports both streaming responses and returning the entire message at once.
### Example
```python
from duckduckai import ask

# Fetch the response in streamed format (printed character by character).
# Index [0] retrieves the response text only (no cookies).
ask("Tell me a joke", stream=True)[0]

# Fetch the response as a complete message
response = ask("Tell me a joke", stream=False)
print(response)
```

### Parameters
| Parameter | Type | Description | Default |
|-----------|-------|---------------------------------------------------------------------|---------------|
| query | str | The search query string. | Required |
| stream | bool | Whether to stream results or fetch them all at once. | True |
| model | str | The model to use for the response. | gpt-4o-mini |

## Available Models
DuckDuckAI currently supports the following models:
| Model ID | Description |
|----------|-------------|
| `gpt-4o-mini` | A smaller variant of GPT-4o designed for quick, concise responses with less computation. |
| `meta-llama/Llama-3.3-70B-Instruct-Turbo` | Meta's large-scale Llama 3.3 model with 70 billion parameters designed for fast and accurate responses. |
| `claude-3-haiku-20240307` | Anthropic's Claude 3 Haiku model optimized for efficient, high-quality responses. |
| `mistralai/Mistral-Small-24B-Instruct-2501` | Mistral AI's 24 billion parameter model trained for instruction-based tasks. |
| `o3-mini` | OpenAI's compact reasoning model optimized for lightweight performance. |

Additional models may be available but are subject to access restrictions. Some models may require specific permissions or may not be available in all regions.
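Because unsupported model IDs are passed straight to `ask`, it can be convenient to validate them against the table above first. The helper below is a hypothetical sketch based on this README, not part of the duckduckai package: `SUPPORTED_MODELS` and `pick_model` are names introduced here for illustration.

```python
# Hypothetical helper (not part of duckduckai): validate a model ID
# against the supported-models table before passing it to ask().
SUPPORTED_MODELS = {
    "gpt-4o-mini",
    "meta-llama/Llama-3.3-70B-Instruct-Turbo",
    "claude-3-haiku-20240307",
    "mistralai/Mistral-Small-24B-Instruct-2501",
    "o3-mini",
}

def pick_model(model_id: str, fallback: str = "gpt-4o-mini") -> str:
    """Return model_id if it appears in the supported table, else the fallback."""
    return model_id if model_id in SUPPORTED_MODELS else fallback
```

With such a guard, a typo falls back to the default rather than producing a failed request, e.g. `ask("Explain recursion", stream=False, model=pick_model("o3-mini"))`.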
## Advanced Usage
You can reuse the authentication token to make multiple requests more efficiently:
```python
from duckduckai import ask, fetch_x_vqd_token

# Fetch a token once
token = fetch_x_vqd_token()

# Reuse the same token for multiple requests.
# Index [0] retrieves the response text only; omit it if you also want the token.
response1 = ask("What is quantum computing?", model="gpt-4o-mini", token=token)[0]
response2 = ask("Explain neural networks", model="claude-3-haiku-20240307", token=token)[0]
```

## License
This project is licensed under the Apache-2.0 license - see the LICENSE file for details.