Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Ollama Python library
https://github.com/ollama/ollama-python
- Host: GitHub
- URL: https://github.com/ollama/ollama-python
- Owner: ollama
- License: mit
- Created: 2023-12-09T09:27:18.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-10-24T22:14:42.000Z (10 days ago)
- Last Synced: 2024-10-29T14:51:12.924Z (5 days ago)
- Topics: ollama, python
- Language: Python
- Homepage: https://ollama.com
- Size: 244 KB
- Stars: 4,334
- Watchers: 35
- Forks: 366
- Open Issues: 100
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
Awesome Lists containing this project
- AiTreasureBox - ollama/ollama-python - Ollama Python library (Repos)
- jimsghstars - ollama/ollama-python - Ollama Python library (Python)
README
# Ollama Python Library
The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with [Ollama](https://github.com/ollama/ollama).
## Install
```sh
pip install ollama
```

## Usage
```python
import ollama
response = ollama.chat(model='llama3.1', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
```

## Streaming responses
Response streaming can be enabled by setting `stream=True`, modifying function calls to return a Python generator where each part is an object in the stream.
```python
import ollama

stream = ollama.chat(
  model='llama3.1',
  messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
  stream=True,
)

for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)
```

## API
The Ollama Python library's API is designed around the [Ollama REST API](https://github.com/ollama/ollama/blob/main/docs/api.md).
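Because each call mirrors its REST endpoint, request fields documented there should translate directly to keyword arguments. A minimal sketch (the `options` dict and the `temperature` key are taken from the REST API docs, not this README):

```python
import ollama

# 'options' forwards model parameters to the REST API (assumed here,
# based on the linked REST docs rather than this README).
response = ollama.chat(
  model='llama3.1',
  messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
  options={'temperature': 0},
)
print(response['message']['content'])
```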
### Chat
```python
ollama.chat(model='llama3.1', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
```
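For multi-turn conversations, the returned message can be appended to the history and sent back on the next call. A minimal sketch, assuming the response's `message` dict can be reused as-is:

```python
import ollama

messages = [{'role': 'user', 'content': 'Why is the sky blue?'}]
response = ollama.chat(model='llama3.1', messages=messages)

# Keep the assistant's reply in the history so the follow-up has context.
messages.append(response['message'])
messages.append({'role': 'user', 'content': 'Explain it to a five year old.'})

follow_up = ollama.chat(model='llama3.1', messages=messages)
print(follow_up['message']['content'])
```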
### Generate

```python
ollama.generate(model='llama3.1', prompt='Why is the sky blue?')
```
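Per the REST API, the generated text is returned in the `response` field (a sketch, assuming dict-style access as in the examples above):

```python
import ollama

response = ollama.generate(model='llama3.1', prompt='Why is the sky blue?')
# 'response' holds the generated text, mirroring the REST /api/generate reply.
print(response['response'])
```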
### List

```python
ollama.list()
```
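The result mirrors the REST `/api/tags` response, so locally available models can be iterated roughly like this (a sketch; the `models` list and `name` field are assumptions based on the REST docs):

```python
import ollama

for model in ollama.list()['models']:
  # Each entry describes a locally available model.
  print(model['name'])
```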
### Show

```python
ollama.show('llama3.1')
```

### Create
```python
modelfile='''
FROM llama3.1
SYSTEM You are mario from super mario bros.
'''

ollama.create(model='example', modelfile=modelfile)
```
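Once created, the new model can be used like any other, for example with `chat` (a brief sketch continuing the example above):

```python
import ollama

response = ollama.chat(
  model='example',
  messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response['message']['content'])
```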
### Copy

```python
ollama.copy('llama3.1', 'user/llama3.1')
```

### Delete
```python
ollama.delete('llama3.1')
```

### Pull
```python
ollama.pull('llama3.1')
```
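As with the other calls, passing `stream=True` should yield progress updates while the model downloads (a sketch; the `status`, `completed`, and `total` fields come from the REST API docs, not this README):

```python
import ollama

for progress in ollama.pull('llama3.1', stream=True):
  # Progress parts mirror the REST API: a status string plus byte counts.
  print(progress.get('status'), progress.get('completed'), progress.get('total'))
```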
### Push

```python
ollama.push('user/llama3.1')
```

### Embed
```python
ollama.embed(model='llama3.1', input='The sky is blue because of rayleigh scattering')
```

### Embed (batch)
```python
ollama.embed(model='llama3.1', input=['The sky is blue because of rayleigh scattering', 'Grass is green because of chlorophyll'])
```
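The response follows the REST `/api/embed` endpoint, which returns one vector per input under an `embeddings` key (assumed from the REST docs; a minimal sketch):

```python
import ollama

response = ollama.embed(
  model='llama3.1',
  input=['The sky is blue because of rayleigh scattering', 'Grass is green because of chlorophyll'],
)

for vector in response['embeddings']:
  # One embedding vector per input string.
  print(len(vector))
```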
### Ps

```python
ollama.ps()
```

## Custom client
A custom client can be created with the following fields:
- `host`: The Ollama host to connect to
- `timeout`: The timeout for requests

```python
from ollama import Client
client = Client(host='http://localhost:11434')
response = client.chat(model='llama3.1', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
```
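Both fields can be combined; a minimal sketch, assuming `timeout` (in seconds) is accepted alongside `host` as listed above:

```python
from ollama import Client

# 'timeout' is assumed to be passed through to the underlying HTTP client.
client = Client(host='http://localhost:11434', timeout=30)
response = client.chat(model='llama3.1', messages=[
  {'role': 'user', 'content': 'Why is the sky blue?'},
])
print(response['message']['content'])
```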
## Async client

```python
import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  response = await AsyncClient().chat(model='llama3.1', messages=[message])

asyncio.run(chat())
```

Setting `stream=True` modifies functions to return a Python asynchronous generator:
```python
import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  async for part in await AsyncClient().chat(model='llama3.1', messages=[message], stream=True):
    print(part['message']['content'], end='', flush=True)

asyncio.run(chat())
```

## Errors
Errors are raised if requests return an error status or if an error is detected while streaming.
```python
import ollama

model = 'does-not-yet-exist'

try:
  ollama.chat(model)
except ollama.ResponseError as e:
  print('Error:', e.error)
  if e.status_code == 404:
    ollama.pull(model)
```