{"id":13584825,"url":"https://github.com/ollama/ollama-python","last_synced_at":"2025-12-13T21:02:34.525Z","repository":{"id":216714634,"uuid":"729453988","full_name":"ollama/ollama-python","owner":"ollama","description":"Ollama Python library","archived":false,"fork":false,"pushed_at":"2025-05-06T22:34:54.000Z","size":434,"stargazers_count":7520,"open_issues_count":100,"forks_count":680,"subscribers_count":53,"default_branch":"main","last_synced_at":"2025-05-12T02:37:33.232Z","etag":null,"topics":["ollama","python"],"latest_commit_sha":null,"homepage":"https://ollama.com","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ollama.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2023-12-09T09:27:18.000Z","updated_at":"2025-05-11T21:32:44.000Z","dependencies_parsed_at":"2024-02-08T20:32:01.926Z","dependency_job_id":"0c43a040-a3d9-4ae8-898b-d15f8a3c73f6","html_url":"https://github.com/ollama/ollama-python","commit_stats":{"total_commits":115,"total_committers":24,"mean_commits":4.791666666666667,"dds":0.5217391304347826,"last_synced_commit":"ebe332b29d5c65aeccfadd4151bf6059ded7049b"},"previous_names":["jmorganca/ollama-python","ollama/ollama-python"],"tags_count":25,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ollama%2Follama-python","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ollama%2Follama-python/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ollama%2Follama-p
ython/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ollama%2Follama-python/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ollama","download_url":"https://codeload.github.com/ollama/ollama-python/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253672614,"owners_count":21945477,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ollama","python"],"created_at":"2024-08-01T15:04:32.699Z","updated_at":"2025-12-13T21:02:34.518Z","avatar_url":"https://github.com/ollama.png","language":"Python","readme":"# Ollama Python Library\n\nThe Ollama Python library provides the easiest way to integrate Python 3.8+ projects with [Ollama](https://github.com/ollama/ollama).\n\n## Prerequisites\n\n- [Ollama](https://ollama.com/download) should be installed and running\n- Pull a model to use with the library: `ollama pull \u003cmodel\u003e` e.g. 
`ollama pull gemma3`\n  - See [Ollama.com](https://ollama.com/search) for more information on the models available.\n\n## Install\n\n```sh\npip install ollama\n```\n\n## Usage\n\n```python\nfrom ollama import chat\nfrom ollama import ChatResponse\n\nresponse: ChatResponse = chat(model='gemma3', messages=[\n  {\n    'role': 'user',\n    'content': 'Why is the sky blue?',\n  },\n])\nprint(response['message']['content'])\n# or access fields directly from the response object\nprint(response.message.content)\n```\n\nSee [_types.py](ollama/_types.py) for more information on the response types.\n\n## Streaming responses\n\nResponse streaming can be enabled by setting `stream=True`.\n\n```python\nfrom ollama import chat\n\nstream = chat(\n    model='gemma3',\n    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],\n    stream=True,\n)\n\nfor chunk in stream:\n  print(chunk['message']['content'], end='', flush=True)\n```\n\n## Cloud Models\n\nRun larger models by offloading to Ollama’s cloud while keeping your local workflow.\n\n- Supported models: `deepseek-v3.1:671b-cloud`, `gpt-oss:20b-cloud`, `gpt-oss:120b-cloud`, `kimi-k2:1t-cloud`, `qwen3-coder:480b-cloud`, `kimi-k2-thinking`. See [Ollama Models - Cloud](https://ollama.com/search?c=cloud) for more information.\n\n### Run via local Ollama\n\n1) Sign in (one-time):\n\n```\nollama signin\n```\n\n2) Pull a cloud model:\n\n```\nollama pull gpt-oss:120b-cloud\n```\n\n3) Make a request:\n\n```python\nfrom ollama import Client\n\nclient = Client()\n\nmessages = [\n  {\n    'role': 'user',\n    'content': 'Why is the sky blue?',\n  },\n]\n\nfor part in client.chat('gpt-oss:120b-cloud', messages=messages, stream=True):\n  print(part.message.content, end='', flush=True)\n```\n\n### Cloud API (ollama.com)\n\nAccess cloud models directly by pointing the client at `https://ollama.com`.\n\n1) Create an API key from [ollama.com](https://ollama.com/settings/keys), then set:\n\n```\nexport OLLAMA_API_KEY=your_api_key\n```\n\n2) (Optional) List models available via the API:\n\n```\ncurl https://ollama.com/api/tags\n```\n\n3) Generate a response via the cloud API:\n\n```python\nimport os\nfrom ollama import Client\n\n# os.environ[...] fails loudly (KeyError) if the key is unset, instead of\n# silently concatenating None with .get()\nclient = Client(\n    host='https://ollama.com',\n    headers={'Authorization': 'Bearer ' + os.environ['OLLAMA_API_KEY']}\n)\n\nmessages = [\n  {\n    'role': 'user',\n    'content': 'Why is the sky blue?',\n  },\n]\n\nfor part in client.chat('gpt-oss:120b', messages=messages, stream=True):\n  print(part.message.content, end='', flush=True)\n```\n\n## Custom client\n\nA custom client can be created by instantiating `Client` or `AsyncClient` from `ollama`.\n\nAll extra keyword arguments are passed into the [`httpx.Client`](https://www.python-httpx.org/api/#client).\n\n```python\nfrom ollama import Client\nclient = Client(\n  host='http://localhost:11434',\n  headers={'x-some-header': 'some-value'}\n)\nresponse = client.chat(model='gemma3', messages=[\n  {\n    'role': 'user',\n    'content': 'Why is the sky blue?',\n  },\n])\n```\n\n## Async client\n\nThe `AsyncClient` class is used to make asynchronous requests. It can be configured with the same fields as the `Client` class.\n\n```python\nimport asyncio\nfrom ollama import AsyncClient\n\nasync def chat():\n  message = {'role': 'user', 'content': 'Why is the sky blue?'}\n  response = await AsyncClient().chat(model='gemma3', messages=[message])\n  print(response.message.content)\n\nasyncio.run(chat())\n```\n\nSetting `stream=True` modifies functions to return a Python asynchronous generator:\n\n```python\nimport asyncio\nfrom ollama import AsyncClient\n\nasync def chat():\n  message = {'role': 'user', 'content': 'Why is the sky blue?'}\n  async for part in await AsyncClient().chat(model='gemma3', messages=[message], stream=True):\n    print(part['message']['content'], end='', flush=True)\n\nasyncio.run(chat())\n```\n\n## API\n\nThe Ollama Python library's API is designed around the [Ollama REST API](https://github.com/ollama/ollama/blob/main/docs/api.md).\n\n### Chat\n\n```python\nollama.chat(model='gemma3', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])\n```\n\n### Generate\n\n```python\nollama.generate(model='gemma3', prompt='Why is the sky blue?')\n```\n\n### List\n\n```python\nollama.list()\n```\n\n### Show\n\n```python\nollama.show('gemma3')\n```\n\n### Create\n\n```python\nollama.create(model='example', from_='gemma3', system=\"You are Mario from Super Mario Bros.\")\n```\n\n### Copy\n\n```python\nollama.copy('gemma3', 'user/gemma3')\n```\n\n### Delete\n\n```python\nollama.delete('gemma3')\n```\n\n### Pull\n\n```python\nollama.pull('gemma3')\n```\n\n### Push\n\n```python\nollama.push('user/gemma3')\n```\n\n### Embed\n\n```python\nollama.embed(model='gemma3', input='The sky is blue because of Rayleigh scattering')\n```\n\n### Embed (batch)\n\n```python\nollama.embed(model='gemma3', input=['The sky is blue because of Rayleigh scattering', 'Grass is green because of chlorophyll'])\n```\n\n### Ps\n\n```python\nollama.ps()\n```\n\n## Errors\n\nErrors are raised if requests return an error status or if an error is detected while streaming.\n\n```python\nmodel = 'does-not-yet-exist'\n\ntry:\n  ollama.chat(model)\nexcept ollama.ResponseError as e:\n  print('Error:', e.error)\n  if e.status_code == 404:\n    ollama.pull(model)\n```\n","funding_links":[],"categories":["Python","A01_文本生成_文本对话","SDKs and Libraries","Repos"],"sub_categories":["大语言对话模型及数据"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Follama%2Follama-python","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Follama%2Follama-python","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Follama%2Follama-python/lists"}