{"id":24713944,"url":"https://github.com/syntaxerror4life/ollamatools","last_synced_at":"2025-07-27T18:14:43.606Z","repository":{"id":274370701,"uuid":"922277323","full_name":"SyntaxError4Life/OllamaTools","owner":"SyntaxError4Life","description":"Use tools with ollama python","archived":false,"fork":false,"pushed_at":"2025-05-30T15:55:41.000Z","size":22,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-05-30T22:20:15.218Z","etag":null,"topics":["agentic-ai","diy-ai","function-call","local-ai","ollama","ollama-client","python"],"latest_commit_sha":null,"homepage":"https://Zanomega.com","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/SyntaxError4Life.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-01-25T19:25:42.000Z","updated_at":"2025-05-30T15:55:46.000Z","dependencies_parsed_at":null,"dependency_job_id":"18fb4b2d-3e4a-4951-99e8-3c6e1aa1cf9b","html_url":"https://github.com/SyntaxError4Life/OllamaTools","commit_stats":null,"previous_names":["syntaxerror4life/ollamatools"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/SyntaxError4Life/OllamaTools","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SyntaxError4Life%2FOllamaTools","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SyntaxError4Life%2FOllamaTools/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SyntaxError4Life%2FOllamaTools/releases","manifests_url":"ht
tps://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SyntaxError4Life%2FOllamaTools/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/SyntaxError4Life","download_url":"https://codeload.github.com/SyntaxError4Life/OllamaTools/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SyntaxError4Life%2FOllamaTools/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":267400732,"owners_count":24081188,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-27T02:00:11.917Z","response_time":82,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["agentic-ai","diy-ai","function-call","local-ai","ollama","ollama-client","python"],"created_at":"2025-01-27T08:14:12.626Z","updated_at":"2025-07-27T18:14:43.588Z","avatar_url":"https://github.com/SyntaxError4Life.png","language":"Python","readme":"# Documentation: Using Tools with Ollama\n\nThis documentation explains how to use function calls (tools) with Ollama, based on models like Llama3.1 and Mistral. It describes the format of messages, tools, and responses.\n\n---\n\n## **1. Tool Structure**\n\nA tool is a function that the model can call to obtain information or perform actions. 
Here is the JSON format of a tool:\n\n```json\n{\n    \"type\": \"function\",\n    \"function\": {\n        \"name\": \"function_name\",\n        \"description\": \"Description of what the function does\",\n        \"parameters\": {\n            \"type\": \"object\",\n            \"properties\": {\n                \"param1\": {\"type\": \"string\", \"description\": \"Description of the parameter\"},\n                \"param2\": {\"type\": \"number\", \"description\": \"Description of the parameter\"}\n            },\n            \"required\": [\"param1\"]  // Required parameters\n        }\n    }\n}\n```\n\n### Example for a `get_current_time` function:\n```json\n{\n    \"type\": \"function\",\n    \"function\": {\n        \"name\": \"get_current_time\",\n        \"description\": \"Get the current time\",\n        \"parameters\": {\"type\": \"object\", \"properties\": {}}\n    }\n}\n```\n\n---\n\n## **2. Message Format**\n\nMessages are structured in JSON and must follow a specific order to handle function calls.\n\n### Available roles:\n- **`system`**: Global instructions for the model.\n- **`user`**: User message.\n- **`assistant`**: Model response **OR** function call.\n- **`tool`**: Tool response after execution.\n\n---\n\n### **2.1. 
Function Call by the Model**\n\nWhen the model decides to call a function, it returns a message with the `assistant` role and a `tool_calls` field:\n\n```json\n{\n    \"role\": \"assistant\",\n    \"content\": \"\",  // Empty during a function call\n    \"tool_calls\": [\n        {\n            \"function\": {\n                \"name\": \"function_name\",\n                \"arguments\": \"{}\"  // Arguments in JSON format\n            }\n        }\n    ]\n}\n```\n\n#### Example:\n```json\n{\n    \"role\": \"assistant\",\n    \"content\": \"\",\n    \"tool_calls\": [\n        {\n            \"function\": {\n                \"name\": \"get_current_time\",\n                \"arguments\": \"{}\"\n            }\n        }\n    ]\n}\n```\n\n---\n\n### **2.2. Tool Response**\n\nAfter executing the function, you must add a message with the `tool` role to provide the result to the model:\n\n```json\n{\n    \"role\": \"tool\",\n    \"name\": \"function_name\",\n    \"content\": \"Function result\"\n}\n```\n\n#### Example:\n```json\n{\n    \"role\": \"tool\",\n    \"name\": \"get_current_time\",\n    \"content\": \"15:30:45\"\n}\n```\n\n---\n\n### **2.3. Final Model Response**\n\nThe model uses the tool result to generate a final response:\n\n```json\n{\n    \"role\": \"assistant\",\n    \"content\": \"Final response based on the tool result\"\n}\n```\n\n#### Example:\n```json\n{\n    \"role\": \"assistant\",\n    \"content\": \"It is currently 15 hours, 30 minutes, and 45 seconds.\"\n}\n```\n\n---\n\n## **3. 
Complete Conversation Flow**\n\nHere is an example of a complete flow for requesting the current time:\n\n### **Step 1: User Message**\n```json\n{\n    \"role\": \"user\",\n    \"content\": \"What time is it?\"\n}\n```\n\n### **Step 2: Function Call by the Model**\n```json\n{\n    \"role\": \"assistant\",\n    \"content\": \"\",\n    \"tool_calls\": [\n        {\n            \"function\": {\n                \"name\": \"get_current_time\",\n                \"arguments\": \"{}\"\n            }\n        }\n    ]\n}\n```\n\n### **Step 3: Tool Response**\n```json\n{\n    \"role\": \"tool\",\n    \"name\": \"get_current_time\",\n    \"content\": \"15:30:45\"\n}\n```\n\n### **Step 4: Final Model Response**\n```json\n{\n    \"role\": \"assistant\",\n    \"content\": \"It is currently 15 hours, 30 minutes, and 45 seconds.\"\n}\n```\n\n---\n\n## **4. Python Code Example**\n\nHere is an example of a complete program to handle a conversation with a function call:\n\n```python\nfrom ollama import chat\nimport datetime\n\n# Tool definition\ntools = [\n    {\n        \"type\": \"function\",\n        \"function\": {\n            \"name\": \"get_current_time\",\n            \"description\": \"Get the current time\",\n            \"parameters\": {\"type\": \"object\", \"properties\": {}}\n        }\n    }\n]\n\n# Implementation of the tool\ndef get_current_time():\n    return datetime.datetime.now().strftime(\"%H:%M:%S\")\n\n# Initial conversation\nmessages = [\n    {\"role\": \"system\", \"content\": \"Use tools when asked for the time.\"},\n    {\"role\": \"user\", \"content\": \"What time is it?\"}\n]\n\n# Initial call\nresponse = chat(\n    model=\"llama3.1\",  # Also works with mistral (and is better)\n    messages=messages,\n    tools=tools\n)\n\n# If a function call is detected\nif hasattr(response.message, 'tool_calls') and response.message.tool_calls:\n    # Add the assistant's response to the context\n    messages.append({\"role\": \"assistant\", \"content\": \"\", \"tool_calls\": 
response.message.tool_calls})\n    \n    # Execute the tool and build its response message\n    tool_response = {\n        \"role\": \"tool\",\n        \"name\": \"get_current_time\",\n        \"content\": get_current_time()\n    }\n    messages.append(tool_response)\n    \n    # Final response\n    final_response = chat(\n        model=\"llama3.1\",\n        messages=messages\n    )\n    print(\"Final response:\", final_response.message.content)\nelse:\n    print(\"No function call detected.\")\n```\n\n---\n\n## **5. Best Practices**\n\n1. **Clear Instructions**: Use a `system` message to guide the model.\n2. **Argument Validation**: Always validate arguments before executing a function.\n3. **Error Handling**: Add checks for cases where the model does not call a function.\n4. **Response Format**: Ensure tool responses are well-structured.\n\n---\n\n## **Repository Evolution**\n\nThis repository is constantly evolving. Planned additions:\n\n- **Additional Examples**: Practical cases with more complex tools.\n- **Advanced Workflows**: Examples of nested workflows with multiple tool calls.\n- **Integrations**: Examples of integration with other libraries or services (e.g., external APIs).\n\n## **License**\nThis repository is public documentation. You are free to use it as you see fit.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsyntaxerror4life%2Follamatools","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fsyntaxerror4life%2Follamatools","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsyntaxerror4life%2Follamatools/lists"}