https://github.com/superusernameman/php-ollama-chat-client
Very basic command line Ollama chat client written in simple PHP8.
- Host: GitHub
- URL: https://github.com/superusernameman/php-ollama-chat-client
- Owner: SuperUserNameMan
- License: unlicense
- Created: 2025-06-13T14:48:16.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2025-06-17T12:22:42.000Z (9 months ago)
- Last Synced: 2025-06-27T21:37:24.384Z (9 months ago)
- Topics: llm, ollama, ollama-client, php, php-cli, php8
- Language: PHP
- Homepage:
- Size: 52.7 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
# php-ollama-chat-client
Very basic command line Ollama chat client written in simple PHP8.
## How?
`php -e chat.php`
`php -e tool_test.php` (the LLM has access to a tool function, `get_datetime`)
### Requirements:
- Ollama
- PHP 8 CLI
- the `curl`, `readline`, and `json` PHP extensions
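A single non-streaming chat turn against Ollama's `/api/chat` REST endpoint can be sketched as below. The endpoint URL, default port 11434, and payload shape follow the public Ollama API; the helper names are illustrative and `chat.php` in this repo may differ in detail:

```php
<?php
// Build the request body for Ollama's /api/chat endpoint.
// Helper names here are illustrative, not taken from this repo.
function ollama_chat_payload(array $messages, string $model = 'llama3.2'): array
{
    return [
        'model'    => $model,
        'messages' => $messages,
        'stream'   => false, // one complete JSON reply instead of chunks
    ];
}

// Send it with the curl extension (assumes Ollama on its default port 11434).
function ollama_chat(array $messages, string $model = 'llama3.2'): array
{
    $ch = curl_init('http://localhost:11434/api/chat');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode(ollama_chat_payload($messages, $model)),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $raw = curl_exec($ch);
    curl_close($ch);
    return json_decode($raw, true);
}

// A REPL loop with the readline extension might look like:
// $history = [];
// while (($line = readline('> ')) !== false) {
//     $history[] = ['role' => 'user', 'content' => $line];
//     $reply     = ollama_chat($history);
//     $history[] = $reply['message'];
//     echo $reply['message']['content'], PHP_EOL;
// }
```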
## Why?
- base code for quick experiments;
- base code for prototyping function-calling agents;
- works around my Python allergy.
## Warnings:
- only tested in the Linux MATE Terminal
- uses terminal color codes (unclear how it behaves on incompatible terminals)
- tool calling does not work with every model that claims the `tools` capability (of all the models I tested, only `qwen3` and `llama3.2` can reliably call tools, and sometimes `granite3.3`)
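For context on the last warning: a tool such as `get_datetime` is advertised to the model through the `tools` field of the `/api/chat` request. The JSON shape below follows the public Ollama API; the exact schema used in `tool_test.php` may differ:

```php
<?php
// Tool declaration in the Ollama /api/chat "tools" format.
// The helper name and description text are a sketch, not taken from this repo.
function get_datetime_tool(): array
{
    return [
        'type'     => 'function',
        'function' => [
            'name'        => 'get_datetime',
            'description' => 'Return the current date and time',
            'parameters'  => [
                'type'       => 'object',
                'properties' => new stdClass(), // no parameters
            ],
        ],
    ];
}

// If the model decides to call the tool, the response carries the requested
// calls in $reply['message']['tool_calls']; the client then runs the function
// and feeds the result back as a 'tool' role message, e.g.:
// $messages[] = ['role' => 'tool', 'content' => date('Y-m-d H:i:s')];
```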