https://github.com/tusharad/ollama-haskell
Ollama client for Haskell
- Host: GitHub
- URL: https://github.com/tusharad/ollama-haskell
- Owner: tusharad
- License: MIT
- Created: 2024-08-20T16:12:02.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-06-10T09:15:16.000Z (6 months ago)
- Last Synced: 2025-07-22T14:57:38.944Z (5 months ago)
- Language: Haskell
- Homepage: https://hackage.haskell.org/package/ollama-haskell
- Size: 5.3 MB
- Stars: 42
- Watchers: 5
- Forks: 4
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
# 🦙 Ollama Haskell
**`ollama-haskell`** is an unofficial Haskell client for [Ollama](https://ollama.com), inspired by [`ollama-python`](https://github.com/ollama/ollama-python). It enables interaction with locally running LLMs through the Ollama HTTP API, directly from Haskell.
---
## ✨ Features
* 💬 Chat with models (a minimal sketch follows this list)
* ✍️ Text generation (with streaming)
* ✅ Chat with structured messages and tools
* 🧠 Embeddings
* 🧰 Model management (list, pull, push, show, delete)
* 🗂️ In-memory conversation history
* ⚙️ Configurable timeouts, retries, streaming handlers
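Beyond one-shot generation, the library also exposes a chat interface. Below is a minimal sketch of a single chat turn; the field and helper names used here (`chatModelName`, `messages`, `userMessage`, `message`, `content`) are assumptions based on the library's conventions, not a verbatim copy of its API, so check the [Hackage docs](https://hackage.haskell.org/package/ollama-haskell) for the exact signatures.

```haskell
{-# LANGUAGE OverloadedStrings #-}

module Main where

import Data.List.NonEmpty (NonEmpty (..))
import Data.Ollama.Chat
import qualified Data.Text.IO as T

main :: IO ()
main = do
  let ops =
        defaultChatOps
          { chatModelName = "gemma3" -- assumed field name
          , messages = userMessage "Why is the sky blue?" :| [] -- assumed helper
          }
  eRes <- chat ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r ->
      -- `message` / `content` accessors are assumptions; adjust them to
      -- the actual ChatResponse record on Hackage.
      case message r of
        Nothing -> putStrLn "No message returned"
        Just m -> T.putStrLn (content m)
```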
---
## ⚡ Quick Example
```haskell
{-# LANGUAGE OverloadedStrings #-}

module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "What is the meaning of life?"
          }
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> do
      putStr "LLM response: "
      T.putStrLn (genResponse r)
```
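The second argument to `generate` is an optional client configuration; passing `Nothing`, as above, falls back to the defaults, i.e. the local Ollama server on its standard port (11434).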
---
## 📦 Installation
Add to your `.cabal` file:
```cabal
build-depends:
base >=4.7 && <5,
ollama-haskell
```
Or use with `stack`/`nix-shell`.
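With `stack`, the dependency goes in `package.yaml` (or the `.cabal` file as above). If the package is missing from your resolver's snapshot, pin it in `stack.yaml`. A sketch, with the version number as a placeholder to replace with the release you want from Hackage:

```yaml
# package.yaml (hpack)
dependencies:
  - base >= 4.7 && < 5
  - ollama-haskell

# stack.yaml -- only needed if your snapshot lacks the package;
# substitute the actual version from Hackage
extra-deps:
  - ollama-haskell-0.2.0.0
```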
---
## 📚 More Examples
See [`examples/OllamaExamples.hs`](examples/OllamaExamples.hs) for:
* Chat with conversation memory
* Structured JSON output
* Embeddings
* Tool/function calling
* Multimodal input
* Streaming and non-streaming variants
---
## 🛠 Prerequisite
Make sure you have [Ollama installed and running locally](https://ollama.com/download). Run `ollama pull llama3` to download a model.
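A quick sanity check before running the examples (the model name matches the quick example above; any pulled model works):

```bash
# Start the server if it isn't already running
ollama serve &

# The server listens on port 11434 by default
curl http://localhost:11434/api/version

# Pull the model used in the quick example
ollama pull gemma3
```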
---
## 🧪 Dev & Nix Support
Use Nix:
```bash
nix-shell
```
This drops you into a shell with `stack` and Ollama available.
---
## 👨‍💻 Author
Created and maintained by [@tusharad](https://github.com/tusharad). PRs and feedback are welcome!
---
## 🤝 Contributing
Have ideas or improvements? Feel free to [open an issue](https://github.com/tusharad/ollama-haskell/issues) or submit a PR!