https://github.com/xyproto/ollamaclient
Go package and example utilities for using Ollama / LLMs
- Host: GitHub
- URL: https://github.com/xyproto/ollamaclient
- Owner: xyproto
- License: apache-2.0
- Created: 2023-10-17T10:34:45.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2025-01-28T15:34:38.000Z (8 months ago)
- Last Synced: 2025-03-29T03:02:19.345Z (6 months ago)
- Topics: ai, fortune, go, image-description-generator, large-language-models, llamacpp, llm, ollama, ollama-client, summarize
- Language: Go
- Homepage:
- Size: 3.05 MB
- Stars: 27
- Watchers: 3
- Forks: 4
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
# ollamaclient
[Go documentation at pkg.go.dev](https://pkg.go.dev/github.com/xyproto/ollamaclient/v2)
A Go package for using Ollama and large language models (LLMs).
### Example use
```go
package main

import (
	"fmt"

	"github.com/xyproto/ollamaclient/v2"
	"github.com/xyproto/usermodel"
)

func main() {
	// Create a client for the user's configured text generation model
	oc := ollamaclient.New(usermodel.GetTextGenerationModel())
	oc.Verbose = true

	// Download the model if it is not already available locally
	if err := oc.PullIfNeeded(); err != nil {
		fmt.Println("Error:", err)
		return
	}

	prompt := "Write a haiku about the color of cows."
	output, err := oc.GetOutput(prompt)
	if err != nil {
		fmt.Println("Error:", err)
		return
	}
	fmt.Printf("\n%s\n", output)
}
```

Example output (with verbosity set to `true`):
```
Sending request to http://localhost:11434/api/tags
Sending request to http://localhost:11434/api/generate: {"model":"gemma2:2b","prompt":"Write a haiku about the color of cows.","options":{"seed":256,"tempera}

Brown hides, gentle eyes,
Mooing low in grassy fields,
Milk flows, life's sweet hue.
```

Make sure to install and run Ollama locally first, or set `OLLAMA_HOST` to a valid host.
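
If Ollama is not already running as a service, one way to start it locally is shown below; this assumes the `ollama` CLI is installed and uses its default listening address.

```bash
# Start the Ollama server; it listens on http://localhost:11434 by default
ollama serve
```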
### Using images in the prompt, with the `llava` model
A simple way to describe images:
```go
package main

import (
	"fmt"
	"log"

	"github.com/xyproto/ollamaclient/v2"
	"github.com/xyproto/usermodel"
)

func main() {
	// Use the user's configured vision model (for example llava)
	model := usermodel.GetVisionModel()
	oc := ollamaclient.New(model)

	// Make the output deterministic across runs
	oc.SetReproducible()

	if err := oc.PullIfNeeded(true); err != nil {
		log.Fatalln(err)
	}

	imageFilenames := []string{"carrot1.png", "carrot2.png"}
	const desiredWordCount = 7

	description, err := oc.DescribeImages(imageFilenames, desiredWordCount)
	if err != nil {
		log.Fatalln(err)
	}
	fmt.Println(description)
}
```

See `v2/cmd/describeimage` for an example that uses a custom prompt.
### Embeddings
* The `.Embeddings` method takes a prompt string and returns a `[]float64` embedding vector.
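
A minimal sketch of calling it, assuming `Embeddings` has the signature `Embeddings(prompt string) ([]float64, error)` and that the chosen model (here `nomic-embed-text`, only an example) can produce embeddings:

```go
package main

import (
	"fmt"
	"log"

	"github.com/xyproto/ollamaclient/v2"
)

func main() {
	// "nomic-embed-text" is just an example model name; any embedding-capable model should work
	oc := ollamaclient.New("nomic-embed-text")
	if err := oc.PullIfNeeded(); err != nil {
		log.Fatalln(err)
	}

	// Embeddings is assumed to return the embedding vector for the given prompt
	embedding, err := oc.Embeddings("The quick brown fox jumps over the lazy dog.")
	if err != nil {
		log.Fatalln(err)
	}
	fmt.Printf("Embedding vector with %d dimensions\n", len(embedding))
}
```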
### Environment variables
These environment variables are supported:
* `OLLAMA_HOST` (`http://localhost:11434` by default)
* `OLLAMA_MODEL` (uses the model defined by [`llm-manager`](https://github.com/xyproto/llm-manager) by default)
* `OLLAMA_VERBOSE` (`false` by default)
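
As an illustration, these variables can be combined when running one of the example programs; the host address below is a placeholder, and `gemma2:2b` is simply the model shown in the example output above.

```bash
# Point the client at a remote Ollama server, pick a model, and enable verbose logging
export OLLAMA_HOST=http://192.168.1.10:11434
export OLLAMA_MODEL=gemma2:2b
export OLLAMA_VERBOSE=true
go run main.go
```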
### The `summarize` utility

Getting started:
1. Install `ollama` and start it as a service.
2. Install the `summarize` utility: `go install github.com/xyproto/ollamaclient/cmd/summarize@latest`
3. Summarize a README.md file and a source code file: `summarize README.md ollamaclient.go`. On the first run, this will also download the model.
4. Write a poem about one or more files: `summarize --prompt "Write a poem about the following files:" README.md`

Usage:
```bash
./summarize [flags] <filename> [<filename> ...]
```

Flags:
- `-m`, `--model`: Specify an Ollama model.
- `-o`, `--output`: Define an output file to store the summary.
- `-p`, `--prompt`: Specify a custom prompt header for the summary. The default is `Write a short summary of a project that contains the following files:`
- `-w`, `--wrap`: Set the word wrap width. Use `-1` to detect the terminal width.
- `-v`, `--version`: Display the current version.
- `-V`, `--verbose`: Enable verbose logging.

Generate a summary with a custom prompt:
```bash
./summarize -w -1 -p "Summarize these files:" README.md CONFIG.md
```

Generate a summary, saving the output to a file:
```bash
./summarize -o output.txt README.md CONFIG.md
```

Generate a summary with custom word wrap width:
```bash
./summarize -w 100 README.md
```

### Testing
`go test` depends on a local Ollama server being up and running, and will attempt to download and use various models.
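
For example, with an Ollama server already running locally, the v2 tests can be run from a checkout of the repository; the `v2` directory and the `./...` pattern below are assumptions about the module layout rather than documented commands.

```bash
# Assumes Ollama is reachable at http://localhost:11434 (or via OLLAMA_HOST)
cd v2
go test ./...
```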
### General info
* Version: 2.7.1
* License: Apache 2