Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/byebyebruce/ollamax
Ollama library wrapper
ai gguf llama2 llamacpp llm local-llm ollama qwen
- Host: GitHub
- URL: https://github.com/byebyebruce/ollamax
- Owner: byebyebruce
- Created: 2024-03-29T04:44:55.000Z (8 months ago)
- Default Branch: master
- Last Pushed: 2024-04-28T08:41:30.000Z (7 months ago)
- Last Synced: 2024-09-30T22:33:35.554Z (about 2 months ago)
- Topics: ai, gguf, llama2, llamacpp, llm, local-llm, ollama, qwen
- Language: Go
- Homepage:
- Size: 117 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Ollamax
Ollamax is a simple, easy-to-use library for building local LLM apps. It is based on [Ollama](https://github.com/ollama/ollama).

## Demos
- [Local LLM Chat Demo](demo/chat)
- [Local LLM Embedding](demo/embedding)
- [Local LLM Vision](demo/vision)
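The demos live in this repository and appear to be ordinary Go main packages; a hedged sketch of running one from a clone, after completing the build steps below:

```bash
go run ./demo/chat
```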
## How to use

Requires Go >= 1.22.

1. Init your Go module
```bash
go mod init example.com/myapp  # example.com/myapp is a placeholder; use your own module path
```
2. Add submodule
```bash
git submodule add https://github.com/byebyebruce/ollamax.git
```
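The `go work` paths in step 3 reference `./ollamax/ollama`, which suggests ollamax nests ollama inside it. If that directory is itself a git submodule, fetch it too (a hedged sketch; skip it if the directory is already populated):

```bash
# Fetch nested submodules such as ollamax/ollama, if any are defined
git submodule update --init --recursive
```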
3. Init go work
```bash
go work init . ./ollamax ./ollamax/ollama
```
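Step 3 should generate a `go.work` file in the project root that looks roughly like this (a sketch, assuming Go 1.22 per the requirement above):

```
go 1.22

use (
	.
	./ollamax
	./ollamax/ollama
)
```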
4. Build library
```bash
make -C ollamax
```
5. Write a test program
```go
package main

import (
	"log"

	"github.com/byebyebruce/ollamax"
)

func main() {
	// Initialize the ollamax runtime before loading any model.
	if err := ollamax.Init(); err != nil {
		log.Fatalln(err)
	}
	defer ollamax.Cleanup()
	// Load the model, downloading it first if it is not cached locally.
	llm, err := ollamax.NewWithAutoDownload("qwen:0.5b")
	if err != nil {
		panic(err)
	}
	defer llm.Close()

	// llm.Chat(...) goes here; see demo/chat for a full example.
}
```
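With the workspace in place, the test program can be run from the module root. A minimal sketch, assuming the code above is saved as `main.go` and step 4 has built the native libraries ollamax links against:

```bash
go run .
```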
## Where to find models

https://ollama.com/library