# godot-llama-cpp

Run large language models in [Godot](https://godotengine.org). Powered by [llama.cpp](https://github.com/ggerganov/llama.cpp).




![Godot v4.2](https://img.shields.io/badge/Godot-v4.2-%23478cbf?logo=godot-engine&logoColor=white)
![GitHub last commit](https://img.shields.io/github/last-commit/hazelnutcloud/godot-llama-cpp)
![GitHub License](https://img.shields.io/github/license/hazelnutcloud/godot-llama-cpp)

## Overview

This library aims to provide a high-level interface to run large language models in Godot, following Godot's node-based design principles.

```gdscript
@onready var llama_context = %LlamaContext

var messages = [
	{ "sender": "system", "text": "You are a pirate chatbot who always responds in pirate speak!" },
	{ "sender": "user", "text": "Who are you?" }
]
var prompt = ChatFormatter.apply("llama3", messages)
var completion_id = llama_context.request_completion(prompt)

# Print streamed responses until the completion reports it is done.
while true:
	var response = await llama_context.completion_generated
	print(response["text"])

	if response["done"]:
		break
```

## Features
- Platform and compute backend support:

  | Platform | CPU | Metal | Vulkan | CUDA |
  |----------|-----|-------|--------|------|
  | macOS    | ✅  | ✅    | ❌     | ❌   |
  | Linux    | ✅  | ❌    | ✅     | 🚧   |
  | Windows  | ✅  | ❌    | 🚧     | 🚧   |

- Asynchronous completion generation
- Supports any GGUF-format language model that llama.cpp supports
- GGUF files are Godot resources (see the sketch below)
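
Because GGUF files are treated as Godot resources, a model can be loaded with Godot's regular resource loading and handed to the `LlamaContext` node. The sketch below assumes the node exposes a `model` property and that a GGUF file lives at `res://models/model.gguf`; neither name is confirmed by this README, so check the node's inspector for the actual property.

```gdscript
# Sketch only: load a GGUF model resource and assign it to a LlamaContext node.
# The `model` property name and the resource path are illustrative assumptions.
@onready var llama_context = %LlamaContext

func _ready():
	var gguf_model = load("res://models/model.gguf")
	llama_context.model = gguf_model
```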

## Roadmap
- [ ] Chat completion support via a dedicated Jinja2 templating library written in Zig
- [ ] Grammar support
- [ ] Multimodal models support
- [ ] Embeddings
- [ ] Vector database using LibSQL

## Building & Installation

1. Download Zig v0.13.0 from https://ziglang.org/download/
2. Clone the repository:
```bash
git clone --recurse-submodules https://github.com/hazelnutcloud/godot-llama-cpp.git
```
3. Copy the `godot-llama-cpp` addon folder in `godot/addons` to your Godot project's `addons` folder.
```bash
cp -r godot-llama-cpp/godot/addons/godot-llama-cpp /path/to/your/project/addons/
```
4. Build the extension and install it in your Godot project:
```bash
cd godot-llama-cpp
zig build --prefix /path/to/your/project/addons/godot-llama-cpp
```
5. Enable the plugin in your Godot project settings.
6. Add the `LlamaContext` node to your scene.
7. Run your Godot project.
8. Enjoy!
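
With the plugin enabled and a `LlamaContext` node in your scene, the Overview snippet can be dropped into an ordinary node script. The sketch below is a minimal end-to-end example that reuses only the names shown above (`%LlamaContext`, `ChatFormatter.apply`, `request_completion`, `completion_generated`); the scene layout and node path are assumptions.

```gdscript
extends Node

# Assumes a LlamaContext child node with "Access as Unique Name" enabled.
@onready var llama_context = %LlamaContext

func _ready():
	var messages = [
		{ "sender": "system", "text": "You are a helpful assistant." },
		{ "sender": "user", "text": "Greet the player in one sentence." }
	]
	var prompt = ChatFormatter.apply("llama3", messages)
	llama_context.request_completion(prompt)

	# Print streamed responses until the completion reports it is done.
	while true:
		var response = await llama_context.completion_generated
		print(response["text"])
		if response["done"]:
			break
```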

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE.md) file for details.