https://github.com/hardcodedev777/unityneurospeech
The world’s first game framework that lets you talk to AI in real time — locally.
- Host: GitHub
- URL: https://github.com/hardcodedev777/unityneurospeech
- Owner: HardCodeDev777
- License: MIT
- Created: 2025-07-05T13:03:23.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2025-07-06T12:57:12.000Z (7 months ago)
- Last Synced: 2025-07-06T13:45:31.904Z (7 months ago)
- Topics: ai, coqui-tts, csharp, framework, llm, ollama, speech-to-text, stt, text-to-speech, tts, unity, unity-framework, unityneurospeech, whisper
- Language: C#
- Homepage: https://hardcodedev777.github.io/UnityNeuroSpeech/
- Size: 702 KB
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
README

# UnityNeuroSpeech
> **Make your Unity characters hear, think, and talk — using real voice AI. Locally. No cloud.**
---
UnityNeuroSpeech is an open-source framework for creating **fully voice-interactive AI agents** inside Unity.
It connects:
- 🧠 **Whisper** (STT) – converts your speech into text
- 💬 **Ollama** (LLM) – generates smart responses
- 🗣️ **XTTS** (TTS) – speaks back with *custom voice + emotions*
All locally. All offline.
No subscriptions, no accounts, no OpenAI API keys.
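The three-stage round-trip above can be sketched as a plain C# orchestration. Note that the interface and class names below (`ISpeechToText`, `IChatModel`, `VoiceAgentLoop`, etc.) are illustrative assumptions, not UnityNeuroSpeech's actual API; see the official documentation for the real types.

```csharp
using System.Threading.Tasks;

// Hypothetical seams for the three local services.
public interface ISpeechToText { Task<string> TranscribeAsync(byte[] micAudio); }
public interface IChatModel    { Task<string> ChatAsync(string userText); }
public interface ITextToSpeech { Task SpeakAsync(string reply); }

public class VoiceAgentLoop
{
    private readonly ISpeechToText _stt;
    private readonly IChatModel _llm;
    private readonly ITextToSpeech _tts;

    public VoiceAgentLoop(ISpeechToText stt, IChatModel llm, ITextToSpeech tts)
        => (_stt, _llm, _tts) = (stt, llm, tts);

    // One full interaction: mic audio in, spoken reply out.
    public async Task<string> HandleUtteranceAsync(byte[] micAudio)
    {
        string userText = await _stt.TranscribeAsync(micAudio); // Whisper (STT)
        string reply    = await _llm.ChatAsync(userText);       // Ollama  (LLM)
        await _tts.SpeakAsync(reply);                           // XTTS    (TTS)
        return reply;
    }
}
```

Keeping each stage behind an interface like this also makes the loop easy to unit-test with stub implementations, since no model needs to be loaded.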
---
## 🚀 What can you build with UnityNeuroSpeech?
- 🎮 AI characters that understand your voice and reply in real time
- 🗿 NPCs with personality and memory
- 🧪 Experiments in AI conversation and narrative design
- 🕹️ Voice-driven gameplay mechanics
- 🤖 Interactive bots with humanlike voice responses
---
## ✨ Core Features
| Feature | Description |
|--------|--------------------------------------------------------------------------------------------|
| 🎙️ **Voice Input** | Uses [whisper.unity](https://github.com/Macoron/whisper.unity) for accurate speech-to-text |
| 🧠 **AI Brain (LLM)** | Easily connect to any local model via [Ollama](https://ollama.com) |
| 🗣️ **Custom TTS** | Supports any voice with [Coqui XTTS](https://github.com/idiap/coqui-ai-TTS) |
| 😄 **Emotions** | Emotion tags are parsed automatically from the LLM's reply |
| 🎬 **Actions** | Action tags are likewise parsed automatically from the LLM's reply |
| 🎛️ **Agent API** | Subscribe to events like `BeforeTTS()` to monitor your agents |
| 📝 **JSON History Saving** | Save the player–LLM dialog history as JSON, with optional AES encryption |
| 🛠️ **Editor Tools** | Create, manage, and customize agents directly in the Unity Editor |
| 🧱 **No Cloud** | All models and voice run locally on your machine |
| 🌐 **Multilingual** | Works with **15+ languages**, including English, Russian, and Chinese |
| 🔊 **Per-Agent Voices** | Each agent can have its own voice file in any supported language |
| ⚡ **High Performance** | Uses **UniTask** instead of coroutines and `Task`s for better performance |
| 🔧 **Full Build Support** | Fully compatible with both **Mono** and **IL2CPP** scripting backends |
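Tag parsing of the kind the table describes can be sketched with a regular expression. The angle-bracket tag format used here (e.g. `<happy>`) is an assumption for illustration only; UnityNeuroSpeech's actual tag syntax is defined in its documentation.

```csharp
using System.Text.RegularExpressions;

public static class TagParser
{
    // Strips a leading "<word>"-style tag from LLM output before it reaches TTS.
    // Returns (tag, cleanText); tag is null when no leading tag is present.
    // NOTE: the "<happy>" tag format is hypothetical, not the framework's spec.
    public static (string Tag, string Text) ExtractLeadingTag(string llmOutput)
    {
        var m = Regex.Match(llmOutput, @"^\s*<(\w+)>\s*");
        return m.Success
            ? (m.Groups[1].Value, llmOutput.Substring(m.Length))
            : (null, llmOutput);
    }
}
```

The tag drives the agent's animation or voice style, while the cleaned text is what actually gets synthesized.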
---
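The encrypted history feature can be approximated with standard .NET APIs. This is a minimal sketch using `System.Text.Json` and `Aes`; the `DialogTurn` record shape and the IV-prepended blob layout are assumptions, not UnityNeuroSpeech's on-disk format.

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text.Json;

// Hypothetical record shape for one dialog turn.
public record DialogTurn(string Role, string Text);

public static class DialogHistory
{
    // Serializes the turns to JSON, then AES-encrypts them.
    // The random IV is prepended to the ciphertext so Decrypt can recover it.
    public static byte[] Encrypt(List<DialogTurn> turns, byte[] key)
    {
        using var aes = Aes.Create();
        aes.Key = key;
        aes.GenerateIV();
        byte[] plain = JsonSerializer.SerializeToUtf8Bytes(turns);
        using var enc = aes.CreateEncryptor();
        byte[] cipher = enc.TransformFinalBlock(plain, 0, plain.Length);
        byte[] output = new byte[aes.IV.Length + cipher.Length];
        Buffer.BlockCopy(aes.IV, 0, output, 0, aes.IV.Length);
        Buffer.BlockCopy(cipher, 0, output, aes.IV.Length, cipher.Length);
        return output;
    }

    public static List<DialogTurn> Decrypt(byte[] blob, byte[] key)
    {
        using var aes = Aes.Create();
        aes.Key = key;
        byte[] iv = new byte[16];            // AES block size
        Buffer.BlockCopy(blob, 0, iv, 0, 16);
        aes.IV = iv;
        using var dec = aes.CreateDecryptor();
        byte[] plain = dec.TransformFinalBlock(blob, 16, blob.Length - 16);
        return JsonSerializer.Deserialize<List<DialogTurn>>(plain);
    }
}
```

Writing the resulting blob to disk (e.g. under `Application.persistentDataPath` in Unity) is then a plain `File.WriteAllBytes` call; skipping the encryption step yields the unencrypted-JSON variant.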
## 🧪 Built with
- 🧠 [OllamaSharp](https://github.com/awaescher/OllamaSharp)
- 🎤 [whisper.unity](https://github.com/Macoron/whisper.unity)
- ⚡ [UniTask](https://github.com/Cysharp/UniTask)
- 🧊 [Coqui XTTS](https://github.com/idiap/coqui-ai-TTS)
- 🖥️ [UV](https://github.com/astral-sh/uv)
- 🤖 Unity 6
---
## ⚙️ Compatibility
| Scripting backend | Windows | Other platforms |
|-------------------|---------|--------------------|
| Mono | ✅ | ❌ (not planned) |
| IL2CPP | ✅ | ❌ (not planned) |
---
## 📚 Getting Started
See [UnityNeuroSpeech official documentation](https://hardcodedev777.github.io/UnityNeuroSpeech/).
---
## 😎 Who made this?
UnityNeuroSpeech was created by [HardCodeDev](https://github.com/HardCodeDev777), a solo developer from Russia.
---
## 🗒️ License
UnityNeuroSpeech is licensed under the **MIT License**.
For third-party licenses, see [LICENSES.md](LICENSES.md).