{"id":29658313,"url":"https://github.com/HardCodeDev777/UnityNeuroSpeech","last_synced_at":"2025-07-22T09:07:16.844Z","repository":{"id":303229077,"uuid":"1014305792","full_name":"HardCodeDev777/UnityNeuroSpeech","owner":"HardCodeDev777","description":"The world’s first game framework that lets you talk to AI in real time — locally.","archived":false,"fork":false,"pushed_at":"2025-07-18T18:25:29.000Z","size":726,"stargazers_count":3,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-07-18T21:49:29.791Z","etag":null,"topics":["ai","coqui-tts","csharp","framework","llm","ollama","speech-to-text","stt","text-to-speech","tts","unity","unity-framework","unityneurospeech","whisper"],"latest_commit_sha":null,"homepage":"https://hardcodedev777.github.io/UnityNeuroSpeech/","language":"C#","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/HardCodeDev777.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-07-05T13:03:23.000Z","updated_at":"2025-07-18T18:25:32.000Z","dependencies_parsed_at":"2025-07-06T13:45:36.173Z","dependency_job_id":"a91549c7-e1d7-4561-b25f-9d298b842f5d","html_url":"https://github.com/HardCodeDev777/UnityNeuroSpeech","commit_stats":null,"previous_names":["hardcodedev777/unityneurospeech"],"tags_count":1,"template":false,"template_full_name":null,"purl":"pkg:github/HardCodeDev777/UnityNeuroSpeech","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FUnityNeuroSpeech","tags_url":"https://
repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FUnityNeuroSpeech/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FUnityNeuroSpeech/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FUnityNeuroSpeech/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/HardCodeDev777","download_url":"https://codeload.github.com/HardCodeDev777/UnityNeuroSpeech/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HardCodeDev777%2FUnityNeuroSpeech/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":266463050,"owners_count":23932895,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-22T02:00:09.085Z","response_time":66,"last_error":null,"robots_txt_status":null,"robots_txt_updated_at":null,"robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","coqui-tts","csharp","framework","llm","ollama","speech-to-text","stt","text-to-speech","tts","unity","unity-framework","unityneurospeech","whisper"],"created_at":"2025-07-22T09:02:18.458Z","updated_at":"2025-07-22T09:07:16.835Z","avatar_url":"https://github.com/HardCodeDev777.png","language":"C#","funding_links":[],"categories":["Open Source Repositories","\u003cspan id=\"speech\"\u003eSpeech\u003c/span\u003e"],"sub_categories":["Framework","\u003cspan id=\"tool\"\u003eLLM (LLM \u0026 
Tool)\u003c/span\u003e"],"readme":"![Unity](https://img.shields.io/badge/Unity-unity?logo=Unity\u0026color=%23000000)\n![C#](https://img.shields.io/badge/C%23-%23512BD4?logo=.NET)\n![Ollama](https://img.shields.io/badge/Ollama-%23000000?logo=Ollama)\n![License](https://img.shields.io/github/license/HardCodeDev777/UnityNeuroSpeech?color=%2305991d)\n![Last commit](https://img.shields.io/github/last-commit/HardCodeDev777/UnityNeuroSpeech?color=%2305991d)\n![Tag](https://img.shields.io/github/v/tag/HardCodeDev777/UnityNeuroSpeech)\n![Top lang](https://img.shields.io/github/languages/top/HardCodeDev777/UnityNeuroSpeech)\n\n\u003cdiv align=\"center\"\u003e\n  \u003cimg src=\"docs/media/logo.png\"\u003e\n\u003c/div\u003e\n\n#\n\n\u003e **Make your Unity characters hear, think, and talk — using real voice AI. Locally. No cloud.**\n\n---\n\nUnityNeuroSpeech is a lightweight and open-source framework for creating **fully voice-interactive AI agents** inside Unity.  \nIt connects:\n\n- 🧠 **Whisper** (STT) – converts your speech into text  \n- 💬 **Ollama** (LLM) – generates smart responses  \n- 🗣️ **XTTS** (TTS) – speaks back with *custom voice + emotions*\n\nAll locally. All offline.  
\nNo subscriptions, no accounts, no OpenAI API keys.\n\n---\n\n## 🚀 What can you build with UnityNeuroSpeech?\n\n- 🎮 AI characters that understand your voice and reply in real time  \n- 🗿 NPCs with personality and memory  \n- 🧪 Experiments in AI conversation and narrative design  \n- 🕹️ Voice-driven gameplay mechanics  \n- 🤖 Interactive bots with humanlike voice responses\n\n---\n\n## ✨ Core Features\n\n| Feature | Description |\n|--------|--------------|\n| 🎙️ **Voice Input** | Uses [whisper.unity](https://github.com/Macoron/whisper.unity) for accurate speech-to-text |\n| 🧠 **AI Brain (LLM)** | Easily connect to any local model via [Ollama](https://ollama.com) |\n| 🗣️ **Custom TTS** | Supports any voice with [Coqui XTTS](https://github.com/coqui-ai/TTS) |\n| 😄 **Emotions** | Emotion tags (`\u003chappy\u003e`, `\u003csad\u003e`, etc.) parsed automatically from the LLM output |\n| 🎛️ **Agent API** | Subscribe to events like `BeforeTTS()` or access `AgentState` directly |\n| 🛠️ **Editor Tools** | Create, manage, and customize agents inside the Unity Editor |\n| 🧱 **No cloud** | All models and voices run locally on your machine |\n| 🌐 **Multilingual** | Works with **15+ languages**, including English, Russian, and Chinese 
|\n\n---\n\n## 🧪 Built with:\n\n- 🧠 [`Microsoft.Extensions.AI`](https://learn.microsoft.com/en-us/dotnet/ai/) (Ollama)\n- 🎤 [`whisper.unity`](https://github.com/Macoron/whisper.unity)\n- 🐍 [Python Flask server](server/) (for TTS)\n- 🧊 [Coqui XTTS model](https://github.com/coqui-ai/TTS)\n- 🤖 Unity 6\n\n---\n\n## 📚 Get Started\n\nSee [UnityNeuroSpeech official website](https://hardcodedev777.github.io/UnityNeuroSpeech/).\n\n---\n\n## 😎 Who made this?\n\nUnityNeuroSpeech was created by [HardCodeDev](https://github.com/HardCodeDev777) —  \nindie dev from Russia who just wanted to make AI talk in Unity.\n\n---\n\n## 🗒️ License\n\nUnityNeuroSpeech is licensed under the **MIT License**.\nFor other Licenses, see [Licenses](docs/other/licenses.md).\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FHardCodeDev777%2FUnityNeuroSpeech","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FHardCodeDev777%2FUnityNeuroSpeech","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FHardCodeDev777%2FUnityNeuroSpeech/lists"}