https://github.com/inventwithdean/aurenia
Aurenia is a blazing fast native desktop study assistant
- Host: GitHub
- URL: https://github.com/inventwithdean/aurenia
- Owner: inventwithdean
- License: apache-2.0
- Created: 2025-08-04T17:39:36.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2025-08-05T15:51:39.000Z (2 months ago)
- Last Synced: 2025-08-05T16:31:01.167Z (2 months ago)
- Topics: desktop-app, lancedb, llamacpp, paddleocr, rag, rust, tauri
- Language: JavaScript
- Homepage:
- Size: 32.1 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Aurenia
Offline, multilingual, beautifully yours.
---
> Aurenia is a blazing fast native desktop study assistant that supports the top 25 global languages natively, with features like OCR and RAG, and runs completely offline.
## The Vision
In a world where AI is becoming synonymous with the cloud, Aurenia challenges the trade-off between intelligence and privacy. It brings a powerful AI tutor to your desktop that runs entirely on your machine, ensuring your data is always secure.
## Key Features
* **🧠 Intelligent RAG Chat:** Ask complex questions about your documents. Aurenia's custom, multi-step RAG pipeline finds the precise information you need (a minimal retrieval sketch follows this feature list).
* **🌐 Truly Multilingual:** Have conversations, get translations, and generate summaries in your native tongue, with support for the top 25 global languages.
* **📝 Interactive Study Tools:** Go beyond passive reading. Instantly generate interactive multiple-choice quizzes from any page to test your understanding.
* **🔒 100% Local & Private:** All AI processing happens on your device. Your documents and chats never leave your computer. No internet connection required.
* **👀 OCR for All Documents:** A built-in OCR engine automatically makes scanned documents and images fully searchable and interactive.
* **✅ No External Dependencies:** Everything comes packaged in the app. Just install, add the models, and you're ready to go.
* **💻 Minimal System Requirements:** Runs efficiently on as little as 8GB of RAM without a dedicated graphics card, and utilizes CUDA when available.
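The README does not spell out the RAG pipeline's internals, so what follows is only a minimal, dependency-free sketch of the retrieval step at the heart of any such pipeline: rank stored chunk embeddings against a query embedding by cosine similarity and keep the top matches. The `Chunk` type, the 4-dimensional vectors, and `top_k` are illustrative placeholders, not Aurenia's actual data model (which uses multilingual-e5-large embeddings stored in LanceDB).

```rust
// Illustrative only: a dependency-free sketch of the retrieval step in a RAG
// pipeline. The types and data are hypothetical stand-ins, not Aurenia's code.

#[derive(Debug)]
struct Chunk {
    text: String,
    embedding: Vec<f32>, // produced by the embedding model in the real app
}

/// Cosine similarity between two equal-length vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Return the `k` chunks most similar to the query embedding.
fn top_k<'a>(query: &[f32], chunks: &'a [Chunk], k: usize) -> Vec<&'a Chunk> {
    let mut scored: Vec<(f32, &'a Chunk)> = chunks
        .iter()
        .map(|c| (cosine(query, &c.embedding), c))
        .collect();
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap_or(std::cmp::Ordering::Equal));
    scored.into_iter().take(k).map(|(_, c)| c).collect()
}

fn main() {
    let chunks = vec![
        Chunk { text: "Photosynthesis converts light into chemical energy.".into(), embedding: vec![0.9, 0.1, 0.0, 0.0] },
        Chunk { text: "Mitochondria produce most of a cell's ATP.".into(), embedding: vec![0.1, 0.8, 0.1, 0.0] },
    ];
    let query_embedding = vec![0.85, 0.15, 0.0, 0.0]; // stand-in for an embedded question
    for chunk in top_k(&query_embedding, &chunks, 1) {
        println!("retrieved: {}", chunk.text);
    }
}
```

In the full pipeline, the retrieved chunks would then be passed to the LLM as context for answering the question; this sketch covers only the ranking step.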
## Tech Stack

Aurenia is built with a modern, performance-focused stack:
| Component | Technology |
| :--- | :--- |
| **Application Framework** | [`Tauri`](https://v2.tauri.app/) (Rust + JS) |
| **AI Inference Engine** | [`llama.cpp`](https://github.com/ggml-org/llama.cpp) |
| **LLM Model** | Google's [`Gemma 3n`](https://huggingface.co/google/gemma-3n-E4B-it/tree/main) |
| **Embedding Model** | [`multilingual-e5-large`](https://huggingface.co/intfloat/multilingual-e5-large) |
| **Vector Database** | [`LanceDB`](https://github.com/lancedb/lancedb) |
| **OCR** | PaddleOCR (via [`paddle-ocr-rs`](https://github.com/mg-chao/paddle-ocr-rs)) |
| **PDF Handling** | [`pdf.js`](https://github.com/mozilla/pdf.js) |
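Tauri bridges the JavaScript frontend and the Rust backend through commands. Aurenia's actual command names aren't published in this README, so the snippet below is only a generic sketch of how a backend function (a hypothetical `ask_document`) could be exposed with Tauri's standard `#[tauri::command]` and `invoke_handler` mechanism.

```rust
// Generic Tauri command sketch. `ask_document` is a hypothetical name, not
// taken from Aurenia's source; its body is a placeholder for whatever the
// real RAG pipeline would return.

#[tauri::command]
fn ask_document(question: String) -> Result<String, String> {
    // In the real app this would embed the question, query the vector store,
    // and prompt the LLM; here we just echo a placeholder answer.
    Ok(format!("(placeholder answer to: {question})"))
}

fn main() {
    tauri::Builder::default()
        .invoke_handler(tauri::generate_handler![ask_document])
        .run(tauri::generate_context!())
        .expect("error while running the Tauri application");
}
```

From the frontend, a command like this would be called with `invoke('ask_document', { question })` from `@tauri-apps/api/core` in Tauri v2.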
## Installation & Usage

Getting started with Aurenia is easy:
1. **Download the Installer:** Go to the [**Latest Release**](https://github.com/inventwithdean/aurenia/releases/latest) page and download the `.msi` file for Windows.
2. **Download the GGUF Models (quantized versions recommended):** You need two model files to run Aurenia.
* Download the LLM: [`Gemma 3n`](https://huggingface.co/unsloth/gemma-3n-E4B-it-GGUF)
* Download the Embedding Model: [`multilingual-e5-large`](https://huggingface.co/phate334/multilingual-e5-large-gguf)
3. **Place Models in Directory:** After installing Aurenia, place the two `.gguf` files you downloaded into the application's installation directory.
4. **Rename the Models:** Rename the Gemma 3n GGUF to `model.gguf` and the multilingual-e5-large GGUF to `emb_model.gguf` (a hypothetical sketch of how the app might locate them follows these steps).
5. **Launch Aurenia:** That's it! You can now open any PDF and start studying.
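The filenames in step 4 matter because the app loads the models by those exact names. How Aurenia resolves them internally isn't documented here, so the following is only a hypothetical sketch of a startup check that looks for `model.gguf` and `emb_model.gguf` next to the executable and reports anything missing.

```rust
// Hypothetical startup check, not Aurenia's actual code: verify that the two
// renamed GGUF files from the steps above sit next to the executable.

use std::path::PathBuf;

fn expected_model_paths() -> std::io::Result<Vec<PathBuf>> {
    let exe_dir = std::env::current_exe()?
        .parent()
        .expect("executable should live inside a directory")
        .to_path_buf();
    Ok(vec![exe_dir.join("model.gguf"), exe_dir.join("emb_model.gguf")])
}

fn main() -> std::io::Result<()> {
    for path in expected_model_paths()? {
        if path.exists() {
            println!("found {}", path.display());
        } else {
            eprintln!("missing {} - place and rename the model as described above", path.display());
        }
    }
    Ok(())
}
```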
## Find Out More

* **▶️ Watch the Full Video Demo:** [https://youtu.be/n_-dwJi9wO8](https://youtu.be/n_-dwJi9wO8)
* **📄 Read the Technical Blog Post:** [https://inventwithdean.github.io/blog/aurenia/](https://inventwithdean.github.io/blog/aurenia/)

## License
This project is licensed under the Apache License 2.0 - see the `LICENSE` file for details.
## Acknowledgements
This project would not be possible without the incredible open-source communities behind `llama.cpp`, `Tauri`, `LanceDB`, `pdfjs`, `PaddleOCR`, `intfloat/multilingual-e5-large` and the researchers at Google who developed and open-sourced the Gemma-3n models.