https://github.com/badboysm890/ClaraVerse
Clara – A privacy-first, client-side AI assistant WebUI for Ollama. No backend, no data leaks. Keep your conversations completely yours. If you like it, don't forget to give it a star 🌟
- Host: GitHub
- URL: https://github.com/badboysm890/ClaraVerse
- Owner: badboysm890
- License: gpl-3.0
- Created: 2025-03-10T17:37:36.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2025-03-27T03:46:43.000Z (8 months ago)
- Last Synced: 2025-03-27T04:28:59.752Z (8 months ago)
- Topics: llm, ollama, opensourcellms, webui
- Language: TypeScript
- Homepage: https://claraverse.netlify.app
- Size: 10.8 MB
- Stars: 268
- Watchers: 5
- Forks: 17
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- StarryDivineSky - badboysm890/ClaraVerse
- awesome - badboysm890/ClaraVerse - ClaraVerse is a privacy-first, fully local AI workspace featuring a Local LLM chat powered by LLama.cpp, along with support for any provider, tool calling, agent building, Stable Diffusion, and n8n-style automation. It requires no backend or API keys—just your stack and machine. (TypeScript)
- definitive-opensource - ClaraVerse - Privacy-first, fully local AI workspace featuring a Local LLM chat powered by LLama.cpp, along with support for any provider, tool calling, agent building, Stable Diffusion, and n8n-style automation. It requires no backend or API keys—just your stack and machine. | `Cross` `SelfHost` | **3.6k** | (Table of Contents / All In One)
README
# Clara
Privacy-First AI Assistant & Agent Builder
Chat with AI, create intelligent agents, and turn them into fully functional apps—powered entirely by open-source models running on your own device.
[Try Clara Online](https://clara-ollama.netlify.app/) | Download Desktop App
## 🔒 Privacy First
- **Local Execution**: Clara connects directly to Ollama and uses open-source language and image generation models—**all running on your device**.
- **No Third-Party Clouds**: Your data never leaves your machine. Zero telemetry, zero tracking.
- **Open-Source Technology**: Built to leverage community-driven innovations, giving you full control over your AI stack.
## ✨ Key Features
### AI Assistant
Chat with any Ollama-compatible model, including multimodal models that understand images:
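If you want image-aware chat, pull a vision-capable model through Ollama first. The model name below is only an example, not a Clara requirement:

```bash
# Pull an example multimodal (vision) model for image-aware chat.
# Any Ollama-compatible vision model can be used instead.
ollama pull llava
```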
### 🎨 Image Generation
Create amazing images from text prompts using Stable Diffusion models with ComfyUI integration:
### 🏗️ Intelligent Agent Builder
Design custom AI agents with a node-based editor, then convert them into standalone apps without leaving Clara:
### 🖼️ Image Gallery
Browse, search, and manage all generated images in one convenient gallery:
## 🚀 Installation Options
### 1. Docker (Recommended for Windows & Linux)
```bash
# Pull the image
docker pull claraverse/clara-ollama:latest
# Run with auto-restart
docker run -d --restart always -p 8069:8069 claraverse/clara-ollama:latest
```
Then visit http://localhost:8069 in your browser.
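To update later, one approach is to pull the newer image and recreate the container. The container name below is an assumption for illustration; adjust it to whatever name Docker assigned or you chose:

```bash
# Pull the latest image, then replace the running container.
docker pull claraverse/clara-ollama:latest
docker stop clara && docker rm clara
docker run -d --name clara --restart always -p 8069:8069 claraverse/clara-ollama:latest
```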
### 2. Native Desktop Apps
#### macOS (Signed)
- [Download .dmg installer](https://github.com/badboysm890/ClaraVerse/releases/tag/v1.0.4)
- Universal binary (works on both Intel and Apple Silicon)
- Fully signed and notarized for enhanced security
#### Linux (Signed)
- [Download .AppImage](https://github.com/badboysm890/ClaraVerse/releases/tag/v0.2.0)
- Runs on most Linux distributions
- No installation required
#### Windows
- We recommend using the Docker version for best performance and security
- If you need the native app: [Download .exe installer](https://github.com/badboysm890/ClaraVerse/releases/tag/v1.0.4)
- **I don't have the money to get it signed 😢**
### 3. Web Version
- [Try Clara Online](https://clara-ollama.netlify.app/)
- Requires local Ollama installation
### Prerequisites
1. **Install Ollama** (Required for all versions except Docker)
Download from [Ollama's website](https://ollama.ai/download)
2. **Connect**
Default Ollama endpoint: `http://localhost:11434`
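A quick way to confirm Ollama is reachable before opening Clara is to query its local API (this uses Ollama's standard model-listing route):

```bash
# Should return a JSON list of locally installed models if Ollama is running.
curl http://localhost:11434/api/tags
```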
## 📱 Download Desktop App
For faster performance and offline convenience, download the native desktop version:
- [Windows Installer (.exe)](https://github.com/badboysm890/ClaraVerse/releases/tag/v1.0.4)
- [macOS Installer (.dmg)](https://github.com/badboysm890/ClaraVerse/releases/tag/v1.0.4)
- [Linux AppImage (.AppImage)](https://github.com/badboysm890/ClaraVerse/releases/tag/v0.2.0)
## Mac Distribution Note
### For Mac Users Installing This App
If you see a message that the app is damaged or can't be opened:
1. Right-click (or Control+click) on the app in Finder
2. Select "Open" from the context menu
3. Click "Open" on the security dialog
4. If still blocked, go to System Preferences > Security & Privacy > General and click "Open Anyway"
This happens because the app is not notarized with Apple. This is perfectly safe, but macOS requires this extra step for unsigned applications.
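If you prefer the terminal, clearing the quarantine attribute achieves the same thing. The app path below is an assumption; use wherever you placed the app:

```bash
# Remove macOS's quarantine flag from the app bundle (path is illustrative).
xattr -cr /Applications/Clara.app
```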
### For Developers
Building for macOS:
- **Development build** (no notarization): `npm run electron:build-mac-dev`
- **Production build** (with notarization, requires Apple Developer Program):
1. Set environment variables `APPLE_ID`, `APPLE_ID_PASSWORD` (app-specific password), and `APPLE_TEAM_ID`
2. Run `npm run electron:build-mac`
To get an Apple Team ID, join the [Apple Developer Program](https://developer.apple.com/programs/).
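Putting the production steps together, a notarized build could be invoked roughly like this (credential values are placeholders):

```bash
# Placeholder credentials; substitute your own Apple Developer values.
export APPLE_ID="you@example.com"
export APPLE_ID_PASSWORD="your-app-specific-password"
export APPLE_TEAM_ID="ABCDE12345"
npm run electron:build-mac
```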
## 👩‍💻 Dev Zone
### Development Setup
```bash
# Clone the repository
git clone https://github.com/badboysm890/ClaraVerse.git
cd ClaraVerse
# Install dependencies
npm install
# Start development server (web)
npm run dev
# Start development server (desktop)
npm run electron:dev
```
### Remote Ollama Connection
If Ollama runs on another machine:
1. Enable CORS in Ollama (`~/.ollama/config.json`):
```json
{
"origins": ["*"]
}
```
2. In Clara settings, specify: `http://{IP_ADDRESS}:11434`
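Depending on your Ollama version, CORS and the listen address may instead be configured through environment variables. If the config-file approach doesn't take effect, a minimal sketch of the env-var route (restart Ollama afterwards):

```bash
# Allow cross-origin requests and listen on all interfaces (env-var approach).
export OLLAMA_ORIGINS="*"
export OLLAMA_HOST="0.0.0.0:11434"
ollama serve
```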
### Building for Production
```bash
# Build web version
npm run build
# Build desktop app
npm run electron:build
```
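To sanity-check the web build locally, you can serve the output directory with any static file server. The `dist/` path assumes the default Vite output; adjust if this project configures a different directory:

```bash
# Serve the production web build for a quick local check.
npx serve dist
```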
## 🤝 Support & Contact
Have questions or need help? Reach out via **praveensm890@gmail.com**.