Ollama web UI
https://github.com/luuuppi/stellaris
- Host: GitHub
- URL: https://github.com/luuuppi/stellaris
- Owner: luuuppi
- License: MIT
- Created: 2024-10-30T10:00:50.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2025-02-19T18:37:32.000Z (3 months ago)
- Last Synced: 2025-02-19T19:35:35.594Z (3 months ago)
- Topics: ai, ai-chatbot, llm-ui, local-ai, local-llm, ollama, ollama-chatbot, ollama-ui, ui
- Language: TypeScript
- Homepage: https://stellarisp.vercel.app
- Size: 603 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
README
# Project Stellaris - web UI for Ollama
### ⭐ Main features
- Chatting with Ollama models via the web UI
- Storing all data locally in your browser
- Streaming responses in real time (see the sketch after this list)
- Markdown parsing with syntax highlighting
- Installing models via the web UI
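
The repo's source isn't reproduced here, but the first three features map onto Ollama's documented REST API. Below is a minimal sketch of real-time streaming, assuming Ollama's default local address (`http://localhost:11434`) and its `/api/chat` endpoint; the model name and the `onToken` callback are illustrative, not taken from this repo:

```ts
// Sketch: stream a chat reply from a local Ollama server.
// Endpoint and payload follow Ollama's documented /api/chat API;
// "llama3" and onToken are examples, not Stellaris internals.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function streamChat(
  messages: ChatMessage[],
  onToken: (token: string) => void,
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", messages, stream: true }),
  });
  if (!res.body) throw new Error("No response body");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Ollama streams newline-delimited JSON objects.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep a partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onToken(chunk.message.content);
      if (chunk.done) return;
    }
  }
}
```

Because each chunk carries a `message.content` fragment, a UI can append tokens to the chat as they arrive instead of waiting for the full reply.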
## 🧐 How to use?

1. Install [Ollama](https://ollama.com/download)
2. Go to the [web app](https://ollama-hub.vercel.app) (no registration required)
3. Install the model you want ([Full list of models](https://ollama.com/models)); the sketch after these steps shows the API call behind this step
4. Enjoy!
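
Installing a model from the browser presumably boils down to Ollama's documented `/api/pull` endpoint, which streams progress as JSON lines. A sketch under that assumption; the `onStatus` progress hook is invented for illustration:

```ts
// Sketch: pull a model through Ollama's /api/pull endpoint (assumption:
// this is what "installing a model via the web UI" amounts to).
async function pullModel(name: string, onStatus: (s: string) => void) {
  const res = await fetch("http://localhost:11434/api/pull", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (line.trim()) onStatus(JSON.parse(line).status); // e.g. "pulling manifest"
    }
  }
}
```

For example, `pullModel("llama3", console.log)` would log statuses such as `pulling manifest` and, once finished, `success`.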
## 🔧 Tech stack

- TypeScript
- React
- Zustand + Persist (see the store sketch after this list)
- TanStack Router
- Tailwind CSS
- RadixUI
- Framer Motion
- Vite
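
Given "Zustand + Persist" in the stack, the "storing all data locally" feature most likely uses Zustand's `persist` middleware, which writes the store to `localStorage`. A sketch under that assumption; the store shape and storage key are invented, not taken from the repo:

```ts
import { create } from "zustand";
import { persist } from "zustand/middleware";

// Hypothetical store shape; the real app's state is not shown here.
type Chat = {
  id: string;
  title: string;
  messages: { role: string; content: string }[];
};

type ChatStore = {
  chats: Chat[];
  addChat: (chat: Chat) => void;
};

// persist() serializes the store to localStorage under the given key,
// so chats survive page reloads with no server involved.
export const useChatStore = create<ChatStore>()(
  persist(
    (set) => ({
      chats: [],
      addChat: (chat) => set((state) => ({ chats: [...state.chats, chat] })),
    }),
    { name: "stellaris-chats" }, // hypothetical storage key
  ),
);
```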
## 🏃 Get started locally

1. Install [Ollama](https://ollama.com/download)
2. Clone the repo to your computer
3. `npm i`
4. `npm run build && npm run preview`