https://github.com/niiv3au/ollamator
Chat offline and locally with many AI models. OllamaTor is a Python/Eel-based front-end for the Ollama API.
- Host: GitHub
- URL: https://github.com/niiv3au/ollamator
- Owner: NiiV3AU
- Created: 2025-02-26T12:31:42.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-03-11T20:41:43.000Z (2 months ago)
- Last Synced: 2025-03-11T21:22:11.200Z (2 months ago)
- Topics: eel, ollama, ollama-gui, ollama-python, python
- Language: HTML
- Homepage: https://niiv3au.github.io/GetOllamaTor
- Size: 3.34 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Welcome to OllamaTor :wave:
OllamaTor is a user-friendly desktop application that brings the power of Ollama's local Large Language Models (LLMs) to your fingertips.
Chat with AI, adjust settings, and monitor system usage.
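Under the hood, a front-end like this talks to Ollama's local HTTP API. As a rough sketch (this is not OllamaTor's actual code; the endpoint is Ollama's default and the model name is just an example), a single non-streaming chat turn using the `requests` library might look like:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_payload(model, messages, temperature=0.7):
    """Build a non-streaming chat request body for the Ollama HTTP API."""
    return {
        "model": model,
        "messages": messages,
        "stream": False,
        "options": {"temperature": temperature},
    }


def chat(model, prompt, temperature=0.7):
    """Send a single-turn prompt and return the assistant's reply text."""
    payload = build_payload(
        model, [{"role": "user", "content": prompt}], temperature
    )
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    # With stream=False, Ollama returns one JSON object whose reply text
    # lives under message.content.
    return resp.json()["message"]["content"]


# Example (requires a running Ollama server with the model pulled):
#   print(chat("llama3", "Why is the sky blue?"))
```

The `temperature` option corresponds to the setting exposed in the app's gear menu.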
## Features
* **Easy Chat:** Select a model and start chatting immediately.
* **Model Management:** Dropdown selection to choose from your local LLMs collection.
* **Customizable:** Adjust temperature and chat history length.
* **Resource Monitoring:** Real-time CPU, RAM, and GPU (load + memory) usage monitor, plus an AI performance readout in tokens per minute ("TPM").

**TO-DO:**
* ~~Stop Button to stop generating a response~~ -> Available in `v0.0.3`
* fixing the math rendering (KaTeX)
* Customizable Copilots (initial prompt that gives the AI a better understanding of what it's working on)

## Installation
|[Download OllamaTor.exe](https://github.com/NiiV3AU/OllamaTor/releases/latest)|
|-|

## Getting Started
1. **Select a Model:** Choose a model from the dropdown.
2. **Chat:** Type your prompt & click "Send".
3. **Settings:** Use the gear icon to adjust temperature and chat history length.
4. **Help:** Use the help icon to start the step-by-step tour, download Ollama, and get instructions for properly installing models.

## Requirements
* Windows 10/11 (other OS not tested)
* Chrome, Edge or Electron
* Ollama
* Downloaded Ollama models
* Python (for source code execution only; not needed for the compiled `.exe`):
* `eel`
* `requests`
* `psutil`
* `nvidia-ml-py`

## Contributing
Report bugs or suggest features via GitHub Issues. Pull requests welcome!
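The resource monitor described under Features can be approximated with the dependencies listed above. A minimal sketch (illustrative only, not OllamaTor's actual implementation; the GPU portion degrades gracefully when no NVIDIA GPU or `nvidia-ml-py` is present):

```python
import psutil  # cross-platform CPU/RAM stats

try:
    import pynvml  # provided by the nvidia-ml-py package; optional
    pynvml.nvmlInit()
    _GPU = pynvml.nvmlDeviceGetHandleByIndex(0)
except Exception:  # no NVIDIA GPU, no driver, or package missing
    _GPU = None


def sample_usage():
    """Return a snapshot of CPU, RAM, and (if available) GPU usage in percent."""
    stats = {
        "cpu_percent": psutil.cpu_percent(interval=0.1),
        "ram_percent": psutil.virtual_memory().percent,
    }
    if _GPU is not None:
        util = pynvml.nvmlDeviceGetUtilizationRates(_GPU)
        mem = pynvml.nvmlDeviceGetMemoryInfo(_GPU)
        stats["gpu_percent"] = util.gpu
        stats["gpu_mem_percent"] = 100.0 * mem.used / mem.total
    return stats
```

Polling a function like this on a short timer is one straightforward way to drive a real-time usage display.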