https://github.com/smalls1652/localllm-chat
A stupid way to run Open WebUI locally. (Mirror)
- Host: GitHub
- URL: https://github.com/smalls1652/localllm-chat
- Owner: Smalls1652
- License: mit
- Created: 2025-06-24T17:19:20.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-06-24T22:11:12.000Z (3 months ago)
- Last Synced: 2025-06-24T22:31:28.459Z (3 months ago)
- Topics: llm, rust, rust-lang, tauri
- Language: Rust
- Homepage: https://git.smalls.online/smalls/localllm-chat
- Size: 2.38 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# LocalLLM Chat
A stupid way to run Open WebUI locally without having to manage the container resources manually.
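For context, the manual alternative this app wraps would be running the Open WebUI container yourself. A sketch of that, assuming the standard `ghcr.io/open-webui/open-webui` image; the host port and volume name are illustrative:

```shell
# Pull and run Open WebUI as a detached container,
# persisting its data in a named volume.
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main

# You then have to remember to stop/remove it yourself:
#   docker stop open-webui && docker rm open-webui
```

LocalLLM Chat's point is to start and tear down these container resources for you instead of leaving that bookkeeping to the user.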

> [!CAUTION]
> This is primarily for my own personal usage. Your mileage may vary when building or using this. **All development and testing has been done on macOS.**

> [!WARNING]
> Also, this was sorta/kinda thrown together quickly, so, fair warning, you'll see some janky code. 😬

## 🤝 License
The source code for this project is licensed with the [MIT License](LICENSE).