# OpenLocalUI

![coverage](./app/coverage_badge.svg?sanitize=true)
![build](https://github.com/WilliamKarolDiCioccio/open_local_ui/actions/workflows/flutter-build.yml/badge.svg)

![](https://img.shields.io/badge/Dart-0175C2?style=for-the-badge&logo=dart&logoColor=white)
![](https://img.shields.io/badge/Flutter-02569B?style=for-the-badge&logo=flutter&logoColor=white)
![](https://img.shields.io/badge/Python-3776AB?style=for-the-badge&logo=python&logoColor=white)

![](https://img.shields.io/badge/Windows-0078D6?style=for-the-badge&logo=windows&logoColor=white)
![](https://img.shields.io/badge/mac%20os-000000?style=for-the-badge&logo=apple&logoColor=white)
![](https://img.shields.io/badge/Linux-FCC624?style=for-the-badge&logo=linux&logoColor=black)



[**See more screenshots**](https://github.com/WilliamKarolDiCioccio/open_local_ui/blob/main/.github/images/IMAGES.md)

## Table of Contents

1. [What is OpenLocalUI](#-what-is-openlocalui)
2. [Features](#-features)
3. [Roadmap](#%EF%B8%8F-roadmap)
4. [Installation](#-installation)
5. [Contributing](#-contributing)
6. [License](#-license)
7. [Support](#-support)
8. [Contact](#-contact)
9. [Related Projects](#-related-projects)

## 🚀 What is OpenLocalUI

OpenLocalUI is a Flutter-based desktop application for Windows, macOS, and Linux. It aims to provide a user-friendly interface for running LLMs (Large Language Models) locally, without the need for complex setups such as WSL or Docker containers. Taking inspiration from OpenWebUI, which offers similar functionality in a browser-based environment, OpenLocalUI brings that convenience to a native desktop app.

## 🔥 Features

1. **Native Desktop Experience**: OpenLocalUI is designed specifically for Windows, macOS, and Linux, ensuring seamless integration with your operating system.

2. **LLM Execution**: Run Ollama-based models directly from your desktop, eliminating the need for external dependencies like WSL or Docker containers (see the sketch after this list).

3. **MIT License**: OpenLocalUI is licensed under the permissive MIT License, encouraging contributions from the community and fostering an open-source development environment.
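
For context on what running a model locally involves: Ollama exposes a small HTTP API on `localhost:11434` that any desktop client can call. The snippet below is a minimal sketch of such a call in Dart using the `http` package; it is illustrative only, not OpenLocalUI's actual code, and the `llama3` model name is just an example.

```dart
import 'dart:convert';

import 'package:http/http.dart' as http;

/// Sends a single prompt to a locally running Ollama server and returns
/// the generated text. Assumes Ollama's default port (11434) and a model
/// that has already been pulled (e.g. `ollama pull llama3`).
Future<String> generate(String prompt, {String model = 'llama3'}) async {
  final response = await http.post(
    Uri.parse('http://localhost:11434/api/generate'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({
      'model': model,
      'prompt': prompt,
      'stream': false, // ask for a single JSON object instead of a stream
    }),
  );
  if (response.statusCode != 200) {
    throw http.ClientException('Ollama returned ${response.statusCode}');
  }
  final json = jsonDecode(response.body) as Map<String, dynamic>;
  return json['response'] as String;
}

Future<void> main() async {
  print(await generate('Why is the sky blue?'));
}
```

OpenLocalUI layers a full GUI on top of this kind of interaction, handling streaming responses and model management for you.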

## 🛣️ Roadmap

Despite its simplicity, OpenLocalUI has enormous potential for growth and enhancement. Built on the LangChain Dart API (a sketch of such an integration follows the list below), future updates will focus on adding more features and improving usability. Planned features include:

1. ✅ **Model Customization**: Enhance the ability to customize LLMs according to specific needs.
2. ✅ **Image and File Embedding**: Enable embedding images and files directly into the application for more versatile usage.
3. ⚒️ **Web and Docs Search**: Integrate search functionality to allow users to access and retrieve information from web pages and local documents, making it easier to pull in relevant content for interactions.
4. ✅ **Model Cross Database**: Maintaining our own Ollama-compatible model database will let users browse a wide range of LLMs from many sources (the Ollama Library, Hugging Face, and third parties of any kind).
5. ❌ **Server Configuration and Mobile App**: Provide support for configuring the UI to run on different server environments, with an accompanying mobile app for easy, on-the-go access to the platform.
6. ⚠️ **OpenLocalUI-based Ecosystem**: Develop an ecosystem around OpenLocalUI, including plugins, community-driven enhancements, and third-party integrations, fostering a collaborative development environment. WE NEED YOU!
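
As a rough illustration of what building on the LangChain Dart API can look like, here is a minimal sketch using the community `langchain` and `langchain_ollama` packages. Treat the class names and options as assumptions to verify against those packages' current documentation; the model name and temperature are placeholders, not OpenLocalUI defaults.

```dart
import 'package:langchain/langchain.dart';
import 'package:langchain_ollama/langchain_ollama.dart';

Future<void> main() async {
  // A chat model backed by a local Ollama server. The model name and
  // temperature here are illustrative only.
  final chatModel = ChatOllama(
    defaultOptions: const ChatOllamaOptions(
      model: 'llama3',
      temperature: 0.7,
    ),
  );

  final result = await chatModel.invoke(
    PromptValue.string('Explain in one sentence why local LLMs are useful.'),
  );
  print(result.output.content);
}
```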

## 💻 Installation

OpenLocalUI requires Ollama to function. Download the latest release for your platform; the release folder contains installers for both OpenLocalUI and Ollama. A quick way to verify the Ollama install is shown below.
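
If you want to confirm that Ollama is installed and reachable before launching OpenLocalUI, you can query its default endpoint. The helper below is a hypothetical sketch, not part of the installer; it assumes Ollama's default port, 11434.

```dart
import 'dart:convert';

import 'package:http/http.dart' as http;

/// Returns the names of models installed on the local Ollama server,
/// or throws if the server is not reachable on its default port.
Future<List<String>> listLocalModels() async {
  final response =
      await http.get(Uri.parse('http://localhost:11434/api/tags'));
  final json = jsonDecode(response.body) as Map<String, dynamic>;
  return [
    for (final model in json['models'] as List<dynamic>)
      (model as Map<String, dynamic>)['name'] as String,
  ];
}
```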

## 📝 Contributing

Contributions to OpenLocalUI are highly encouraged and welcomed. Whether you're a developer, designer, or enthusiast, there are various ways to contribute:

- **Code Contributions**: Help improve the application by submitting code patches, bug fixes, or new features.
- **Documentation**: Improve existing documentation or create new guides to help users understand and use OpenLocalUI effectively.
- **Feedback and Suggestions**: Share your thoughts, ideas, and feedback to help shape the future development of OpenLocalUI.

Please refer to the [`CONTRIBUTING.md`](CONTRIBUTING.md) file for more details on how to contribute.

## 📃 License

OpenLocalUI comes under the permissive MIT License to encourage contributions. See the [`LICENSE.md`](LICENSE.md) file for more information.

## 💖 Support

Buy Me A Coffee

If you wish, you can support the development of OpenLocalUI with a small donation. It's a symbolic gesture: any amount, even the smallest, means a lot to me. Thank you for your time, regardless of what you decide!

## 🗨️ Contact

[![](https://dcbadge.limes.pink/api/server/S82WPJbPpz)](https://discord.gg/S82WPJbPpz)

You can join our Discord server to get help or take part in the development of OpenLocalUI. For any additional information, you can always write to my [email address](mailto:[email protected]).

## ⚒️ Related Projects

- [Tacotron.CPP](https://github.com/WilliamKarolDiCioccio/tacotron.cpp): a [llama.cpp](https://github.com/ggerganov/llama.cpp)-inspired, locally accelerated inference engine for TTS and STT models, built for OpenLocalUI.
- [gpu_info](https://github.com/WilliamKarolDiCioccio/gpu_info): a Dart package that retrieves GPU details to drive model suggestions for the user; it has little reason to exist outside of OpenLocalUI.