Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/shinomakoi/magi_llm_gui
A Qt GUI for large language models
- Host: GitHub
- URL: https://github.com/shinomakoi/magi_llm_gui
- Owner: shinomakoi
- License: apache-2.0
- Created: 2023-04-08T14:16:17.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-11-17T17:16:50.000Z (about 1 year ago)
- Last Synced: 2024-10-10T06:31:36.923Z (about 1 month ago)
- Topics: exllama, gui, llama, llamacpp, llm, python, qt
- Language: Python
- Homepage:
- Size: 645 KB
- Stars: 40
- Watchers: 3
- Forks: 4
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Magi LLM GUI
A Qt GUI for large language models, for Windows and Linux (and possibly Mac). Uses Exllama and llama.cpp as backends.
**NOTICE**
This has been replaced by https://github.com/shinomakoi/AI-Messenger
**Installation:**
First make sure Python (3.10+ recommended) and Git are installed. Then:
```
git clone https://github.com/shinomakoi/magi_llm_gui
cd magi_llm_gui
```
Optionally create a virtual environment (recommended):
```
python -m venv .magi_venv
source ./.magi_venv/bin/activate  # For Linux
.\.magi_venv\Scripts\activate     # For Windows
```
Then install the dependencies:
```
pip install -r requirements.txt
```
**llama.cpp**

llama-cpp-python is included as a backend for CPU inference, but you can optionally install it with GPU support, e.g. for CUDA acceleration:
```
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python
```
See: https://github.com/abetlen/llama-cpp-python/#installation-with-openblas--cublas--clblast--metal

You can also use the llama.cpp server API if a server is launched. See: https://github.com/ggerganov/llama.cpp/tree/master/examples/server
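As a rough illustration of using the server API, the server's `/completion` endpoint accepts a JSON body; below is a minimal standard-library sketch. The endpoint name and the `prompt`/`n_predict` fields follow the llama.cpp server README, while the host/port default and the helper names here are assumptions, not part of this project:

```python
import json
from urllib import request

SERVER = "http://localhost:8080"  # assumed default llama.cpp server address


def build_payload(prompt: str, n_predict: int = 64) -> dict:
    """Build the JSON body for the llama.cpp server /completion endpoint."""
    return {"prompt": prompt, "n_predict": n_predict}


def complete(prompt: str, n_predict: int = 64) -> str:
    """POST a completion request and return the generated text."""
    data = json.dumps(build_payload(prompt, n_predict)).encode()
    req = request.Request(
        SERVER + "/completion",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]
```

Calling `complete("Hello")` requires a running server started with `./server -m <model>`; `build_payload` can be inspected offline.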
**Exllama**
To install Exllama, follow the install instructions at https://github.com/turboderp/exllama inside the magi_llm_gui folder. It also requires the CUDA Toolkit (installed via your Linux package manager, or downloaded from NVIDIA for Windows).
```
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu118
pip install safetensors sentencepiece ninja
git clone https://github.com/turboderp/exllama
```

**Usage:**
To launch the Magi LLM GUI, use:
```
python magi_llm_app.py
```
You can set the generation parameters in Settings > Parameters.
![image](https://github.com/shinomakoi/magi_llm_gui/assets/112139428/b02af911-e7eb-4353-a9f4-7b1f1f23bfe2)

Uses https://github.com/abetlen/llama-cpp-python for llama.cpp support
Uses https://github.com/UN-GCPDS/qt-material for themes