https://github.com/mushinbush/tabbyui
A Streamlit web UI for exllamav2's API (tabbyAPI).
- Host: GitHub
- URL: https://github.com/mushinbush/tabbyui
- Owner: mushinbush
- License: gpl-3.0
- Created: 2024-08-14T04:43:02.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-09-10T07:05:45.000Z (8 months ago)
- Last Synced: 2025-01-09T18:50:46.075Z (4 months ago)
- Topics: exl2, exllamav2, tabbyapi, webui
- Language: Python
- Homepage:
- Size: 37.1 KB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: .github/readme.md
- License: LICENSE
README
## Description:
This repository provides a web UI for interacting with [ExllamaV2](https://github.com/turboderp/exllamav2)'s API ([TabbyAPI](https://github.com/theroyallab/tabbyAPI/)).

Features:
- Easy one-click installation with venv: simply download and run (requires Python to be installed).
- Load and unload ExllamaV2 (exl2) models.
- Basic features for interacting with LLMs, including text generation and conversation (under construction).

## Usage:
### Windows
1. Install [Python](https://www.python.org/). Make sure to check "Add Python X.X to PATH" during the installation.
2. Set up [TabbyAPI](https://github.com/theroyallab/tabbyAPI/) according to its instructions. You will obtain the API URL and API key from this step.
3. Clone this repository to your local machine, or simply download it (Code -> Download ZIP).
4. Run the `start.bat` file. This will create a virtual environment (venv) and install necessary dependencies.
5. Open the URL indicated in the terminal in your browser (e.g., http://localhost:8501).
6. All set! Enter the API URL and API key in the left sidebar, where you can also switch between models.
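Under the hood, the API URL and key from the sidebar identify a running TabbyAPI server, which exposes an OpenAI-style HTTP API. As a rough sketch of how such a request could be assembled (the `/v1/completions` path, port 5000, and `x-api-key` header are assumptions based on TabbyAPI's documented defaults, not taken from this repository):

```python
import json
import urllib.request


def build_completion_request(api_url, api_key, prompt, max_tokens=200):
    """Build (but do not send) an HTTP request for a TabbyAPI completions endpoint.

    The endpoint path and auth header are assumptions based on TabbyAPI's
    defaults; adjust them to match your server's configuration.
    """
    payload = {"prompt": prompt, "max_tokens": max_tokens}
    return urllib.request.Request(
        url=api_url.rstrip("/") + "/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # TabbyAPI also accepts "Authorization: Bearer <key>".
            "x-api-key": api_key,
        },
        method="POST",
    )


# Values here are placeholders for the URL and key obtained in step 2.
req = build_completion_request("http://localhost:5000", "your-api-key", "Hello")
# urllib.request.urlopen(req) would send it to a running TabbyAPI server.
```

Building the request separately from sending it keeps the sidebar settings easy to validate before any network traffic happens.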