https://github.com/yjg30737/pyqt-text-generation-inference-gui
GUI version of text-generation-inference
- Host: GitHub
- URL: https://github.com/yjg30737/pyqt-text-generation-inference-gui
- Owner: yjg30737
- License: mit
- Created: 2023-08-31T08:37:12.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-09-01T00:00:28.000Z (over 1 year ago)
- Last Synced: 2025-03-27T00:54:55.593Z (about 2 months ago)
- Topics: huggingface, pyqt, pyqt5, text-generation, text-generation-inference, text-generation-webui
- Language: Python
- Homepage:
- Size: 14.6 KB
- Stars: 4
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# pyqt-text-generation-inference-gui
GUI version of text-generation-inference

My computer is not powerful enough to test this properly (thanks to my good old pal OutOfMemoryError!), so how about you try it?
## Requirements
* PyQt5>=5.14 (for the GUI)
* ansi2html (for rendering the colorful console output as HTML; see the sketch after this list)
* text_generation (the Python client for text-generation-inference)
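
For context, ansi2html only converts the ANSI color codes in the console output into HTML that a Qt text widget can render. A minimal sketch of that conversion (not this project's actual code; the sample log line is made up):

```python
from ansi2html import Ansi2HTMLConverter

# Convert ANSI-colored console output into HTML that a QTextBrowser/QTextEdit can display.
converter = Ansi2HTMLConverter()
ansi_line = "\x1b[32mINFO\x1b[0m  text-generation-inference server started"
html = converter.convert(ansi_line)
print(html)  # a full HTML document containing the colored line
```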
## How to Use

Before using this, you need to install Docker Desktop and have it running; the app will not work without it. A typical text-generation-inference container launch is sketched below.
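
For reference only, here is a minimal sketch of launching a TGI container from Python, roughly following the `docker run` command in the upstream text-generation-inference docs. The image tag, port mapping, volume, and model id are assumptions and may need adjusting for your machine:

```python
import subprocess

# Launch a text-generation-inference container serving a small model on port 8080.
# Flags follow the upstream TGI README; add "--gpus", "all" for an NVIDIA GPU setup,
# and adjust the volume/model id for your machine.
server = subprocess.Popen([
    "docker", "run", "--rm",
    "--shm-size", "1g",
    "-p", "8080:80",
    "-v", "tgi-data:/data",
    "ghcr.io/huggingface/text-generation-inference:latest",
    "--model-id", "bigscience/bloom-560m",
])
# ... later, stop the server:
# server.terminate()
```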
1. clone this!
2. pip install -r requirements.txt
3. cd src
4. python main.py

Add any model that you want to use in main.py. It will show up in the "Model" combobox, which you can see in the "Result" section below.
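
For context, the text_generation dependency is the client that talks to the local server. A minimal sketch of such a call against the 127.0.0.1:8080 endpoint mentioned in the Troubleshooting section (not this project's actual code; prompt and parameters are just examples):

```python
from text_generation import Client

# Connect to the locally running TGI server (see the Docker step above).
client = Client("http://127.0.0.1:8080")

# One-shot generation.
response = client.generate("What is Deep Learning?", max_new_tokens=64)
print(response.generated_text)

# Token-by-token streaming, e.g. for updating a text widget incrementally.
text = ""
for stream_response in client.generate_stream("What is Deep Learning?", max_new_tokens=64):
    if not stream_response.token.special:
        text += stream_response.token.text
print(text)
```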

#### You need a Hugging Face access token if you want to test Llama 2, Falcon, etc.
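
How the token is wired in is not shown here; with TGI the usual approach is to pass your Hugging Face token into the container environment so gated weights can be downloaded. A sketch reusing the launch command from the How to Use section (environment variable name per the upstream TGI docs; the model id is just an example):

```python
import os
import subprocess

# Same container launch as above, plus the Hugging Face token so gated models
# (e.g. Llama 2) can be downloaded. Assumes the token was exported as HF_TOKEN.
token = os.environ["HF_TOKEN"]
subprocess.Popen([
    "docker", "run", "--rm",
    "--shm-size", "1g",
    "-p", "8080:80",
    "-e", f"HUGGING_FACE_HUB_TOKEN={token}",
    "ghcr.io/huggingface/text-generation-inference:latest",
    "--model-id", "meta-llama/Llama-2-7b-chat-hf",
])
```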
## Result


I'm using bigscience/bloom-560m in this example.
## Troubleshooting

### Server is not running?

This error means that nothing is responding at 127.0.0.1:8080. Make sure Docker Desktop is running, then check which process is using the port with `netstat -ano | findstr 8080` and stop any service or program that has port 8080 allocated.
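
If you prefer checking from Python, here is a quick sketch that tests whether anything is listening on that endpoint (plain standard-library socket check, no extra dependencies):

```python
import socket

# Check whether anything is listening on the TGI endpoint used by this app.
try:
    with socket.create_connection(("127.0.0.1", 8080), timeout=2):
        print("Something is listening on 127.0.0.1:8080")
except OSError as e:
    print(f"Nothing reachable on 127.0.0.1:8080: {e}")
```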
## Note
This project is not complete, but I will probably rarely touch it, so I need a collaborator :)
If you want to collaborate, feel free to join. I want to work with a team.