https://github.com/perpendicularai/sekernel_for_llm_ui
This is the repository for the UI for the SeKernel_for_LLM module
- Host: GitHub
- URL: https://github.com/perpendicularai/sekernel_for_llm_ui
- Owner: perpendicularai
- License: MIT
- Created: 2024-08-14T02:24:28.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-08-17T12:13:04.000Z (over 1 year ago)
- Last Synced: 2025-01-23T17:55:38.348Z (over 1 year ago)
- Topics: chat, database-management, internet, llama-cpp-python, pyqt5, semantic-kernel
- Language: Python
- Homepage:
- Size: 1.91 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# SeKernel_for_LLM_UI
This repository contains the UI for the SeKernel_for_LLM module.
## Requirements
- pyqt5
- llama-cpp-python
- markdown
- pyqtspinner
- pyttsx3
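The requirements above can be installed from PyPI in one step (package names as listed; a virtual environment is optional but recommended):

```shell
# Optional: create and activate a virtual environment first
python -m venv .venv && source .venv/bin/activate

# Install the UI's dependencies
pip install pyqt5 llama-cpp-python markdown pyqtspinner pyttsx3
```

Note that `llama-cpp-python` compiles llama.cpp from source on some platforms, so the install may take a few minutes.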
## How to
- Clone the repo `git clone https://github.com/perpendicularai/SeKernel_for_LLM_UI.git`
- Ensure that you have `llama-cpp-python` installed and working
- Add your model to the `kernel.py` script
- Launch the UI by running `python sekernel_ui.py`
  - **Please note:** Only internet-connected chat is supported. If you have the skills, you can check out the `plugins.py` module to add more functionality to your UI.
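The actual `kernel.py` script is in the repo and is not reproduced here; the sketch below only illustrates how a local GGUF model is typically loaded with `llama-cpp-python` (the model path and parameters are placeholder assumptions, not the project's real values):

```python
# Hypothetical sketch of the "add your model" step, assuming a local GGUF
# file -- replace MODEL_PATH with the path to your own model.
from pathlib import Path

MODEL_PATH = Path("models/your-model.gguf")  # placeholder path, not from the repo

def load_model(model_path: Path):
    """Load a GGUF model via llama-cpp-python's Llama class."""
    # Imported lazily so this sketch can be read/run without the package installed.
    from llama_cpp import Llama
    return Llama(model_path=str(model_path), n_ctx=4096)

if __name__ == "__main__":
    if MODEL_PATH.exists():
        llm = load_model(MODEL_PATH)
        out = llm("Q: What is the capital of France? A:", max_tokens=16)
        print(out["choices"][0]["text"])
    else:
        print(f"Model file not found: {MODEL_PATH}")
```

Once the model path in `kernel.py` points at a valid file, `python sekernel_ui.py` should start the PyQt5 interface on top of it.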
## Short-films
https://github.com/user-attachments/assets/a6e75136-bd3f-4960-8791-6f83094f2123