# YOLOSHOW - YOLOv5 / YOLOv7 / YOLOv8 / YOLOv9 / YOLOv10 / YOLOv11 / RTDETR / SAM / MobileSAM / FastSAM GUI based on PySide6

## Introduction

***YOLOSHOW*** is a graphical user interface (GUI) application embedded with the `YOLOv5` `YOLOv7` `YOLOv8` `YOLOv9` `YOLOv10` `YOLOv11` `RT-DETR` `SAM` `MobileSAM` `FastSAM` algorithms.
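With ten algorithm families in one GUI, the program decides which backend to load from the weight file's name (the naming rules are listed under *Loading Models Automatically* below). A minimal sketch of such filename-based inference; `infer_backend` and the lists here are illustrative, not the project's actual code:

```python
from pathlib import Path

# Order matters: more specific names must be checked before their
# substrings (e.g. "mobilesam" and "fastsam" before plain "sam").
_FAMILIES = [
    "mobilesam", "fastsam", "samv2", "sam",
    "yolov10", "yolov11", "yolo11",
    "yolov5", "yolov7", "yolov8", "yolov9", "rtdetr",
]
_TASKS = {"-seg": "segmentation", "-pose": "pose", "-obb": "obb"}

def infer_backend(weight: str) -> tuple[str, str]:
    """Guess (family, task) from a .pt filename, per the naming rules."""
    name = Path(weight).stem.lower()
    family = next((f for f in _FAMILIES if f in name), "unknown")
    task = next((t for k, t in _TASKS.items() if k in name), "detection")
    return family, task
```

Checking the family list in specific-to-generic order keeps `yolov8n-seg-test.pt` from matching a shorter name first.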
 <p align="center"> 
  English &nbsp; | &nbsp; <a href="https://github.com/SwimmingLiu/YOLOSHOW/blob/master/README_cn.md">简体中文</a>
 </p>

![YOLOSHOW-Screen](assest/YOLOSHOW-screen.png)

## Demo Video

`YOLOSHOW v1.x` : [YOLOSHOW-YOLOv9/YOLOv8/YOLOv7/YOLOv5/RTDETR GUI](https://www.bilibili.com/video/BV1BC411x7fW)

`YOLOSHOW v2.x` : [YOLOSHOWv2.0-YOLOv9/YOLOv8/YOLOv7/YOLOv5/RTDETR GUI](https://www.bilibili.com/video/BV1ZD421E7m3)

## Todo List

- [x] Add `YOLOv9` `YOLOv10` `YOLOv11` `RT-DETR` `SAM` `MobileSAM` `FastSAM` algorithms
- [x] Support Instance Segmentation (`YOLOv5` `YOLOv8` `YOLOv11` `SAM` `MobileSAM` `FastSAM`)
- [x] Support Pose Estimation (`YOLOv8` `YOLOv11`)
- [x] Support Oriented Bounding Boxes (`YOLOv8` `YOLOv11`)
- [x] Support HTTP Protocol in `RTSP` Function (`Single` Mode)
- [x] Add Model Comparison Mode (VS Mode)
- [x] Support Dragging File Input
- [ ] Tracking & Counting (`Industrialization`)

## Functions

### 1. Support Image / Video / Webcam / Folder (Batch) / IPCam Object Detection

Choose Image / Video / Webcam / Folder (Batch) / IPCam in the menu bar on the left to detect objects.

### 2. Change Models / Hyperparameters Dynamically

While the program is running a detection task, you can change the model and hyperparameters on the fly:

1. Switch the model among `YOLOv5` / `YOLOv7` / `YOLOv8` / `YOLOv9` / `YOLOv10` / `YOLOv11` / `RTDETR` / `YOLOv5-seg` / `YOLOv8-seg` / `YOLOv11-seg` / `YOLOv8-pose` / `YOLOv11-pose` / `YOLOv8-obb` / `YOLOv11-obb` / `SAM` / `MobileSAM` / `FastSAM` dynamically
2. Adjust `IOU` / `Confidence` / `Delay time` / `line thickness` dynamically

### 3. Loading Models Automatically

The program automatically detects `pt` files ([YOLOv5 Models](https://github.com/ultralytics/yolov5/releases) / [YOLOv7 Models](https://github.com/WongKinYiu/yolov7/releases/) / [YOLOv8 Models](https://github.com/ultralytics/assets/releases/) / [YOLOv9 Models](https://github.com/WongKinYiu/yolov9/releases/) / [YOLOv10 Models](https://github.com/THU-MIG/yolov10/releases/) / [YOLOv11 Models](https://github.com/ultralytics/assets/releases/) / [RT-DETR Models](https://github.com/ultralytics/assets/releases/) / [SAM Models](https://github.com/ultralytics/assets/releases/) / [MobileSAM Models](https://github.com/ultralytics/assets/releases/) / [FastSAM Models](https://github.com/ultralytics/assets/releases/)) that were previously added to the `ptfiles` folder.

If you need to add a new `pt` file, click the `Import Model` button in the `Settings` box and select your `pt` file. The program will then copy it into the `ptfiles` folder.

**Notice:**

1. Every `pt` file name must include one of `yolov5` / `yolov7` / `yolov8` / `yolov9` / `yolov10` / `yolo11` / `rtdetr` / `sam` / `samv2` / `mobilesam` / `fastsam` (e.g. `yolov8-test.pt`).
2. For a segmentation `pt` file, the name must include `yolov5n-seg` / `yolov8s-seg` / `yolo11-seg` (e.g. `yolov8n-seg-test.pt`).
3. For a pose estimation `pt` file, the name must include `yolov8n-pose` / `yolo11n-pose` (e.g. `yolov8n-pose-test.pt`).
4. For an oriented bounding box `pt` file, the name must include `yolov8n-obb` / `yolo11n-obb` (e.g. `yolov8n-obb-test.pt`).

### 4. Loading Configurations

1. On startup, the program automatically loads the last-used configuration parameters.
2. On shutdown, the program saves any changed configuration parameters.

### 5. Save Results

If you need to save results, click `Save Mode` before detection. Your detection results will then be saved to the selected path.

### 6. Support Object Detection, Instance Segmentation, Pose Estimation and Oriented Bounding Box

Since ***YOLOSHOW v3.0***, our work supports Object Detection, Instance Segmentation, Pose Estimation and Oriented Bounding Box. It also supports switching tasks between different versions, such as switching from a `YOLOv5` Object Detection task to a `YOLOv8` Instance Segmentation task.

### 7. Support Model Comparison among Object Detection, Instance Segmentation, Pose Estimation and Oriented Bounding Box

Since ***YOLOSHOW v3.0***, our work supports comparing model performance across Object Detection, Instance Segmentation, Pose Estimation and Oriented Bounding Box tasks.

## Preparation

### Experimental environment

```Shell
OS : Windows 11
CPU : Intel(R) Core(TM) i7-10750H CPU @ 2.60GHz
GPU : NVIDIA GeForce GTX 1660 Ti 6GB
```

### 1. Create virtual environment

Create a virtual environment with Python 3.9, then activate it.

```shell
conda create -n yoloshow python=3.9
conda activate yoloshow
```

### 2. Install the PyTorch framework

The same command works on both Windows and Linux (CUDA 11.8 build):

```shell
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```

For other PyTorch versions, see [![Pytorch](https://img.shields.io/badge/PYtorch-test?style=flat&logo=pytorch&logoColor=white&color=orange)](https://pytorch.org/)

### 3. Install dependency packages

Switch to the location of the program:

```shell
cd {the location of the program}
```

Install the program's dependency packages:

```shell
pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
```

### 4. Add Font

#### Windows User

Copy all font files (`*.ttf`) in the `fonts` folder into `C:\Windows\Fonts`.

#### Linux User

User-local fonts do not require root privileges:

```shell
mkdir -p ~/.local/share/fonts
cp fonts/Shojumaru-Regular.ttf ~/.local/share/fonts/
fc-cache -fv
```

#### MacOS User

The MacBook is so expensive that I cannot afford one, so please install the `.ttf` files yourself. 😂

### 5. Run Program

```shell
python main.py
```

## Frames

[![Python](https://img.shields.io/badge/python-3776ab?style=for-the-badge&logo=python&logoColor=ffd343)](https://www.python.org/) [![Pytorch](https://img.shields.io/badge/PYtorch-test?style=for-the-badge&logo=pytorch&logoColor=white&color=orange)](https://pytorch.org/) [![Static Badge](https://img.shields.io/badge/Pyside6-test?style=for-the-badge&logo=qt&logoColor=white)](https://doc.qt.io/qtforpython-6/PySide6/QtWidgets/index.html)

## Reference

### YOLO Algorithm

[YOLOv5](https://github.com/ultralytics/yolov5)  [YOLOv7](https://github.com/WongKinYiu/yolov7)  [YOLOv8 / YOLOv11 / RT-DETR / SAM / MobileSAM / FastSAM](https://github.com/ultralytics/ultralytics)  [YOLOv9](https://github.com/WongKinYiu/yolov9)  [YOLOv10](https://github.com/THU-MIG/yolov10)

### YOLO Graphical User Interface

[YOLOSIDE](https://github.com/Jai-wei/YOLOv8-PySide6-GUI)  [PyQt-Fluent-Widgets](https://github.com/zhiyiYo/PyQt-Fluent-Widgets)
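The configuration behaviour described in *4. Loading Configurations* above (load the last-used parameters at startup, save them on shutdown) can be sketched as a small JSON-backed store. The file name, keys, and default values below are illustrative assumptions, not the project's actual config format:

```python
import json
from pathlib import Path

# Hypothetical config location and defaults, for illustration only.
CONFIG_PATH = Path("yoloshow_config.json")
DEFAULTS = {"iou": 0.45, "conf": 0.25, "delay": 10, "line_thickness": 3}

def load_config() -> dict:
    """On startup: merge the last-saved parameters over the defaults."""
    if CONFIG_PATH.exists():
        return {**DEFAULTS, **json.loads(CONFIG_PATH.read_text())}
    return dict(DEFAULTS)

def save_config(params: dict) -> None:
    """On shutdown: persist the current parameters for the next run."""
    CONFIG_PATH.write_text(json.dumps(params, indent=2))
```

Merging over the defaults means a config file written by an older version (missing newer keys) still loads cleanly.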