https://github.com/trumanwong/ComfyUI-NSFW-Detection
This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). A machine learning classifier labels each image as safe or NSFW; if an image is classified as NSFW, an alternative image is returned instead.
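The README describes this flow only at a high level. Below is a minimal sketch of how such a gate might look as a ComfyUI-style node; it is illustrative only, and the classifier checkpoint (`Falconsai/nsfw_image_detection`), the `nsfw` label name, the 0.5 threshold, and the `NSFWDetect` class are assumptions for the sketch, not details taken from this project's code.

```python
# Minimal sketch of an NSFW-gating node in the ComfyUI style.
# Assumptions (not taken from this repository): the Hugging Face
# "image-classification" pipeline, the Falconsai/nsfw_image_detection
# checkpoint, and the 0.5 score threshold are illustrative choices.
import numpy as np
import torch
from PIL import Image
from transformers import pipeline

class NSFWDetect:
    def __init__(self):
        # Classifier created once and reused across calls.
        self.classifier = pipeline("image-classification",
                                   model="Falconsai/nsfw_image_detection")

    def run(self, image: torch.Tensor, alternative: torch.Tensor,
            threshold: float = 0.5) -> torch.Tensor:
        # ComfyUI images are float tensors in [0, 1] with shape (B, H, W, C).
        pil = Image.fromarray((image[0].cpu().numpy() * 255).astype(np.uint8))
        scores = {r["label"]: r["score"] for r in self.classifier(pil)}
        # Return the alternative image when the NSFW score crosses the threshold.
        if scores.get("nsfw", 0.0) >= threshold:
            return alternative
        return image
```

A real custom node would also declare ComfyUI's `INPUT_TYPES`/`RETURN_TYPES` registry metadata and handle batches larger than one; those details are omitted here for brevity.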
- Host: GitHub
- URL: https://github.com/trumanwong/ComfyUI-NSFW-Detection
- Owner: trumanwong
- License: MIT
- Created: 2024-02-02T10:16:57.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-08-03T05:16:33.000Z (11 months ago)
- Last Synced: 2024-08-04T05:44:14.656Z (10 months ago)
- Language: Python
- Size: 514 KB
- Stars: 20
- Watchers: 1
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-comfyui - **ComfyUI-NSFW-Detection**