Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Filter (remove) NSFW (not safe for work) images in a given directory recursively, backing up every image beforehand.
- Host: GitHub
- URL: https://github.com/yjg30737/pyqt_nsfw_image_filter_gui
- Owner: yjg30737
- License: mit
- Created: 2023-08-22T07:57:33.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2023-08-22T08:13:52.000Z (about 1 year ago)
- Last Synced: 2024-01-27T20:04:09.793Z (10 months ago)
- Topics: image-classification, keras, nsfw, nsfw-classifier, nsfw-detection, nsfw-recognition, pyqt, pyqt5, pyqt5-gui, python3
- Language: Python
- Homepage:
- Size: 14.6 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# pyqt_nsfw_image_filter_gui
Yup. You need two packages: nsfw-detector and PyQt5.
This works on every kind of image, not only real photos but also paintings such as anime.
It also has a background-processing feature for people who want to handle a lot of images with it.
## How to Run
1. clone this
2. pip install -r requirements.txt
3. Download the [Keras 299x299 Image Model](https://s3.amazonaws.com/ir_public/ai/nsfw_models/nsfw.299x299.h5) and put it into the src folder (a download sketch follows this list)
4. python main.py
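
If you prefer to fetch the model from Python instead of by hand, here is a minimal convenience sketch for step 3. The destination path `src/nsfw.299x299.h5` is an assumption about where the app looks for the file.

```python
# Convenience sketch for step 3: download the Keras 299x299 model into the src folder.
# The destination "src/nsfw.299x299.h5" is an assumed location, not verified against main.py.
from pathlib import Path
from urllib.request import urlretrieve

MODEL_URL = "https://s3.amazonaws.com/ir_public/ai/nsfw_models/nsfw.299x299.h5"
dest = Path("src") / "nsfw.299x299.h5"
dest.parent.mkdir(parents=True, exist_ok=True)
urlretrieve(MODEL_URL, str(dest))
```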
## Preview

![image](https://github.com/yjg30737/pyqt_nsfw_image_filter_gui/assets/55078043/f19099ec-32fc-49af-8552-2db195801935)

### How to Use
1. Set the directory which contains image files.
2. Press the filter button to remove the NSFW images.

## How this works
This uses the [Keras 299x299 Image Model](https://s3.amazonaws.com/ir_public/ai/nsfw_models/nsfw.299x299.h5) and is influenced by nsfw_model, also known as nsfw-detector; it is pretty much a GUI version of it. Personally, I categorized the NSFW level into three classes, "nsfw", "semi-nsfw", and "safe", based on a certain standard.
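
For illustration, here is a minimal sketch of how such a three-level mapping could be done with nsfw-detector. The thresholds and the `categorize()` helper are assumptions for the example, not this project's exact logic.

```python
# A minimal sketch, assuming the nsfw-detector package and the 299x299 model file from step 3.
# The thresholds and the categorize() helper are illustrative, not this project's exact logic.
from nsfw_detector import predict

model = predict.load_model("src/nsfw.299x299.h5")

def categorize(image_path: str) -> str:
    # classify() returns a dict keyed by path, with scores for
    # "drawings", "hentai", "neutral", "porn" and "sexy".
    scores = predict.classify(model, image_path, image_dim=299)[image_path]
    explicit = scores["porn"] + scores["hentai"]
    if explicit > 0.7:
        return "nsfw"
    if explicit > 0.3 or scores["sexy"] > 0.5:
        return "semi-nsfw"
    return "safe"

print(categorize("some_image.jpg"))
```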
## Note
You can also use this from the CUI (plain Python code) only. Look at script.py to figure out how it works.

This really saves me from embarrassing moments when someone checks the images I have stored by crawling 🙂
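
For the CUI route, here is a rough, self-contained sketch of the recursive "back up everything, then remove NSFW" flow described above. The directory names, extension list, and the 0.7 threshold are assumptions; the real logic lives in script.py.

```python
# Illustrative sketch of the recursive "back up everything, then remove NSFW" flow.
# Directory names, the extension list, and the 0.7 threshold are assumptions, not script.py's exact values.
import shutil
from pathlib import Path

from nsfw_detector import predict

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".bmp", ".gif"}
model = predict.load_model("src/nsfw.299x299.h5")

def is_nsfw(image_path: Path) -> bool:
    scores = predict.classify(model, str(image_path), image_dim=299)[str(image_path)]
    return scores["porn"] + scores["hentai"] > 0.7

def filter_directory(src_dir: str, backup_dir: str = "backup") -> None:
    src, backup = Path(src_dir), Path(backup_dir)
    for image in src.rglob("*"):
        if image.suffix.lower() not in IMAGE_EXTS:
            continue
        # Back up every image before touching it, preserving the folder layout.
        target = backup / image.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(image, target)
        # Remove the original only if the classifier flags it.
        if is_nsfw(image):
            image.unlink()

filter_directory("images_to_check")
```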
By the way, I'm not sure I can upload even a slightly controversial image to explain how this works, so I won't.
Please try it!