Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ph1ps/Nudity-CoreML
A CoreML model that detects nudity in a picture
- Host: GitHub
- URL: https://github.com/ph1ps/Nudity-CoreML
- Owner: ph1ps
- License: mit
- Archived: true
- Created: 2017-10-01T11:30:14.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2018-02-15T17:44:51.000Z (over 6 years ago)
- Last Synced: 2024-08-01T13:30:17.355Z (3 months ago)
- Language: Swift
- Size: 478 KB
- Stars: 103
- Watchers: 3
- Forks: 9
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Nudity Detection for CoreML
### Description
This is the OpenNSFW model implemented in Apple's new framework, CoreML. The model classifies an image as either SFW (safe for work) or NSFW (not safe for work). It was built with Caffe and is a **fine-tuned ResNet model**. To test this model you can open `Nudity.xcodeproj` and run it on your device (iOS 11 and Xcode 9 are required). To test further images, just add them to the project and replace my testing images with yours.
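Once the `.mlmodel` file is in the project, Xcode generates a Swift class for it. A minimal sketch of classifying an image through the Vision framework, assuming the generated class is named `Nudity` and the model outputs "SFW"/"NSFW" class labels (both names are assumptions, not confirmed by this README):

```swift
import UIKit
import CoreML
import Vision

// Runs the (assumed) Xcode-generated `Nudity` model on a UIImage
// and prints the top classification with its confidence.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: Nudity().model) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // top.identifier is expected to be "SFW" or "NSFW" (assumption)
        print("\(top.identifier): \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

Vision ships with iOS 11, which matches the project's stated minimum requirement.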
### Obtaining the model
* [Download](https://drive.google.com/open?id=0B5TjkH3njRqncDJpdDB1Tkl2S2s) the model from Google Drive and drag it right into your project folder
* Convert the model on your own:
1. Change directory: `cd ./Convert`
2. Run the conversion script: `sh convert.sh`
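The contents of `convert.sh` are not shown here. As a hedged sketch, a Caffe-to-CoreML conversion with the legacy `coremltools` Caffe converter (available in coremltools 3.x and earlier, removed in 4+) might look like the following; the file names are placeholders, not taken from this repository:

```python
import coremltools

# Placeholder file names: the actual weights/prototxt/labels used by
# convert.sh are not documented in this README.
mlmodel = coremltools.converters.caffe.convert(
    ('open_nsfw.caffemodel', 'deploy.prototxt'),  # weights + network definition
    image_input_names='data',    # treat the 'data' blob as an image input
    class_labels='labels.txt',   # e.g. "SFW" / "NSFW", one label per line
)
mlmodel.save('Nudity.mlmodel')
```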
### More information
If you want to find out more about how this model works and which data it was trained on, feel free to visit the original OpenNSFW page on [GitHub](https://github.com/yahoo/open_nsfw).
### Examples
I won't show any NSFW examples for obvious reasons.