Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lovoo/NSFWDetector
A NSFW (aka porn) detector with CoreML
- Host: GitHub
- URL: https://github.com/lovoo/NSFWDetector
- Owner: lovoo
- License: bsd-3-clause
- Created: 2018-08-14T09:21:41.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2022-06-10T13:56:41.000Z (over 2 years ago)
- Last Synced: 2024-04-24T19:02:10.123Z (6 months ago)
- Topics: cocoapods, coreml, ios, ios-library, lovoo, machine-learning, swift
- Language: Swift
- Homepage:
- Size: 264 KB
- Stars: 1,561
- Watchers: 49
- Forks: 107
- Open Issues: 6
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-ios - NSFWDetector - A NSFW (aka porn) detector with CoreML. (Media / Media Processing)
- awesome-deepfake-porn-detection - github
- awesome-ios-star - NSFWDetector - A NSFW (aka porn) detector with CoreML. (Media / Media Processing)
README
# ![NSFWDetector](https://github.com/lovoo/NSFWDetector/blob/master/assets/header.png?raw=true)
[![Version](https://img.shields.io/cocoapods/v/NSFWDetector.svg?style=flat)](https://cocoapods.org/pods/NSFWDetector)
[![License](https://img.shields.io/cocoapods/l/NSFWDetector.svg?style=flat)](https://cocoapods.org/pods/NSFWDetector)
[![Platform](https://img.shields.io/cocoapods/p/NSFWDetector.svg?style=flat)](https://cocoapods.org/pods/NSFWDetector)

NSFWDetector is a small (**17 kB**) CoreML model that scans images for nudity. It was trained with CreateML to distinguish between porn/nudity and appropriate pictures, with a particular focus on telling Instagram-model-style pictures apart from porn.
## Usage
```swift
guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // 😱🙈😏
        } else {
            // ¯\_(ツ)_/¯
        }
    default:
        break
    }
})
```

If you want to enforce stricter boundaries for your platform, just apply a lower threshold for the confidence.
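For example, a platform with stricter rules might flag anything above 0.5 instead of 0.9 (the cutoff value here is purely illustrative):

```swift
detector.check(image: image, completion: { result in
    if case let .success(nsfwConfidence: confidence) = result {
        // 0.5 is an illustrative, stricter cutoff -- tune it for your platform.
        if confidence > 0.5 {
            // hide or flag the picture
        }
    }
})
```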
## Installation
### Swift Package Manager
```swift
dependencies: [
.package(url: "https://github.com/lovoo/NSFWDetector.git", .upToNextMajor(from: "1.1.2"))
]
```

### CocoaPods
```ruby
pod 'NSFWDetector'
```

⚠️ Because the model was trained with CreateML, you need **Xcode 10** or above to compile the project.
## App Size
The Machine Learning Model is only **17 kB** in size, so App size won't be affected compared to other libraries using the [yahoo model](https://github.com/yahoo/open_nsfw).
## Using just the Model
If you don't want to use the Detection Code, you can also just download the MLModel file directly from the latest [Release](https://github.com/lovoo/NSFWDetector/releases).
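A minimal sketch of driving the downloaded `.mlmodel` yourself via the Vision framework, without the library's detection code. The generated model class name (`NSFW`) and the classification label (`"NSFW"`) are assumptions here — Xcode derives the class name from the model file you add to your project, and the label comes from the model's training data:

```swift
import CoreML
import Vision

@available(iOS 12.0, *)
func nsfwConfidence(for cgImage: CGImage, completion: @escaping (Double?) -> Void) {
    // `NSFW` is the assumed name of the Xcode-generated model class.
    guard let model = try? VNCoreMLModel(for: NSFW(configuration: MLModelConfiguration()).model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // The model emits classification labels; "NSFW" is the assumed positive class.
        let observation = (request.results as? [VNClassificationObservation])?
            .first { $0.identifier == "NSFW" }
        completion(observation.map { Double($0.confidence) })
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```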
## Feedback
If you notice issues with certain kinds of pictures, feel free to reach out via [Mail](mailto:[email protected]) or [Twitter](https://twitter.com/LOVOOEng).
## Author
Michael Berg, [[email protected]](mailto:[email protected])
## License
NSFWDetector is available under the BSD license. See the LICENSE file for more info.