# NSFW Content Detection iOS App

This repository contains a sample project demonstrating how to develop and integrate a local NSFW (Not Safe For Work) content detection model for iOS apps. The project uses Apple's CreateML, CoreML, and Vision frameworks.

## Overview

The primary goal of this project is to enhance app safety by filtering or blurring inappropriate content while preserving user privacy through on-device processing. This approach is particularly beneficial for end-to-end encrypted (E2EE) applications, where server-side content filtering is not possible.

For a detailed explanation of the project, refer to the Medium article.
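Model training itself happens outside the app, but for context, a classifier of this kind can be produced with CreateML in just a few lines. The sketch below is a minimal example; the dataset path, the labeled-folder layout (`nsfw/` and `sfw/` subdirectories), and the output file name are assumptions, not the exact setup used for this project's bundled model:

```swift
import CreateML
import Foundation

// Minimal CreateML training sketch (runs in a macOS playground or
// command-line tool). The dataset directory is assumed to contain one
// subfolder per label, e.g. "nsfw/" and "sfw/", each holding images.
let dataset = URL(fileURLWithPath: "/path/to/dataset")

let classifier = try MLImageClassifier(
    trainingData: .labeledDirectories(at: dataset)
)

// Inspect training accuracy before exporting.
print("Training accuracy: \(100 * (1 - classifier.trainingMetrics.classificationError))%")

// Export an .mlmodel file that can be dragged into the Xcode project.
try classifier.write(to: URL(fileURLWithPath: "/path/to/NSFWClassifier.mlmodel"))
```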

## Usage

1. Open the project in Xcode:

   ```sh
   open NSFWClassifierApp.xcodeproj
   ```

2. Run the app on a real iOS device. CoreML models do not work in the simulator, so the app needs to run on physical hardware.

3. Classify and blur images. The app displays a feed of posts with images, and NSFW images are automatically blurred using the integrated model (a sketch of the classification flow follows this list).
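The following is a minimal sketch of how such a model can be invoked through Vision on-device. `NSFWClassifier` stands in for whatever class Xcode generates from the bundled `.mlmodel`, and the `"nsfw"` label name and 0.5 confidence threshold are assumptions rather than this app's exact logic:

```swift
import Vision
import CoreML
import UIKit

/// Classifies an image with the bundled CoreML model and reports whether
/// it should be treated as NSFW. `NSFWClassifier` is a placeholder for the
/// model class Xcode generates from the .mlmodel file.
func isNSFW(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? NSFWClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the highest-confidence classification result.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier == "nsfw" && (top?.confidence ?? 0) > 0.5)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

In SwiftUI, an image flagged by a check like this can then be obscured with the standard `.blur(radius:)` view modifier.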

## Contributing

Feel free to submit issues and pull requests for any improvements or additional features.

## License

This project is licensed under the MIT License.

## Acknowledgements

For more detailed information, refer to the Medium article.
