NudeNet: An ensemble of Neural Nets for Nudity Detection and Censoring

Praneeth Bedapudi
Mar 30, 2019 · 3 min read


Part 2: Exposed part detection and censoring.

The Why: A major drawback of image classification is the lack of fine-grained control. If we want to blur the exposed parts, or have fine-grained control over which types of images to allow, we need object detection. This is a first-of-its-kind open-source project, which I hope will be helpful to the community.

If someone wants to allow images with exposed chest or buttocks but not images with exposed genitalia, or some other combination, doing this solely with image classification is not possible. Since there is no easy way of obtaining a dataset for this task, I make use of the data collected by Jae Jin and team. For more information, or to contribute to the development of the dataset, please contact Jae Jin or join their team’s Discord server (Discord tags: 0131#2628 or Jae Jin#0256).

With a total of 5789 images, the distribution of the number of labels is as follows:

[Figure: Total data available for training]

With this data, I use the image-augmentation library albumentations to add random blurs, flips, etc. to the dataset.
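The post uses albumentations for this step; as a rough illustration of why augmentation for detection differs from augmentation for classification (the bounding boxes must be transformed along with the pixels), a hand-rolled horizontal flip might look like the sketch below. The helper name and toy data are my own, not part of NudeNet:

```python
import numpy as np

def hflip_with_boxes(image, boxes):
    """Horizontally flip an image and remap its [x1, y1, x2, y2] boxes.

    After a flip of a width-w image, the new x1 is w - old x2 and the
    new x2 is w - old x1; y coordinates are unchanged.
    """
    h, w = image.shape[:2]
    flipped = image[:, ::-1].copy()
    boxes = np.asarray(boxes, dtype=float)
    new_boxes = boxes.copy()
    new_boxes[:, 0] = w - boxes[:, 2]
    new_boxes[:, 2] = w - boxes[:, 0]
    return flipped, new_boxes

img = np.arange(12).reshape(3, 4)              # toy 3x4 "image"
flipped, fb = hflip_with_boxes(img, [[0, 0, 1, 2]])
# box [0, 0, 1, 2] maps to [3, 0, 4, 2] in the flipped image
```

Libraries like albumentations do exactly this bookkeeping for you when you pass the boxes alongside the image.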

Since there is significant class imbalance in the data, I chose to use RetinaNet by FAIR for object detection. RetinaNet uses a variant of cross-entropy loss called Focal Loss, which is designed to improve the performance of one-stage object detectors.
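Focal Loss down-weights well-classified examples so that the many easy negatives do not dominate training. A minimal NumPy sketch of the binary form, with the paper’s default γ = 2 and α = 0.25 (illustrative only, not NudeNet’s actual training code):

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).

    p: predicted probability of the positive class; y: 0/1 label.
    With gamma = 0 this reduces to alpha-weighted cross entropy.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)
    p_t = np.where(y == 1, p, 1 - p)          # prob of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return -alpha_t * (1 - p_t) ** gamma * np.log(p_t)

# The (1 - p_t)^gamma factor shrinks the loss of confident, correct
# predictions far more than plain cross entropy would.
easy = focal_loss(np.array([0.95]), np.array([1]))   # tiny loss
hard = focal_loss(np.array([0.05]), np.array([1]))   # large loss
```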

Using ResNet-101 as the backbone, the model achieves the following scores on the test data.

[Figure: per-label scores on the test data]

Evaluating the model:

Since this model can be used for both nudity detection and censoring, I test it with the same data I used to test the classifier. If any of the labels “BUTTOCKS_EXPOSED”, “*_GENETALIA_EXPOSED”, or “F_BREAST_EXPOSED” are found in an image, we label the image “nude”; if none of these are found, we label it “safe”. This label mapping is chosen based on the test data: for example, in the test dataset, images with an exposed male breast or exposed belly are present in the “sfw” category.
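The detection-to-classification mapping described above can be sketched as a small helper. The three quoted patterns are from the post (spelled as given there); any other label names in the example (e.g. `M_BREAST_EXPOSED`, `BELLY_EXPOSED`) are assumptions for illustration:

```python
import fnmatch

# Detected-part labels that make an image "nude"; the middle entry is a
# wildcard pattern covering the male/female genitalia labels.
NUDE_PATTERNS = ["BUTTOCKS_EXPOSED", "*_GENETALIA_EXPOSED", "F_BREAST_EXPOSED"]

def image_label(detected_labels):
    """Map the set of part labels detected in an image to "nude"/"safe"."""
    for label in detected_labels:
        if any(fnmatch.fnmatch(label, pat) for pat in NUDE_PATTERNS):
            return "nude"
    return "safe"

image_label(["M_BREAST_EXPOSED", "BELLY_EXPOSED"])  # -> "safe"
image_label(["F_GENETALIA_EXPOSED"])                # -> "nude"
```

This is how a detector gives finer control than a classifier: changing which combinations count as “nude” is just a change to the pattern list.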

[Figure: Precision and recall of NudeNet’s Detector]

NudeNet’s Detector performs better than Yahoo’s Open NSFW, GantMan’s nsfw_model and NudeNet’s classifier in identifying porn.

I also include a censor function in this class, which censors the private parts in an NSFW image.

For example, for the image (NSFW), the following is the output of NudeNet’s censor function.

[Figure: Censored image, using NudeDetector]
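NudeNet’s actual censor function operates on image files; the core idea, filling each detected box with black pixels, can be sketched in NumPy like this (a toy illustration, not the library’s implementation):

```python
import numpy as np

def censor(image, boxes):
    """Return a copy of the image with each [x1, y1, x2, y2] box blacked out."""
    out = image.copy()
    for x1, y1, x2, y2 in boxes:
        out[y1:y2, x1:x2] = 0   # rows are y, columns are x
    return out

img = np.full((4, 4, 3), 255, dtype=np.uint8)   # white 4x4 RGB image
censored = censor(img, [(1, 1, 3, 3)])          # black 2x2 patch in the middle
```

Blurring or pixelating the same regions instead of blacking them out is a straightforward variation on the same box-wise edit.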

The project can be found at

The pre-trained models at

To install and use NudeNet, take a look at the following snippet.

# installing the project
pip install git+
# Using the classifier
from NudeNet import NudeClassifier
classifier = NudeClassifier('classifier_checkpoint_path')
