How to build an image moderation system with Node.js

Content moderation has become a necessity, and image moderation in particular. There are two ways to perform it: manually and automatically.

In this article we will see how to set up automatic moderation. We will use the Node.js SDK provided by Sightengine. Sightengine is an API that allows you to moderate images. There are currently 8 models available:

  • Nudity detection (i.e. detect adult or suggestive content)
  • Weapons/Alcohol/Drugs detection
  • Face/People detection (i.e. detect whether the image contains one person, several people, or nobody, along with the positions of any faces)
  • Face Attributes detection (get the age group and gender, determine if a face is covered with sunglasses…)
  • Celebrity detection
  • Image type detection (i.e. detect if the image is a natural photograph or an illustration)
  • Image Properties (i.e. determine the quality of an image along with its main colors)
  • Scammer detection (detect if an image is likely to have been submitted by a romance scammer)

As a starter, we will restrict ourselves to nudity detection. Here is an example of the JSON returned by the API when you submit the following image:
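The original image and response are not reproduced here, but a response from the nudity endpoint has roughly the following shape (the numbers below are invented for illustration; the article's 'partial_nudity' and 'raw_nudity' labels correspond to the `partial` and `raw` scores):

```json
{
  "status": "success",
  "nudity": {
    "raw": 0.01,
    "partial": 0.95,
    "partial_tag": "male_chest",
    "safe": 0.04
  }
}
```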

As you can see, the predicted class is the ‘partial_nudity’ class with a ‘male_chest’ detected.

You must first register on Sightengine’s website; it’s completely free. Just click the Get Started button and create an account.

Once logged in, click on the My Account tab. Sightengine provides two API keys; we will need them to use the SDK in our application. Be careful not to share them: they are secret keys.

Create the application

We will now create a simple application: a page where the user can upload an image, which is then checked for nudity. If the image contains no nudity it is accepted; otherwise it is refused.

In an empty project folder, initialize your project with:

npm init

Next, we need to install the modules. Four modules are needed to build our application:

  • Express: Creates a web server
  • EJS: A template engine to be able to manage dynamic data in HTML
  • Multer: A middleware that handles file uploads
  • Sightengine: The SDK that will let us automatically moderate our images

npm install express ejs multer sightengine --save

1# Create the web server with Express

We will create two folders at the root of our project: images and styles. They will be used to store images and CSS files respectively. Once these two folders are created, we will create an app.js file in which we will start our server.

To start the server (at localhost:3000) run the following command:

node app.js

2# Create the engine template with EJS

Create a file called index.ejs in a views folder (the default location where Express looks for templates) and add the following code:
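The original template is not shown; a minimal page along these lines works (file and folder names are assumptions based on the article):

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Image moderation</title>
  <link rel="stylesheet" href="/styles/app.css">
</head>
<body>
  <h1>Upload an image</h1>
</body>
</html>
```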

Adapt the server code to integrate EJS:

3# Upload your images with Multer

We will now add Multer to our server. Uploaded images will automatically be saved to the images folder we created earlier.

We will also modify our index.ejs file to add a form for file upload.
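A minimal upload form looks like this (the field name "image" is an assumption and must match the name your server-side Multer handler expects; enctype="multipart/form-data" is required for file uploads):

```html
<form action="/upload" method="POST" enctype="multipart/form-data">
  <input type="file" name="image" accept="image/*" required>
  <button type="submit">Upload</button>
</form>
```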

Finally, create an app.css file in the styles folder to make the application look nicer.

4# Moderate the images with Sightengine

The Nudity endpoint helps you determine if an image contains some kind of nudity and what “level” of nudity it contains. We distinguish three main levels of nudity:

  • Raw nudity, e.g. images containing X-rated content, genitals, breasts, etc.
  • Partial nudity, e.g. images of people in bikinis, bare-chested men, or cleavage. This content may be acceptable for some uses and unacceptable in other contexts
  • No nudity, also referred to as “safe”

In our case we only want to accept the ‘safe’ level, because we do not want any nudity (not even partial).

For each of the 3 classes, Sightengine will return a number between 0 and 1. The higher the number, the higher the confidence that the image belongs to this class.

We will therefore compare the values of each class and validate images that have a ‘safe’ score above the ‘partial_nudity’ and ‘raw_nudity’ scores. If an image is detected as having nudity, we will delete it and return an error message.

We will then add the data returned by the server into our index.ejs file.
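For example, a small conditional block in index.ejs can display the result (the `message` variable name is an assumption and must match whatever you pass to `res.render`):

```html
<% if (typeof message !== 'undefined' && message) { %>
  <p class="message"><%= message %></p>
<% } %>
```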

# Final thoughts

I hope this guide has helped you get started with moderating a web application. It is a rather complex subject and we have only seen a small part of it. Sightengine is a very good tool for real-time moderation, and you can try their demo.