Image upload and moderation with Vue.js and Node.js

Almost all applications handle images. That is why, today, I will show you how to receive and moderate user-submitted images in a Vue.js-based application.

We are going to create a small application that lets users upload images and checks whether those images contain nudity. Setting up an automated image moderation system is good practice for websites and applications that receive more than a few dozen submissions per day and need to publish them quickly.

# Architecture

Front-end:

  • Vue.js
  • Axios: a promise-based HTTP client we will use to send AJAX requests

Back-end:

  • Node.js
  • Express: a framework to create the web server
  • Multer: a middleware to handle file uploads
  • Sightengine: The SDK we will use to automatically moderate images

1# Set up the project with Vue.js

npm install -g vue-cli
vue init webpack-simple vuejs-upload
cd vuejs-upload
npm install
npm run dev

You should now see the default Vue.js welcome page in your browser (by default at http://localhost:8080).

2# Deal with images

We will modify the App.vue file to retrieve the image the user selected, wrap it in a FormData object for the upload, and display a preview of it with a FileReader object.
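As a minimal sketch, the relevant part of App.vue could look like this; the onFileChange method and the file / previewUrl properties are names I chose for illustration, not from the original code:

<template>
  <div id="app">
    <h1>Upload an image</h1>
    <input type="file" accept="image/*" @change="onFileChange">
    <!-- Preview of the selected image, filled by FileReader -->
    <img v-if="previewUrl" :src="previewUrl" alt="preview">
  </div>
</template>

<script>
export default {
  name: 'app',
  data () {
    return {
      file: null,       // the File object that will later be sent to the server
      previewUrl: null  // data URL used to display the preview
    }
  },
  methods: {
    onFileChange (e) {
      this.file = e.target.files[0]
      if (!this.file) return
      // Read the file locally so we can display it before uploading
      const reader = new FileReader()
      reader.onload = (event) => {
        this.previewUrl = event.target.result
      }
      reader.readAsDataURL(this.file)
    }
  }
}
</script>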

3# Set up the server and upload the images

We will create a backend folder at the root of our project; this is where all of our server’s files will live. After creating this folder, move into it and create a new package.json dedicated to the server:

npm init

We will then install the dependencies of our server:

npm install express sightengine multer --save

You must now create an app.js file:
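A minimal sketch of app.js at this stage could look like the following, assuming uploads are stored in a local uploads/ folder, the API listens on port 3000 and the upload field is called "file" (all of these are illustrative choices, not taken from the original code):

const express = require('express')
const multer = require('multer')
const app = express()

// Store uploaded files in the local "uploads" folder
const upload = multer({ dest: 'uploads/' })

// Allow the Vue.js dev server (port 8080) to call this API
app.use((req, res, next) => {
  res.header('Access-Control-Allow-Origin', '*')
  res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept')
  next()
})

// "file" must match the field name used in the FormData on the front-end
app.post('/upload', upload.single('file'), (req, res) => {
  res.json({ message: 'Image received', file: req.file.filename })
})

app.listen(3000, () => console.log('Server listening on port 3000'))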

We will also create an HTML file that hosts the code written in our App.vue file. If we are in production, it must load the static bundle; otherwise it must load the code from a URL (the development server).
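As a rough sketch, based on the index.html generated by the webpack-simple template, the file could look like this (the /dist/build.js path and the dev server URL come from that template and are assumptions on my part):

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>vuejs-upload</title>
  </head>
  <body>
    <div id="app"></div>
    <!-- In production, load the static bundle built by webpack;
         in development, load it from the dev server instead,
         e.g. http://localhost:8080/dist/build.js -->
    <script src="/dist/build.js"></script>
  </body>
</html>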

We now have to adapt our App.vue file to send our image to the server.
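For example, the upload could be done with Axios and a FormData object; the /upload route, the port 3000 and the message data property are assumptions matching the sketches above, not the article’s original code:

import axios from 'axios'

// Inside the methods section of App.vue
// (message is an extra data property used to display feedback)
upload () {
  const formData = new FormData()
  // "file" must match the field name expected by Multer on the server
  formData.append('file', this.file)

  axios.post('http://localhost:3000/upload', formData, {
    headers: { 'Content-Type': 'multipart/form-data' }
  })
    .then(response => {
      this.message = response.data.message
    })
    .catch(error => {
      this.message = 'Upload failed'
      console.error(error)
    })
}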

You now need two terminals: one for the Vue.js front-end, started with the command npm run dev, and another for the server, started from the backend folder with the command node app.js.

4# Moderate the images with Sightengine

You must register on Sightengine’s website; it’s totally free. Just click on the “Get started” button and create an account.

Once logged in, click on the “My account” tab. Sightengine provides two keys (an API user and an API secret); we will need these keys to use the SDK in our application. Be careful not to share them: they are secret keys.

The Nudity endpoint helps you determine if an image contains some kind of nudity and what “level” of nudity it contains. We distinguish three main levels of nudity:

  • Raw nudity, e.g. images containing X-rated content, genitals, breasts, etc.
  • Partial nudity, e.g. images containing women in bikinis, bare-chested men, or cleavage. This content may be OK for some uses and unacceptable in other contexts
  • No nudity, also referred to as “safe”

In our case, we only want to keep images classified as safe, because we do not want any nudity at all (not even partial nudity).

For each of the 3 classes, Sightengine will return a number between 0 and 1. The higher the number, the higher the confidence that the image belongs to this class.

We will therefore compare the values of each class and validate images that have a ‘safe’ score above the ‘partial_nudity’ and ‘raw_nudity’ scores. If an image is detected as having nudity, we will delete it and return an error message.
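Putting this together, the /upload route from the earlier app.js sketch could be extended as follows. This assumes the check() / set_file() methods of the sightengine npm package and the raw / partial / safe scores returned by the nudity model; replace api_user and api_secret with your own keys:

const fs = require('fs')
const sightengine = require('sightengine')('api_user', 'api_secret')

app.post('/upload', upload.single('file'), (req, res) => {
  // Ask Sightengine to analyze the uploaded file with the nudity model
  sightengine.check(['nudity']).set_file(req.file.path)
    .then(result => {
      const nudity = result.nudity
      if (nudity.safe > nudity.partial && nudity.safe > nudity.raw) {
        // The image is considered safe: keep it and confirm to the client
        res.json({ message: 'Image accepted', file: req.file.filename })
      } else {
        // Nudity detected: delete the file and return an error message
        fs.unlinkSync(req.file.path)
        res.status(400).json({ message: 'Image rejected: nudity detected' })
      }
    })
    .catch(error => {
      console.error(error)
      res.status(500).json({ message: 'Moderation failed' })
    })
})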

We will now adapt the App.vue file to display the information returned by the server.
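For instance, the template of App.vue could display the message property filled by the upload method sketched earlier (again a hypothetical name, not from the original code):

<template>
  <div id="app">
    <input type="file" accept="image/*" @change="onFileChange">
    <img v-if="previewUrl" :src="previewUrl" alt="preview">
    <button @click="upload">Upload</button>
    <!-- Feedback returned by the server (accepted / rejected) -->
    <p v-if="message">{{ message }}</p>
  </div>
</template>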

# Final thoughts

I hope this guide has helped you upload and moderate images with Vue.js and Node.js. Moderation is a little-discussed topic; if you want to know more about Sightengine, you can try their demo. Do not hesitate to share and recommend this article if it helped :)