Set up a Node.js API with CI on GitLab for your web/mobile app (Express/Jasmine/PostgreSQL/Docker), part 2/3

Charles Touret
Jan 8 · 9 min read

Introduction

To read the first article, you can click here ;)

In this article I will explain how to build a REST API with automated tests managed by GitLab CI. By the end of this series you will be able to build your own API and, above all, to maintain it! You will also be familiar with Docker and the Node testing framework Jasmine. The API you create can be used for your web application and/or your mobile application.

The more you test, the more bugs you prevent

If we add code to our API routes, we might break things and perhaps log in a user who entered bad credentials! Being able to connect to any account with a wrong password would have serious consequences, so whenever we update the code it is important to verify that everything still works as expected!

First, I will show you how to write a simple test with the Jasmine framework.

Create tests locally with Jasmine

Now that we have written our first API route (in the previous episode), let's write our first test with the JavaScript framework Jasmine!

To write an interesting test, let's add a 'password' column to our users table. You should never store an unencrypted password in your database, but we will do it for the sake of the example!

We create a 'password' column in the users table (see the previous article to learn how to use the postgres CLI):

ALTER TABLE users ADD COLUMN password text;

We set the password 'user1password' for user1:

UPDATE users set password='user1password' where id = 0;
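To make sure the update worked, you can run a quick sanity check in the postgres CLI:

SELECT id, email, password FROM users WHERE id = 0;

You should see user1's row with the new password column filled in.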

Now let's create a login route in our API.

First, add these functions in queries.js:

const verifyCredentials = (email, password) => {
  const values = [email, password]
  const queryPromise = new Promise((resolve, reject) => {
    const query = 'SELECT * FROM users WHERE email=$1 AND password=$2'
    db.query(query, values, (error, results) => {
      // reject the promise if the query itself failed
      if (error) {
        return reject(error)
      }
      // resolve to true only if exactly one user matches these credentials
      resolve(results.rows.length === 1)
    })
  })
  return queryPromise
}
const login = (request, response) => {
  // user login information
  const user = {
    email: request.body.email,
    password: request.body.password
  }
  verifyCredentials(user.email, user.password)
    .then((connected) => {
      if (connected) {
        // connection is validated; Express sends a 200 status code by default
        return response.json({
          message: 'connected',
        })
      }
      // otherwise the connection is refused: send an authentication error
      return response.status(401).json({
        message: 'unauthorized',
      })
    })
    .catch(() => response.status(500).json({ message: 'internal error' }))
}
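As a side note, pg's query method also returns a promise when you call it without a callback, so verifyCredentials can be written more concisely with async/await. This is just an equivalent sketch, not required for the rest of the article:

// equivalent sketch of verifyCredentials using pg's promise support
const verifyCredentials = async (email, password) => {
  const query = 'SELECT * FROM users WHERE email=$1 AND password=$2'
  const results = await db.query(query, [email, password])
  return results.rows.length === 1
}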

Don't forget to export the functions at the end of the file:

module.exports = {
  getUsers,
  login
}

And then create the login route in index.js:

app.post('/login', queries.login)

To allow your Node.js application to read the body of an HTTP request, you need the node package 'body-parser'; set it up as follows in index.js:

...
const bodyParser = require('body-parser');
// create new express app and save it as "app"
const app = express();
app.use(bodyParser.json({ limit: '50mb' }))
...
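Note that if you use Express 4.16 or later, a JSON body parser is built into Express itself, so you can drop the body-parser dependency and write the equivalent:

...
// create new express app and save it as "app"
const app = express();
// built-in JSON body parser (Express >= 4.16), equivalent to bodyParser.json
app.use(express.json({ limit: '50mb' }))
...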

We can now request our API with Postman to see if it works. Our API should return a 200 status code with the body message 'connected' on success, and a 401 status code otherwise. Let's test it with Postman (to learn how to use Postman in 4 minutes, go to this article). We can see that with good credentials it returns the right response, and with bad credentials it returns the 401 status code!
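If you prefer the terminal to Postman, the same check can be done with curl (assuming user1's email is user.un@test.com, the address used in the tests later in this article):

# expected: HTTP 200 and {"message":"connected"}
curl -i -X POST http://localhost:8080/login \
  -H 'Content-Type: application/json' \
  -d '{"email": "user.un@test.com", "password": "user1password"}'

Swap in a wrong password and you should get HTTP 401 with {"message":"unauthorized"}.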

[Screenshot] Working request: the message 'connected' appears in the response body.

[Screenshot] Failing request: the message 'unauthorized' appears in the response body.

Let's set up Jasmine in our project!

Simply run this at the root of your project (the jasmine package provides the runner we will invoke with npx):

npm install --save-dev jasmine nyc

Then create a spec folder at the root, and inside it a file called test.spec.js.

Your file architecture should look like this:
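Based on the files we have created so far, the layout should look roughly like this (a sketch; db_config.js is the database config file we will edit later in this article):

.
├── node_modules
├── db_config.js
├── index.js
├── package.json
├── queries.js
└── spec
    └── test.spec.js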

In test.spec.js, copy-paste this code:

describe('TestDb ', function () {
  it('how many users do I have in the database? should respond with a length of 3', (done) => {
    expect(1).toEqual(1)
    done()
  })
})

Now launch your newly created test:

npx jasmine spec/*.js

You should see something like '1 spec, 0 failures' in your terminal.

Well done, you created your first test!

OK, but how can we test an API route? This is what we will see below; to do it, we first need to dockerize our application.

Dockerize the application to launch API test

You can read this article if you don't know Docker; if you already work with it, you will easily understand the following explanations.

Set up Docker in our project:

If you are on Linux, you can follow this short tutorial.

In our project we use 2 services:

  • postgres
  • nodeJS (the Docker image of our app, created by the Dockerfile)

So first, let's create our Dockerfile to build an image of our Node.js application:

FROM node:12.19.0
WORKDIR /app
COPY . /app
RUN npm install
EXPOSE 8080
  • The FROM keyword indicates the base image on which the image built by this Dockerfile rests; Docker pulls the node image from Docker Hub.
  • WORKDIR /app sets the working directory to /app in our new image.
  • COPY . /app copies all files from the current local directory into the /app folder of the new image (see the .dockerignore tip after this list).
  • RUN npm install installs all the libraries used by our node program.
  • EXPOSE 8080 declares that the container running our image listens on port 8080.
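One practical tip, not strictly required here: since COPY . /app copies everything in the current directory, including node_modules, it is common to add a .dockerignore file at the root so the image stays small and dependencies are installed fresh by npm install:

node_modules
npm-debug.log
.git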

After saving the Dockerfile at the root of the project, build the image by entering this in a terminal:

docker build -t myimage .

We need to give our database to the container that will run postgres.

So we first need to dump our database; in a terminal, enter the following command:

pg_dump --create --clean -U postgres medium > /home/your_user_name/Desktop/medium.sql

--create will add the CREATE DATABASE statement and --clean will DROP the database if it already exists.

Open the file medium.sql if you are curious!

Now that we have the dump medium.sql on our Desktop, let's move it into a folder called databases at the root of our project (this is the folder that the docker-compose file below will mount into the postgres container).

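For example (assuming the dump is still on your Desktop), from the root of the project:

# create the folder that docker-compose will mount and move the dump into it
mkdir -p databases
mv /home/your_user_name/Desktop/medium.sql databases/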

I will explain below how the docker-compose file is structured and how we will use the dump.

Let's create our docker-compose file:

version: "3"
services:
  myapp:
    image: myimage
    container_name: myimage
    depends_on:
      - postgres
    ports:
      - "8080:8080"
    networks:
      - myNetwork
    command: "node index.js"
  postgres:
    image: postgres:13.1
    container_name: postgres
    environment:
      - POSTGRES_HOST_AUTH_METHOD=trust
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    volumes:
      - ./databases:/home/dumps/tests
    ports:
      - "5432:5432"
    networks:
      - myNetwork
networks:
  myNetwork:

We can see that we have created a network 'myNetwork', and that each service uses this network. With networks, services can reach each other by using the container name as a hostname instead of an IP address; I will explain this below, don't worry ;).

We can also notice that we expose the ports of our services (8080:8080 and 5432:5432).

And we mount our medium dump in the volumes of the postgres service:

- ./databases:/home/dumps/tests

This line means that the databases folder at the root of our project is mounted into the container at the path /home/dumps/tests.

If you don't fully understand the Docker explanations, I strongly recommend you check out my article on Docker.

We can verify that everything works by launching our docker-compose file and checking that our dump is present in the postgres container at the path /home/dumps/tests.

So, at the root of the project:

docker-compose up -d
docker ps

You should see your 2 containers, like this:


Then we enter the postgres container to check the path:

docker exec -it postgres bash
cd home/dumps/tests
ls

And you will see medium.sql

Problems you may encounter when launching your docker-compose

  • When you launch your docker-compose file, stop the postgres service running locally: we mapped the postgres container to port 5432, so if that port is already taken by the local postgres it will fail (see the check below if you are unsure what is using the port). To stop your local postgres, enter in a terminal:

sudo service postgresql stop

Then relaunch your docker-compose with docker-compose up.
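If you are not sure what is holding port 5432, you can check on Linux with:

# list the processes listening on port 5432
sudo lsof -i :5432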

If both containers start this time, it means that everything is working well!

As I said previously, with networks in docker-compose, containers communicate using container names instead of IP addresses. So, to make our app work, we need to replace the host localhost with the container name postgres in the database config file.

Modify the file db_config.js:

// PostgreSQL config
const configPG = {
  user: 'postgres',
  host: 'postgres',
  database: 'medium',
  password: 'postgres',
  port: '5432'
}
const Pool = require('pg').Pool
const db = new Pool(configPG)
module.exports = { db }
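A small refinement you may want (my suggestion, not part of the original code): read the host from an environment variable, for example a hypothetical DB_HOST, so the same file works both locally and in Docker:

// host taken from the (hypothetical) DB_HOST variable, defaulting to the docker service name
host: process.env.DB_HOST || 'postgres',

You could then run DB_HOST=localhost node index.js when working outside Docker.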

You modified your code, so you need to rebuild your image:

docker build -t myimage .

Relaunch your docker-compose:

docker-compose up -d

Now we need to load our database dump into the postgres container:

docker exec postgres psql -U postgres -h postgres -f /home/dumps/tests/medium.sql

Go to http://localhost:8080/users and if you see your data, it worked: your app now runs in Docker!

Launch API tests on Docker

Stop the docker-compose containers:

docker-compose down

Let's test our login route in Docker!

To do this, we need tests for the login route, so copy-paste the following code into test.spec.js. These are the same tests we did manually with Postman in the first part, but now we run them programmatically.

var request = require('request')
var baseUrl = 'http://localhost:8080'

// authentication routes
describe('login ', function () {
  it('login function should log the user in and return 1 argument msg connected and status code 200', function (done) {
    var User = {
      email: 'user.un@test.com',
      password: 'user1password',
    }
    var options = {
      method: 'POST',
      url: baseUrl + '/login',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(User)
    }
    request(options, function (error, response) {
      if (error) throw new Error(error)
      const responseLength = Object.keys(JSON.parse(response.body)).length
      expect(responseLength).toBe(1)
      expect(JSON.parse(response.body).message).toEqual('connected')
      expect(response.statusCode).toBe(200)
      done()
    })
  })

  it('login with a bad password, should fail, return a 401 status code and 1 argument msg unauthorized', function (done) {
    var User = {
      email: 'user@dixhuit.test',
      password: 'wrongpassword'
    }
    var options = {
      method: 'POST',
      url: baseUrl + '/login',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(User)
    }
    request(options, function (error, response) {
      if (error) throw new Error(error)
      const responseLength = Object.keys(JSON.parse(response.body)).length
      expect(responseLength).toBe(1)
      expect(JSON.parse(response.body).message).toEqual('unauthorized')
      expect(response.statusCode).toBe(401)
      done()
    })
  })
})

describe('TestDb ', function () {
  it('how many users do I have in the database? should respond with a length of 3', (done) => {
    expect(1).toEqual(1)
    done()
  })
})

Notice that you need to install the request library, so:

npm install --save request
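A quick note: the request package was deprecated in February 2020. It still works fine for this tutorial, but if you prefer a maintained client, the first test could be written with axios instead (a sketch, not the code used in the rest of this article):

var axios = require('axios')

it('login function should log the user in and return 1 argument msg connected and status code 200', function (done) {
  axios.post(baseUrl + '/login', {
    email: 'user.un@test.com',
    password: 'user1password'
  }).then(function (response) {
    // axios parses the JSON body and exposes the status directly
    expect(Object.keys(response.data).length).toBe(1)
    expect(response.data.message).toEqual('connected')
    expect(response.status).toBe(200)
    done()
  }).catch(done.fail)
})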

So once again, save your work and rebuild your image (in the next article, I will show you how to automate this process when launching the tests):

docker build -t myimage .

Then launch your docker-compose:

docker-compose up

And then run your tests in a terminal:

npx jasmine spec/*.js

If you see '3 specs, 0 failures', well done: you launched your API tests in Docker, and this is a big step!

In part 3, we will see how to automate the tests at each push by creating pipelines on GitLab! Part 3 is coming next week, so stay tuned.
