Building a simple expense tracking API Part 1 — Setting it up

Using Knex.js, Objection.js, Restify and ES7

Pritoj Singh
16 min read · Jul 1, 2018

My hunt for a good Node ORM started when I decided to ditch the dumpster fire that is Sails.js, and Waterline with it. This is the story of one such ORM, called Objection.js. It has been great so far. In this part, I would like to just set up the basic stuff (the ORM, the login, etc.) to get a basic understanding of the ORM. In part 2 we'll move on to more complex stuff.

What are we building? We will be building a simple-ish spend-tracking application.

The motivation for this is simple: I want to provide a non-trivial API example which can be used as a template for complex projects. When I was starting my journey, I really ached for project examples which explained the decisions made. This is my attempt to rectify the situation.

All the code can be found on GitHub.

We start with our trusty npm init. Enter all your details.

Now we install our two important packages:

  • Knex.js: A very powerful SQL query builder. It'll help us create and maintain migrations, and it'll help our ORM build its queries. We'll also use the knex-cli to create and manage our migrations
  • Objection.js: This is our ORM. It'll help us interface with the db in an easy manner

One command to install them all?
Run this in your terminal: npm install knex objection
To install knex-cli, run npm install -g knex

Well now that we have the packages installed, let’s get to work!

First we create a knexfile.js. This file helps the knex-cli connect to your db and provides a few other options.

Run knex init to create a knexfile.js. By default it should look something like this

You can use any database you want. For simplicity's sake, we'll use sqlite3. Now that your database is set up, it's time to create your first migration.

Migrations

I mean, these have been a touchy topic ever since Trump came to power, but oh well, those are not the migrations we’ll talk about here.

Before we proceed to migrations, a bit about them. Migrations are just a way to keep your db structure in sync. So instead of having a bunch of spaghetti SQL, you have a set of files. Since applying them is deterministic, you'll always end up with the same structure.
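To make that concrete, here's a toy sketch (my own illustration, not how knex actually implements it) of what a migration runner does: apply each migration's up exactly once, and record what has already been applied so re-running is a no-op.

```javascript
// Each migration has a name plus up/down functions; the runner
// records applied names so re-running changes nothing.
const migrations = [
  {
    name: '001_create_users',
    up: (db) => { db.tables.users = []; },
    down: (db) => { delete db.tables.users; }
  }
];

function migrateLatest(db, applied) {
  for (const m of migrations) {
    if (!applied.includes(m.name)) {
      m.up(db);
      applied.push(m.name);
    }
  }
}

const db = { tables: {} };
const applied = [];
migrateLatest(db, applied);
console.log('users' in db.tables); // true
migrateLatest(db, applied);        // second run is a no-op
console.log(applied.length);       // 1
```

Knex does essentially this bookkeeping for you, storing the applied-migrations list in a table in your database.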

Thanks to knex-cli you can manage migrations fairly easily. Before we actually create the migrations, let’s first create a directory to house them. You must edit the knexfile.js to add a migrations directory.

development: {
  client: 'sqlite3',
  useNullAsDefault: true,
  connection: {
    filename: './dev.sqlite3'
  },
  migrations: {
    directory: './db/migrations'
  }
},

Please also create the db/migrations directory in your project root. Now all you have to run is the following command: knex migrate:make users. What we are asking knex to do is create a migration with the name users. Once you do that, you'll get a file in the migrations directory ending with users.js.

You’ll see something like this in the file

exports.up = function (knex, Promise) {

};

exports.down = function (knex, Promise) {

};

This is the default template knex creates for you. The up function will run the migration: in it we make the changes which take us to our desired db state. In our case, we just want to create a table of users, so we'll go ahead and do that now.

exports.up = function (knex, Promise) {
  return knex.schema.createTable('users', (t) => {
    t.increments();
    t.string('username');
    t.string('email');
    t.string('password');
    t.timestamps();
  });
};

exports.down = function (knex, Promise) {

};

Once you add those lines, whenever the migration is run, knex creates the users table. You may be wondering what those functions are and what voodoo this is; if so, I suggest you head to the Knex.js website's Schema Building section.

Now for the down function. The down function is run when you want to undo a migration. The idea, generally, is that you can go up or down one migration at a time. Unfortunately knex doesn't support stepping down a single migration, but it does allow you to roll back your latest batch of migrations. So this function is what undoes a migration. In our case, we'll just ask it to drop the table.

exports.up = function (knex, Promise) {
  return knex.schema.createTable('users', (t) => {
    t.increments();
    t.string('username');
    t.string('email');
    t.string('password');
    t.timestamps();
  });
};

exports.down = function (knex, Promise) {
  return knex.schema.dropTable('users');
};

Yay! You made your first migration in knex. Time to run it. If everything is correct, running the migration with knex migrate:latest should be successful and you should get a message along the lines of the one below.

Using environment: development
Batch 1 run: 1 migrations
/path/to/your/project/db/migrations/20180628184931_users.js

Congratulations, you’ve run your first migration. For the remainder of this part, we’ll only need this migration. In part 2 we’ll create more migrations.

Models

Models are HOT,

Models are IN,

Models have been everywhere you haven’t been. (I’m sorry but I must force jokes in to look cool)

What is a model? In the most basic sense it's a description of what your db should look like. Or rather, what one part of your db should look like. In our case, it's an ES7 understanding of what your users table looks like and how to interact with it, so that you can use JavaScript instead of SQL to interact with your db. It's been called a lot of things: ORM, schema, model. Doesn't matter. The basic stuff is the same.
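To make the idea concrete, here's a toy sketch (my own illustration, nothing like Objection's real internals) of how a model's tableName can drive query generation, so you write JavaScript instead of SQL:

```javascript
// A minimal "base model" whose query() builds SQL from the
// subclass's tableName getter.
class Model {
  static query() {
    return `select * from ${this.tableName}`;
  }
}

class Users extends Model {
  static get tableName() {
    return 'users';
  }
}

console.log(Users.query()); // select * from users
```

A real ORM of course builds far richer queries, but the principle is the same: the model describes the table, and the library turns your method calls into SQL.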

Objection.js helps you do that. How you ask, well, let me show you.

Before we create the model, we need to set up Objection. Essentially, Objection uses Knex.js in the background, so we must set up knex and then pass it to Objection. First create a file in the db folder called index.js. Add the following to it:

const { Model } = require('objection');
const Knex = require('knex');

// Initialize knex.
const knex = Knex({
  client: 'sqlite3',
  useNullAsDefault: true,
  connection: {
    filename: 'example.db'
  }
});

// Give the knex object to objection.
Model.knex(knex);

module.exports = {
  knex,
  Model
};

Let's see, one by one, what's happening here. First we just set up knex. Then, using the .knex method, which accepts (guess whaaaaat?) a knex instance, we pass the created knex to Objection. This is vital, because otherwise Objection can't do anything. It's important to realize that Objection uses knex to actually write the SQL; Objection is just an abstraction over the query builder used in Knex.

Then we export the Model to which knex was passed. Now technically, we don't need to use this particular Model object, but I do.

Next we create the db/models/Users.js file. This file will contain the actual model. For now it'll only require the table name. Populate it with the following:

const { Model } = require('../index');

class Users extends Model {
  static get tableName() {
    return 'users';
  }
}

module.exports = Users;

Great! Now you have a model! Do you find it hot? (It doesn’t mind being called it since it’s an Object.. Get it ;) )

Now that you have this basic setup, time to build a route.

Before we do that, we set up bin/www with this:

const restify = require('restify');
const bunyan = require('bunyan');

const log = bunyan.createLogger({ name: 'Spendster Backend' });

// Create the restify server
const server = restify.createServer({
  name: 'Spendster Backend',
  log
});

server.listen(process.env.PORT || 3000);

This doesn't do much; it just creates the restify server. Also run yarn add bunyan.

Now create app.js in your project root and fill it with the following

const restify = require('restify');

/**
 * This adds parsers and routes to the server.
 * Basically bootstraps the app.
 * @param {Restify} server instance
 */
module.exports = (server) => {
  // Add all the frigging parsers
  server.use(restify.plugins.queryParser());
  server.use(restify.plugins.bodyParser());
};

You may have noticed how this exports a function. Technically we could put this stuff in the bin/www file and it would still work. However, this helps create some separation: while bin/www cares about the nitty-gritty of how the server runs (the port, logging, etc.), this file configures how the server works internally.

For now, this adds all the parsers to the server. Let's talk parsers for a minute.

Parsers

A parser is simply a tiny function which helps you make sense of the input you receive. For example, say you receive a message in Morse code. You write a function to convert the Morse code to its English translation. Then you add a dictionary on top of it to make sure whatever you're getting is valid and not gibberish. That's what a parser does.

e.g. the bodyParser parses the request body. Say you send a request with the following body:

{ "whoIsAmazing" : "You are dummy!" }

You know this is JSON. I know this is JSON. But does the server know? I think not! The server reads it as a string; it's not an object yet. What the server sees is this:

"{ \"whoIsAmazing\" : \"You are dummy!\" }"

What does the body parser do? It basically runs JSON.parse on it. I'm not sure if it actually does exactly that, but for understanding, it's a good enough mental model.
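You can see the idea in plain Node:

```javascript
// What the server receives is just a string; the body parser turns
// it into an object, roughly the way JSON.parse does.
const rawBody = '{ "whoIsAmazing" : "You are dummy!" }';

const parsed = JSON.parse(rawBody);
console.log(parsed.whoIsAmazing);           // You are dummy!
console.log(typeof rawBody, typeof parsed); // string object
```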

Now that you've set up the parsers, time to hook this up to the server.

Edit the bin/www file to this

const restify = require('restify');
const bunyan = require('bunyan');

const app = require('../app');

const log = bunyan.createLogger({ name: 'Spendster Backend' });

// Create the restify server
const server = restify.createServer({
  name: 'Spendster Backend',
  log
});

// Pass it to app for further config
app(server);

server.listen(process.env.PORT || 3000);

And that’s more or less it. Your server is setup. Now all you need to do is add the routes.

Routing

Routes are simply the URLs which you will hit. Generally you can just add a route in the app.js file like so:

function respond(req, res, next) {
  res.send('hello ' + req.params.name);
  next();
}

server.get('/hello/:name', respond);

While this is okay for example projects, managing this can become really difficult really fast. So instead, we use a router called restify-router. Essentially, what it allows me to do is separate my routes into smaller files. So now you should go ahead and create a routes folder.
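To see what a router like this buys you conceptually, here's a toy sketch (hypothetical, not restify-router's actual code) of collecting routes in one file and applying them to a server later:

```javascript
// Each route file collects its routes locally; applyRoutes
// registers them all on the real server in one go.
class MiniRouter {
  constructor() { this.routes = []; }
  get(path, handler) { this.routes.push({ method: 'get', path, handler }); }
  applyRoutes(server) {
    for (const r of this.routes) server[r.method](r.path, r.handler);
  }
}

// A fake "server" that just records registrations.
const registered = [];
const fakeServer = { get: (path, handler) => registered.push(path) };

const router = new MiniRouter();
router.get('/hello', () => {});
router.applyRoutes(fakeServer);
console.log(registered); // [ '/hello' ]
```

The point is the separation: route definitions live next to the code they concern, and the server only sees them when you apply them.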

In that folder we’ll add our first route. Add a file called routes/user.js. Fill the file with the following

/**
 * This file contains all the user routes
 */

const Router = require('restify-router').Router;
const routerInstance = new Router();

// This is just a test route
routerInstance.get('/hello', (req, res) => {
  res.json({ hello: 'world' });
});

module.exports = routerInstance;

Now create a file routes/index.js. This is the file which will bring all your routes together. Fill it with this

/**
 * This function imports all the routes and adds them
 * to the server provided.
 * @param {Restify} server instance
 */

const userRoutes = require('./user');

const addRoutes = (server) => {
  // Add the user routes
  userRoutes.applyRoutes(server);
};

module.exports = addRoutes;

This file just exports a function which takes the server as input and applies the routes to it. So we just import the routerInstance exported from the routes/user.js file and apply those routes to the passed-in server instance.

After this, just import the routes into your app.js file and call the addRoutes function to add these routes.

const restify = require('restify');

const addRoutes = require('./routes');

/**
 * This adds parsers and routes to the server.
 * Basically bootstraps the app.
 * @param {Restify Server Instance} server
 */
module.exports = (server) => {
  // Add all the frigging parsers
  server.use(restify.plugins.acceptParser(server.acceptable));
  server.use(restify.plugins.queryParser());
  server.use(restify.plugins.bodyParser());

  // Add all the routes
  addRoutes(server);
};

Now you can start the server using node ./bin/www and open your browser to localhost:3000/hello. You should see output like this:

{"hello":"world"}

Congratulations! You’ve got the server running!

All the code up to this point is available here.

It can get boring to keep running the node command, so just add it as a script to the package.json file like this:

{
  "name": "spendster-backend",
  "version": "0.0.1",
  "description": "Spend tracking app",
  "main": "index.js",
  "scripts": {
    "start": "node ./bin/www",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Pritoj Singh",
  "license": "ISC",
  "dependencies": {
    "bunyan": "^1.8.12",
    "knex": "^0.14.6",
    "objection": "^1.1.10",
    "restify": "^7.2.1",
    "restify-router": "^0.5.0",
    "sqlite3": "^4.0.1"
  }
}

Phew! That was a lot of stuff. Take a breather.

Create a user

Now that you have a running server, time to add a way for a user to be created. For this we create a handler first. Create a file handlers/Users.js. In this file add the following:

const Users = require('../db/models/Users');

const addUser = async (username, email, password) => {
  // Hash the password for storage

  // Add the user.
  let user = await Users.query().insert({
    username,
    email,
    password: passwordHash
  });

  // return the user object
  return user;
};

module.exports = {
  addUser
};

As you can see, we don’t have a way to hash the password yet. We’ll use bcrypt to hash passwords before storage. Run yarn add bcrypt. Create a helper file called helpers/crypto.js and fill the file with the following

const bcrypt = require('bcrypt');

/**
 * This function uses bcrypt to hash a string.
 * Mostly used for passwords.
 *
 * @param {String} str
 */
const hashString = async (str) => {
  return bcrypt.hash(str, 10);
};

module.exports = {
  hashString
};

This simple function can be used to hash a string. For more info on why we use bcrypt, please check here.

Now update the handlers/Users.js file to the following.

const Users = require('../db/models/Users');
const cryptoHelper = require('../helpers/crypto');

const addUser = async (username, email, password) => {
  // Hash the password for storage
  let passwordHash = await cryptoHelper.hashString(password);

  // Add the user.
  let user = await Users.query().insert({
    username,
    email,
    password: passwordHash
  });

  // return the user object
  return user;
};

module.exports = {
  addUser
};

Now you have a method to add the user. But how will you call it? Fear not! Open up the routes/user.js file and remove the earlier route we had added.

Edit the file to look like this

/**
 * This file contains all the user routes
 */

const Router = require('restify-router').Router;
const { InternalServerError } = require('restify-errors');

const routerInstance = new Router();
const userHandler = require('../handlers/Users');

// This is a route to add new users
routerInstance.post('/', async (req, res, next) => {
  // Extract the required params from the request body
  const {
    username,
    email,
    password
  } = req.body;

  req.log.info('Adding new user');

  try {
    // Insert the user record
    let userInsert = await userHandler.addUser(username, email, password);
    req.log.info(userInsert);
    res.json({ success: true });
  }
  catch (e) {
    req.log.error(e);
    next(new InternalServerError());
  }
});

module.exports = routerInstance;

What are we doing? Well, we are extracting the required params from the request body and calling the addUser handler we just wrote.

Try using Postman to send this request. Make sure you set the content type to application/json or this won't work as intended. If everything is done properly, this should return {"success":true} with status code 200.

All code till this point is available here.

Input Validation

You may have noticed that if you send bad data, or no data at all, to your newfangled route, the server shuts down unexpectedly. This is sub-optimal. We as programmers must assume that the user is stupid, even the API user. So we can't let any input shut us down.

First we’ll add a universal catch-all in app.js. Edit the file as shown below

const restify = require('restify');

const addRoutes = require('./routes');

/**
 * This adds parsers and routes to the server.
 * Basically bootstraps the app.
 * @param {Restify Server Instance} server
 */
module.exports = (server) => {
  // Add all the frigging parsers
  server.use(restify.plugins.acceptParser(server.acceptable));
  server.use(restify.plugins.queryParser());
  server.use(restify.plugins.bodyParser());

  server.on('restifyError', function (req, res, err, next) {
    // Handle all errors passed to next here, whether it's an Error,
    // a NotFoundError or anything else that is an instance of Error
    res.status(err.status || 500);
    res.json(err.errors);
  });

  // Add all the routes
  addRoutes(server);
};

This is good for all errors, but not very useful to the user. How do we fix this scenario? We add input validation to our routes.

We'll be using restify-api-validation along with Joi schemas. If you don't know what those are, don't worry; you'll understand soon. So let's install these packages by running

yarn add restify-api-validation joi

Now edit your routes/user.js file to add some validation to it.

/**
 * This file contains all the user routes
 */

const Router = require('restify-router').Router;
const { InternalServerError } = require('restify-errors');
const validate = require('restify-api-validation');
const Joi = require('joi');
const _ = require('lodash');

const routerInstance = new Router();
const userHandler = require('../handlers/Users');

// This is a route to add new users
routerInstance.post(
  '',
  validate({
    body: {
      username: Joi.string().required(),
      email: Joi.string().required().email(),
      password: Joi.string().required()
    }
  }),
  async (req, res, next) => {
    // Extract the required params from the request body
    const {
      username,
      email,
      password
    } = req.body;

    req.log.info('Adding new user');

    try {
      // Insert the user record
      let userInsert = await userHandler.addUser(username, email, password);

      // Omit the password field
      userInsert = _.omit(userInsert, ['password']);
      req.log.info(userInsert);
      res.json(userInsert);
    }
    catch (e) {
      req.log.error(e);
      next(new InternalServerError('Couldn\'t create user'));
    }
  }
);

module.exports = routerInstance;

As you can see from the validate block, we have added the schema to it. This is where Joi is helpful: it lets you define a schema and then check whether a supplied object is valid. If not, it returns some errors.
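Conceptually, what Joi does for us can be sketched in a few lines of plain JavaScript (a toy approximation, far simpler than Joi's real behaviour):

```javascript
// Check the same rules our schema declares: username required,
// email required and roughly email-shaped, password required.
const validateBody = (body) => {
  const errors = [];
  if (typeof body.username !== 'string') errors.push('"username" is required');
  if (typeof body.email !== 'string' || !body.email.includes('@')) {
    errors.push('"email" must be a valid email');
  }
  if (typeof body.password !== 'string') errors.push('"password" is required');
  return errors;
};

console.log(validateBody({ email: 'pritojsgmail.com', password: '12345' }));
// [ '"username" is required', '"email" must be a valid email' ]
```

Joi collects all failures rather than stopping at the first one, which is exactly what makes its error responses so useful to API consumers.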

Now if you start your server and try to send a request to the create route with the following input

{
  "email": "pritojsgmail.com",
  "password": "12345"
}

You'll get an error with the following response body and status code 400 BadRequestError:

[
  {
    "field": [
      "username"
    ],
    "location": "body",
    "messages": [
      "\"username\" is required"
    ],
    "types": [
      "any.required"
    ]
  },
  {
    "field": [
      "email"
    ],
    "location": "body",
    "messages": [
      "\"email\" must be a valid email"
    ],
    "types": [
      "string.email"
    ]
  }
]

This, as you can see, is much more helpful to the user.

User Login

Now that we have this set up, the only thing left is to add a way for the user to log in. For login we'll use Passport as our authentication library; it's pretty much the gold standard for Node.js auth. Passport was written for Express originally, so we use the passport-restify package instead.

Let’s start by creating an auth.js file in the root.

Strategies

Passport uses the concept of strategies for authentication. A strategy is simply a piece of code which helps us authenticate a request (or anything, really). So the passport-facebook strategy helps users log in via Facebook. Similarly, the passport-local strategy helps with local auth, i.e. with a username and a password. We'll be using the LocalStrategy now.

First install them by running yarn add passport passport-local.

Fill in your auth.js file with the following

const passport = require('passport-restify');
const LocalStrategy = require('passport-local').Strategy;
const {
  NotFoundError,
  UnauthorizedError
} = require('restify-errors');

const Users = require('./db/models/Users');

// Defining strategy to authenticate with email
// and password.
passport.use(new LocalStrategy(async (username, password, done) => {

  // Everything is amaze, move ahead
  return done(null, user);
}));

module.exports = passport;

So what does this strategy do? It simply looks at your request body and extracts the username and the password, which are then passed to your function.

Now, our application uses email instead of username. So we add the following config while creating the strategy to ask it to use the email field:

passport.use(new LocalStrategy({
  usernameField: 'email'
}, async (email, password, done) => {
Now instead of looking for the username field in the request body, it’ll look for the email field.

Once we have the email and password, we check whether the credentials provided are correct. We'll find a user with the given email. If we can't find one, we'll throw a 404.

passport.use(new LocalStrategy({
  usernameField: 'email'
}, async (email, password, done) => {

  // Find user
  let user = await Users.query().findOne({ email });
  if (!user) {
    // No user was found, throw 404
    return done(new NotFoundError());
  }

  // Everything is amaze, move ahead
  return done(null, user);
}));

module.exports = passport;

Once we have a user, we'll check whether the password matches. For this we'll use the compare function provided by bcrypt. Edit the helpers/crypto.js file to add the following function:

/**
 * This function checks if the given string and the
 * hashed string are the same.
 *
 * @param {String} str The string to be checked
 * @param {String} hash The encrypted string
 */
const compareHash = async (str, hash) => {
  return bcrypt.compare(str, hash);
};

Don’t forget to export it from the file by adding it to module.exports.

Edit the auth.js file as follows

const passport = require('passport-restify');
const LocalStrategy = require('passport-local').Strategy;
const {
  NotFoundError,
  UnauthorizedError
} = require('restify-errors');

const Users = require('./db/models/Users');
const cryptoHelper = require('./helpers/crypto');

// Defining strategy to authenticate with email
// and password.
passport.use(new LocalStrategy({
  usernameField: 'email'
}, async (email, password, done) => {

  // Find user
  let user = await Users.query().findOne({ email });
  if (!user) {
    // No user was found, throw 404
    return done(new NotFoundError());
  }

  // Check if the password matches
  let passwordCheck = await cryptoHelper
    .compareHash(password, user.password);

  if (!passwordCheck) {
    return done(new UnauthorizedError());
  }

  // Everything is amaze, move ahead
  return done(null, user);
}));

module.exports = passport;

While this is fine and dandy, you may be wondering how it will ever be called. Don't worry. In your routes/user.js file, add the following route:

// This is a route to get a login token via username and password
routerInstance.post(
  '/token',
  validate({
    body: {
      email: Joi.string().required().email(),
      password: Joi.string().required()
    }
  }),
  passport.authenticate('local', { session: false }),
  async (req, res, next) => {
    res.json(req.user);
  }
);

You must have noticed how we've added passport.authenticate before our actual handler. This is how Restify, and even Express, works: you can add as many functions with the signature (req, res, next) to the route as you want. passport.authenticate returns just such a function. If the function runs successfully, it moves on to the next handler, which in our case is where the actual magic happens.
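The handler-chain idea itself is simple enough to sketch in plain JavaScript (a toy illustration of the (req, res, next) pattern, not restify's actual implementation):

```javascript
// Run handlers in order; each one either calls next() to continue
// or next(err) to stop the chain with an error.
const runChain = (handlers, req, res) => {
  const next = (i) => (err) => {
    if (err) { res.error = err; return; }
    if (i < handlers.length) handlers[i](req, res, next(i + 1));
  };
  next(0)();
};

// A fake auth middleware and a final handler, mirroring
// passport.authenticate followed by our route handler.
const authenticate = (req, res, next) =>
  req.token === 'good' ? next() : next(new Error('unauthorized'));
const handler = (req, res, next) => { res.body = 'magic happens'; next(); };

const res1 = {};
runChain([authenticate, handler], { token: 'good' }, res1);
console.log(res1.body); // magic happens

const res2 = {};
runChain([authenticate, handler], { token: 'bad' }, res2);
console.log(res2.error && res2.error.message); // unauthorized
```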

The 'local' in our auth call tells Passport which strategy to use, and {session: false} tells it not to create a session with this authentication method. We don't want sessions; we'll just REST ;)

In the response you may see that even the password field, with the hashed password, is being returned. So let's remove that first. Install lodash by running yarn add lodash. If you don't know what lodash is, it's just a simple library of common helpful functions.
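The _.omit helper we'll use is easy to picture in plain JS: copy an object minus the listed keys.

```javascript
// A sketch of what _.omit does (lodash's real version handles
// more edge cases): build a new object without the given keys.
const omit = (obj, keys) =>
  Object.fromEntries(
    Object.entries(obj).filter(([key]) => !keys.includes(key))
  );

const user = { id: 1, email: 'a@b.com', password: 'some-bcrypt-hash' };
console.log(omit(user, ['password'])); // { id: 1, email: 'a@b.com' }
```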

Now edit the route like so

const _ = require('lodash');

// This is a route to get a login token via username and password
routerInstance.post(
  '/token',
  validate({
    body: {
      email: Joi.string().required().email(),
      password: Joi.string().required()
    }
  }),
  passport.authenticate('local', { session: false }),
  async (req, res, next) => {
    // Take the user data sent, remove the password
    let userData = _.omit(req.user, ['password']);
    res.json(userData);
  }
);

Well, now instead of just sending the info back, you want to send the API client something with which it can make further requests. This is the user token. To create it we'll use the jsonwebtoken library. Add it by running yarn add jsonwebtoken.
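Before we generate one, it helps to see what a JWT actually is: three base64url-encoded parts joined by dots, header.payload.signature. The payload is plain JSON readable by anyone; the signature made with your secret is what makes it tamper-evident. A quick sketch with a made-up token (no real signing here, the signature is a fake placeholder):

```javascript
// Encode an object the way JWT encodes its parts.
const b64url = (obj) =>
  Buffer.from(JSON.stringify(obj)).toString('base64url');

const header = { alg: 'HS256', typ: 'JWT' };
const payload = { id: 1, email: 'user@example.com' };
const fakeToken = `${b64url(header)}.${b64url(payload)}.fake-signature`;

// Decoding the middle part gets the data back without any secret.
const decoded = JSON.parse(
  Buffer.from(fakeToken.split('.')[1], 'base64url').toString()
);
console.log(decoded.email); // user@example.com
```

This is why you should never put secrets (like a password hash) in the payload: signing proves the token wasn't tampered with, but it does not hide its contents.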

Now edit your helpers/crypto.js file to add JWT creation, and export the function.

const jwt = require('jsonwebtoken');

const generateWebToken = (data) => {
  return new Promise((resolve, reject) => {
    try {
      // Try if the signing works.
      let token = jwt.sign(JSON.stringify(data), 'MySecretJwtKey');
      resolve(token);
    }
    catch (e) {
      reject(e);
    }
  });
};

Now we simply edit the route like so (don't forget to require helpers/crypto as cryptoHelper at the top of the file):

// This is a route to get a login token via username and password
routerInstance.post(
  '/token',
  validate({
    body: {
      email: Joi.string().required().email(),
      password: Joi.string().required()
    }
  }),
  passport.authenticate('local', { session: false }),
  async (req, res, next) => {
    // Take the user data, remove the password
    let userData = _.omit(req.user, ['password']);

    // Generate token
    let token = await cryptoHelper.generateWebToken(userData);
    res.json({ token, userData });
  }
);

Finally, edit app.js to require the auth module and initialize Passport:

const passport = require('./auth');

server.use(passport.initialize());

If you try to run the server and hit this route with the correct credentials, you should see the token and user object.

All the code up till this point is available here.

It's been a long tutorial, and I hope it was helpful. Please leave comments below, and I'll make sure to add the second part really soon, where I talk about further code.

You may have noticed there is no spends stuff yet. That's because this is the setting-up article.

W̶i̶l̶l̶ ̶r̶e̶t̶u̶r̶n̶ ̶w̶i̶t̶h̶ ̶m̶o̶r̶e̶ ̶o̶n̶ ̶t̶h̶e̶ ̶s̶p̶e̶n̶d̶s̶ ̶s̶t̶u̶f̶f̶ ̶s̶o̶o̶n̶.̶

You can checkout Part 2 here.

Thanks
