No-Frills Guide to Creating a REST API with Node (Part 1)
Simply put, you can get lots done with an API.
You can embed the functionality of a full SaaS platform into an API—pair a REST API with a Vue.js app, and you’ve got an entire product.
In this straightforward series of posts, I’ll talk about how to build an API using Node. There is an abundance of articles that cover this topic, and many present a way to build an API, but most of them don’t go into the simple reasoning behind why things are set up the way they are.
Things can be set up many ways, so the why is very important.
To address the why, let’s set up a goal and some guidelines. The goal is to create a REST API that powers a todo app. We want to achieve this as simply as possible, in a way that allows us to easily:
- Add to the API
- Document the API
- Modify the API
- Analyze the usage of the API
- Test the API
Diving Right In
The first post — this one — will cover how to create a minimal API with configs, some security, logging, and linting. Future posts will build on this foundation until the end goal is achieved. If you’re interested, make sure to follow me.
The topics that will be covered in the series are:
- Minimal API example, dev workflow, configs, logging, eslint (this post)
- Creating Controllers, versioning, and using the debugger
- Models, documentation, and manual testing with Swagger
- Boot, utility methods, and services
- Error handling, rate limiting, and remote logging, paging
- Users, authentication and security
- Analytics
- Benchmarking
- Internationalization
- Going live (Nginx, SSL, CI)
Putting APIs to Work
There are many options for creating APIs, and REST is just one, but it is a stable and easily understood concept. Pairing a REST API with a Single Page App is a great choice for creating a custom product. There are architectural alternatives, like the Universal App, but there is something straightforward and simple about the separation of API and app.
APIs often need a database. We’re going to use Mongo, since it is a good database for exploration, and we’ll use the Mongoose ODM, since it makes creating and saving documents easy. Using raw Mongo is a strong choice too, but an ODM will save us some typing in the long run, and it’ll help organize the contents of our collections.
We’ll use express and a custom setup. There are alternatives to express, but express is still the most popular base web app framework for node, so its ecosystem of middlewares is vast. There are some nice full stack frameworks out there too, but we want to keep things simple. Using a full framework adds a layer of complexity that is not needed in our use case.
There is a companion GitHub repo for this project at: https://github.com/pajtai/rest . If you clone the repo (git clone git@github.com:pajtai/rest.git), you can check out the commits referenced as you read (make sure to run npm i after each checkout). This lets you try out the app in its various stages.
Take a look at the README of the repo for a list of the technologies used throughout the series.
A Minimal API
Let’s get going. First we’ll create an empty directory and initialize it:
mkdir rest
cd rest
npm init -y
npm i express
If you want to version control your work, don’t forget to git init:
git init
echo node_modules > .gitignore
git add .
git commit -am "initial"
We’ll create a minimal express app. It’s a good idea to have an endpoint that returns the API version; this can save you a lot of headaches when you’re trying to debug things later. So this minimal API will have a single endpoint for the version:
'use strict';
const express = require('express');
const app = express();
const { version } = require('./package');

console.log('starting app');

app.get('/version', (req, res) => {
    res.send({ version });
});

app.listen(3333, () => {
    console.log('listening at http://localhost:3333');
});
Start it with node index.js and open your browser to http://localhost:3333/version to test it.
Handling 404s
If you open http://localhost:3333 instead of the version endpoint, you’ll get a “Cannot GET /” error that doesn’t look like a legit API response. Let’s add a proper JSON 404 response by typing the following:
app.get('*', (req, res) => {
    res.status(404).send({ message: 'Resource not found' });
});
This is a general pattern in express. The order routes are added in is important: express tries them in order, and once a route handles the request, the rest are skipped. So if we reach the catch-all route at the end, it means nothing else matched, and we send a 404.
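To make the ordering behavior concrete, here’s a toy sketch of “first match wins” dispatch. This is an illustration only, not express internals:

```javascript
// Toy sketch of express-style route matching: handlers are tried in order,
// and the first match wins -- which is why the catch-all goes last.
const routes = [
    { path: '/version', handler: () => ({ version: '1.0.0' }) },
    { path: '*',        handler: () => ({ message: 'Resource not found' }) },
];

function dispatch(path) {
    for (const route of routes) {
        // '*' matches anything, so putting it last makes it the fallback
        if (route.path === path || route.path === '*') {
            return route.handler();
        }
    }
}

console.log(dispatch('/version')); // { version: '1.0.0' }
console.log(dispatch('/nope'));    // { message: 'Resource not found' }
```

If the catch-all were listed first, it would swallow every request, including /version.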
Nodemon
To try out the changes, you have to stop node and restart it. I find this annoying, so let’s set up a dev workflow that lets us edit files and have the app restart automatically. Nodemon is a good option for this. Often, people install utility npms like nodemon globally. However, it is a pain to manage these global dependencies: you have to reinstall all of them when you upgrade node. We can let package.json handle it for us, since packages installed locally are available from the scripts in package.json. So to add nodemon we do:
npm i -D nodemon
Then we add the following line to the “scripts” section in package.json:
"dev": "nodemon index.js",
Now we can work on the app using
npm run dev
And as we change files, the app will auto reload.
You still have to reload the page you’re looking at. This is usually fine for APIs. Adding browser-sync would be an option for automatic reloading, but you’ll run into issues with the timing of the app restart and the browser refresh. It’s possible to work around, but the solution would be too complicated for my taste, so unless it really annoys you, let’s move on.
Logging
The next thing to tackle in terms of general setup is logging. Console logs work okay, but you don’t get a timestamp unless you manually add one. Also — as your app grows — you’ll want to have some logs you only show for debugging and some logs you show all the time. You can achieve this using log levels, and you’ll probably want different logging levels in different environments. Standard logger npms have both timestamps and log levels built in. A nice option for logging is pino. Pino has log levels, timestamps, outputs JSON for easy parsing, and a bunch of other stuff. Of course, you don’t want to read raw JSON, so pino has a CLI to pipe the output through to make it look pretty. Let’s add some logging:
npm i pino
We have to instantiate our logger and specify the minimum logging level to show:
const pino = require('pino');
const logger = pino({
level: 'debug'
});
Now we can log using various levels:
logger.info('starting app');
And to get the pretty logging, we modify package.json:
"scripts": {
"dev": "nodemon index.js | pino",
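To make the idea of a minimum level concrete, here is a toy sketch of the filtering mechanism. The numeric values match pino’s levels, but this is not pino’s implementation:

```javascript
// Toy illustration of log-level filtering -- NOT pino's implementation,
// though the numeric level values are the ones pino uses.
const LEVELS = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 };

function makeLogger(minLevel) {
    const threshold = LEVELS[minLevel];
    const log = (level) => (msg) => {
        // messages below the configured threshold are silently dropped
        if (LEVELS[level] < threshold) return undefined;
        const line = JSON.stringify({ level: LEVELS[level], time: Date.now(), msg });
        console.log(line);
        return line;
    };
    return { debug: log('debug'), info: log('info'), error: log('error') };
}

const logger = makeLogger('info');
logger.debug('hidden at the info level'); // dropped
logger.info('starting app');              // printed as a JSON line
```

Bumping the level to 'debug' would let the first call through as well — which is exactly the knob we’ll move to configs next.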
Configs
We want to change the logging level for production vs local dev. Also, we want to be able to quickly change the port number and use different port numbers on different environments. These issues can be solved using dotenv.
npm i dotenv
Create a file at the root of the project called .env. One thing to pay attention to is that everything pulled in by dotenv is a string. If we need a number or a boolean, we (or someone else) have to cast it.
This is our current .env:
PORT=3333
LOG_LEVEL=debug
And we pull it onto process.env using:
require('dotenv').config();
We initialize the logger with:
const logger = pino({
level: process.env.LOG_LEVEL
});
And we can start listening like:
app.listen(process.env.PORT, () => {
    logger.info(`listening at http://localhost:${process.env.PORT}`);
});
Express is smart enough to cast to a number for the port.
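To see the string gotcha in action, here’s a tiny sketch, with the env var set inline to simulate what dotenv would have loaded:

```javascript
// Everything dotenv puts on process.env is a string, even if it looks numeric.
process.env.PORT = '3333'; // simulating what dotenv would load from .env

console.log(typeof process.env.PORT); // 'string'
console.log(process.env.PORT + 1);    // '33331' -- string concatenation!

// Cast explicitly when you actually need a number:
const port = Number(process.env.PORT);
console.log(typeof port); // 'number'
console.log(port + 1);    // 3334
```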
Security
Let’s add some security using helmet:
npm i helmet
And we can use it as middleware on our app:
const helmet = require('helmet');
app.use(helmet());
Note that requiring helmet pulls in a function that you must call to get the instance of the middleware.
We should also run npm audit on the project with each deploy.
Linting
Finally, it’s really easy to make syntax errors when writing JavaScript. ESLint can help catch a lot of these for you, and it will save you hours of debugging. Let’s add ESLint to our workflow using its CLI helper. There are many ESLint configs you can extend as a base, but it’s simplest to just have ESLint analyze our code and generate rules based on it:
» npm install eslint --save-dev
» ./node_modules/.bin/eslint --init
? How would you like to configure ESLint? Inspect your JavaScript file(s)
? Which file(s), path(s), or glob(s) should be examined? index.js
? What format do you want your config file to be in? JSON
? Are you using ECMAScript 6 features? Yes
? Are you using ES6 modules? No
? Where will your code run? Node
? Do you use JSX? No
Determining Config: 100% [==============================] 0.3s elapsed, eta 0.0s
Enabled 251 out of 257 rules based on 1 file.
Successfully created .eslintrc.json file in ./rest
Adding the following to “scripts” in our package.json will run ESLint with npm run lint:
"lint": "eslint .",
Try deleting a semicolon and running npm run lint, and you’ll see the error.
Getting our dev script to lint is a little trickier. We can make use of nodemon’s exec option:
"dev": "nodemon index.js --exec 'npm run lint && node' | pino",
One thing to note is that piping strips the colors from ESLint’s output. This is a common characteristic of command line tools. We can force ESLint’s colors to show up by adding the --color option:
"scripts": {
"dev": "nodemon index.js --exec 'npm run lint && node' | pino",
"lint": "eslint --color .",
Try npm run dev and delete a semicolon. Note the error, add the semicolon back in, and see how the app starts back up.
Next Steps
Now we have a minimal API that returns the version number of our app and handles 404s. We’ve also got a comfortable workflow using npm run dev. It’s time to get to the real work. Let’s start adding endpoints for storing and retrieving todos. Those are new features, so when they’re done we’ll bump the minor version. Until then we’ll bump to a preminor version using npm version preminor.
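If npm version preminor is new to you, here’s a quick sketch in a throwaway directory (the /tmp path is just for illustration; npm init -y starts packages at 1.0.0 by default):

```shell
mkdir -p /tmp/version-demo && cd /tmp/version-demo
npm init -y > /dev/null                     # package.json starts at version 1.0.0
npm version preminor --no-git-tag-version   # should print v1.1.0-0
grep '"version"' package.json               # the minor bumped, plus a prerelease counter
```

The --no-git-tag-version flag just keeps npm from committing and tagging, which is handy in a scratch directory; in the real project you’d usually let it tag.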
Be sure to follow me on Medium, and stay tuned for the next post.