The Modern API, Part 2

Adam Gall
Decent DAO
Nov 6, 2019 · 8 min read

In The Modern API, Part 1, we went over how to set up your development environment for a Node API. Here in Part 2, we’ll start writing some code. We will write some “API startup” logic, create a root route (GET /) which displays information about the API, and set up the test suite to start testing endpoints and checking responses.

TL;DR: https://github.com/decentorganization/decent-api

API Startup

Our “start” scripts in package.json execute src/index.js, so let's start there.

src/index.js

Imports the http library so we can listen and respond to requests.

Imports dotenv and configures it so that we can pull in local environment variables when running the API.

Imports debug, a nice library to assist with debug logging, and creates an instance of it.

Imports two functions we've created elsewhere in the repo, api and databaseSetup, which get us an Express API and a database, respectively (shocking).

Defines a function called normalizePort which tries its hardest to return an integer based on its input.
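
I'm not going to paste the whole file here, but a rough sketch of how those pieces fit together might look like the following. The debug namespace, the export styles of api and databaseSetup, and passing NODE_ENV into databaseSetup are my assumptions; the real file in the repo is the source of truth.

import http from 'http';
import dotenv from 'dotenv';
import Debug from 'debug';

import api from './api';
import { databaseSetup } from './database';

dotenv.config();
const debug = Debug('decent:api'); // namespace is illustrative

// Try hard to turn the PORT environment variable into something listen() accepts.
const normalizePort = (val) => {
  const port = parseInt(val, 10);
  if (Number.isNaN(port)) return val; // named pipe
  if (port >= 0) return port;         // port number
  return false;
};

const port = normalizePort(process.env.PORT || '3000');

// databaseSetup resolves with { manager, database }; only the database is needed here.
databaseSetup(process.env.NODE_ENV)
  .then(({ database }) => {
    const server = http.createServer(api(database));
    server.listen(port, () => debug(`listening on ${port}`));
  })
  .catch((err) => {
    debug(err);
    process.exit(1);
  });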

Let’s see what that databaseSetup function does.

src/database/index.js

Imports debug and creates an instance of it for nice debug logs.

Imports path so we can safely build the path to our "migrations" directory.

Imports knex-db-manager, which is a nice library we use to manage the "database server", instead of just a single database. We can use this library to create new databases. The standard knex library operates on the concept of a database which already exists.

Imports a function called setDatabase which we'll use to build up a set of database functions to manipulate CRUD data. We won't get too deep into it in this guide; look out for Part 3, when we model out some business logic.

The single exported function called databaseSetup in here does a lot.

First, it creates a config object which is passed into the knex-db-manager's databaseManagerFactory function, to give us an instance of our database manager object, which I've called dbManager.

As we learned from src/index.js, this function returns a Promise. Let's step through it.

First, use the dbManager to createDbOwnerIfNotExist, to make sure we have proper permissions to create new databases.

Next, we get an instance of the database itself with knexInstance (which is based on the database name passed into this function and set in the config object).

If this is the first time that the API has been started, and we never previously created any database, then what’s going to happen? We need to check for this case, and then create a database if necessary. The best way I’ve found to do this is to just send a raw query to the database, and see if it errors out or not.

Next, we perform migrations, and then the last step in our promise chain is to build a little object that includes the manager (which is used in the test suite), and the database (which itself is an object that implements a bunch of specific CRUD operations that the API uses, which we'll get into in the next guide).
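
Putting that whole chain together, a sketch of databaseSetup could look roughly like this. The Postgres client, the connection and superuser settings, and the setDatabase import path are assumptions on my part, and the select 1 probe is just my stand-in for "send a raw query and see if it errors out".

import Debug from 'debug';
import path from 'path';
import { databaseManagerFactory } from 'knex-db-manager';

import { setDatabase } from './entities'; // illustrative path for the CRUD helpers

const debug = Debug('decent:database');

export const databaseSetup = (databaseName) => {
  const config = {
    knex: {
      client: 'pg',
      connection: {
        host: process.env.DB_HOST || 'localhost',
        user: process.env.DB_USER || 'postgres',
        password: process.env.DB_PASSWORD || '',
        database: databaseName,
      },
      migrations: {
        directory: path.join(__dirname, 'migrations'),
      },
    },
    dbManager: {
      superUser: process.env.DB_SUPER_USER || 'postgres',
      superPassword: process.env.DB_SUPER_PASSWORD || '',
    },
  };

  const dbManager = databaseManagerFactory(config);

  return dbManager
    .createDbOwnerIfNotExist()                 // make sure we can create databases
    .then(() => dbManager.knexInstance())      // knex connection for `databaseName`
    .then((knex) =>
      // A raw query fails if the database has never been created; create it in that case.
      knex
        .raw('select 1')
        .catch(() => dbManager.createDb(databaseName))
        .then(() => dbManager.migrateDb())     // run anything in the migrations directory
        .then(() => {
          debug(`database "${databaseName}" ready`);
          return {
            manager: dbManager,                // used by the test suite
            database: setDatabase(knex),       // CRUD helpers the API uses
          };
        })
    );
};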

src/api.js

Nice and simple in here.

We export this function called api, which takes a database object as input. First, it creates an express server instance. Then, it passes that server instance into all of those middleware functions, one by one. It also passes the database object into our storage middleware function. Finally, it returns that express server.
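
In sketch form, assuming api.js is also where the router gets registered and that the ordering below matches the real file:

import express from 'express';

import { cors, encoding, logging, storage, errors } from './middleware';
import { router } from './router';

// Build the Express API around a database object. No HTTP server here;
// that's src/index.js's job (the test suite skips it entirely).
const api = (database) => {
  const app = express();

  cors(app);
  encoding(app);
  logging(app);
  storage(app, database); // the only middleware that needs the database
  router(app);
  errors(app);            // error handler goes last so it can catch everything above

  return app;
};

export default api;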

src/middleware/index.js

Imports debug and creates an instance of it for nice debug logs, which we'll use in our error handler.

Imports express, which we'll use to pull the json middleware function off of.

Imports morgan, which is a nice library for logging HTTP requests.

Imports cors, a library that implements middleware which enables CORS support for the API.

The cors and encoding functions are simple enough. They take the express api as input, and apply the CORS and JSON middleware. This lets our API support CORS, and properly respond to JSON requests (with JSON responses).

The logging function sets up our morgan import. Basically, we don't care about logging HTTP requests when running tests. When developing locally, it prints some nice, simple, color-coded logs, and in any other environment it prints out more detailed logs.

The storage function takes the api and a db as input, and registers a middleware function which sets a "local" value, called db, to the db that was passed into the function. Then, it calls next() to continue with the middleware chain. This is crucial to the operational integrity of our API, as we now have a database which lives within each request. We don't need some global database object throughout our codebase; for every request lifecycle, we can pull it right out of the request. This allows us to set a specific database (or database mock object...) when running the app in dev, staging, production, or tests.

The errors function registers a middleware with 4 inputs. This is special syntax in Express, as it signifies that this is an "error" handling middleware. Any error thrown in the app will get caught by this middleware. We've got some logic here that crafts up specific error messages based on the type of error (whether it was expected and caught by the app logic, or some other unknown error), and which environment we're running in. Finally, it prints the error out onto the console via debug.
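
Taken together, a sketch of the middleware module might look like this. The res.locals spot for the database, the morgan formats, and the err.expected / err.status fields on errors are details I'm assuming, since the post doesn't spell them out:

import Debug from 'debug';
import express from 'express';
import morgan from 'morgan';
import corsMiddleware from 'cors';

const debug = Debug('decent:middleware');

// Enable CORS for every route.
export const cors = (api) => api.use(corsMiddleware());

// Parse JSON request bodies so handlers see req.body as an object.
export const encoding = (api) => api.use(express.json());

// No HTTP logs in tests, terse color-coded logs in development, fuller logs elsewhere.
export const logging = (api) => {
  if (process.env.NODE_ENV === 'test') return;
  if (process.env.NODE_ENV === 'development') api.use(morgan('dev'));
  else api.use(morgan('combined'));
};

// Attach the database to every request, then continue down the middleware chain.
export const storage = (api, db) => {
  api.use((req, res, next) => {
    res.locals.db = db;
    next();
  });
};

// The four-argument signature marks this as Express's error-handling middleware.
// `err.expected` and `err.status` are hypothetical markers for errors the app
// raised on purpose; anything else gets a generic message outside development.
export const errors = (api) => {
  api.use((err, req, res, next) => {
    debug(err);
    const message =
      err.expected || process.env.NODE_ENV === 'development'
        ? err.message
        : 'something went wrong';
    res.status(err.status || 500).json({ error: message });
  });
};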

src/router/index.js

The final piece of our API startup logic. The exported function router just registers a couple of routes called / and /notes. In this guide we'll see how the root / route is built. We'll cover our notes CRUD next time.

The root handler function for our / route takes in the express API instance as a parameter. Let's go check that out.
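
A sketch of that router module, assuming the notes router takes no arguments (we'll find out next time):

import { root } from './root';
import { notes } from './notes'; // covered in the next guide

// Register every route on the API. The root router gets the api instance itself
// so its controller can list all registered endpoints.
export const router = (api) => {
  api.use('/', root(api));
  api.use('/notes', notes());
};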

Root Route

Finally, our app is set up and ready to handle requests. Let’s see how the root GET / route is configured and handled.

src/router/root.js

Imports the Router function from express, so that we can create a new route handler.

Imports a function called getRoot from our root controller.

Our exported root function takes the express API as an input. It creates a new Router instance, then registers a .get method for the / route, calling getRoot(api) as the handler.
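
That's small enough to sketch out almost verbatim from the description above (the export style is my assumption):

import { Router } from 'express';

import { getRoot } from '../controllers/root';

// Create a router for the root route and hand the api instance to the controller.
export const root = (api) => {
  const router = Router();
  router.get('/', getRoot(api));
  return router;
};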

src/controllers/root.js

Imports child_process which allows us to execute commands on the underlying system.

Imports express-list-endpoints, a package that takes an express API and outputs an array describing all of the registered routes.

Builds up a version variable of the form: the version from package.json, then a +, then the short git hash of the commit that is currently checked out.

The exported getRoot function here returns a handler which sends a 200 response object containing some information about the API.

The JSON response keys include:

  • ‘👋’, with a value of ‘🌎’ (because emojis are fun)
  • 'name', with a value of the name of the project from package.json
  • 'environment', with a value of the NODE_ENV environment variable
  • 'version', with a value of that version variable that we built above
  • 'endpoints', with a value of the array that express-list-endpoints spits out
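
Here's a sketch of that controller. The exact git command and the way package.json gets read are my guesses:

import { execSync } from 'child_process';
import listEndpoints from 'express-list-endpoints';

// Reading package.json like this assumes the Babel setup from Part 1 allows JSON imports.
import { name, version as packageVersion } from '../../package.json';

// e.g. "0.0.0+d314f03". `git rev-parse --short HEAD` is my guess at the exact command.
const gitHash = execSync('git rev-parse --short HEAD').toString().trim();
const version = `${packageVersion}+${gitHash}`;

// getRoot is a factory: it takes the api instance and returns the actual request handler.
export const getRoot = (api) => (req, res) => {
  res.status(200).json({
    '👋': '🌎',
    name,
    environment: process.env.NODE_ENV,
    version,
    endpoints: listEndpoints(api),
  });
};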

For example, it looks something like this:

{
  "👋": "🌎",
  "name": "decent-api",
  "environment": "development",
  "version": "0.0.0+d314f03",
  "endpoints": [
    {
      "path": "/",
      "methods": [
        "GET"
      ]
    }
  ]
}

Cool! So if this API is running on your local machine using the default port, then hitting http://localhost:3000 in your browser will show you the above output (Firefox will even format the JSON nicely for you).

Let’s go write some tests.

Testing

No good software project is complete without a test suite. We’ll cover how to set up the test suite, then write some integration tests for our root endpoint.

When we call mocha through our test script (yarn test), it just executes any files that end in .spec.js that live in the root of our test directory. So, we'll create an index.spec.js there which will trigger the rest of our tests.

When running tests, be sure your local database server is running via docker-compose up.

test/index.spec.js

This is simple. Just import our integration tests (via a function called integration), then call it.
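
Something like this, assuming a named export:

import { integration } from './integration/index.spec';

// Fire off all of the integration tests.
integration();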

test/integration/index.spec.js

Imports dotenv and configures it in case we're not using environment variable defaults.

Imports a root function which contains our root integration tests. It also imports a notes integration test function, but we'll cover that in the next guide.

In here is where we set up some architecture for the integration tests. The exported integration function wraps everything in a describe block called "integration". Next, we define a couple of "private" variables called _testDbManager and _testApi, which will hold instances of the database server manager, and an express API.

The before block, which is called before any tests in the current describe block are executed, calls databaseSetup with the name "test" (from the NODE_ENV environment variable). This will create a new database called "test", then return the database manager details to the resolved promise. We'll take the manager and save it into _testDbManager, then pass the database into api() to create a new instance of our Express API, and save that into _testApi.

We’ve also created an after block (which is executed after all tests in the describe block are finished running). In here, we use our database manager to drop the "test" database, then close our connection to the local server.

It’s important to realize here that we are not starting an HTTP server at all, like we do in src/index.js. We just have an instance of an Express API, which we manually pass requests into (see the next section).

Sandwiched between our before and after blocks, we call our imported root function, which takes as inputs a function that returns our _testApi instance, and a "route" for this test (/).
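
Roughly, the whole file might look like this sketch. The import paths and the NODE_ENV plumbing are assumptions, and mocha's describe, before, and after come in globally when running through the mocha CLI:

import dotenv from 'dotenv';

import api from '../../src/api';
import { databaseSetup } from '../../src/database';
import { root } from './root';
// import { notes } from './notes'; // added in the next guide

dotenv.config();

export const integration = () => {
  describe('integration', () => {
    let _testDbManager;
    let _testApi;

    // Create the "test" database, keep the manager, and build an Express API around it.
    before(() =>
      databaseSetup(process.env.NODE_ENV).then(({ manager, database }) => {
        _testDbManager = manager;
        _testApi = api(database);
      })
    );

    // Drop the "test" database and close the connection to the local server.
    after(() =>
      _testDbManager
        .dropDb(process.env.NODE_ENV)
        .then(() => _testDbManager.close())
    );

    // No HTTP server here; tests exercise _testApi directly.
    root(() => _testApi, '/');
    // notes(() => _testApi, '/notes'); // added in the next guide
  });
};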

Let’s see what this root function does.

test/integration/root.js

Imports chai and chai-http, then configures them, so that we can use chai for testing our API requests.

The exported root function takes as inputs a function called api, and the route. It's wrapped in a describe block named ${route} route tests, which contains another describe block named GET ${route}. Since our root route only supports the GET method, we don't have any other describe blocks in here. If we allowed users to POST to /, we'd have another describe block, a sibling of our GET ${route} block, called POST ${route}.

The tests themselves are simple. First, we create a variable called response, then we have a before block which is called before any tests are run. This before block uses chai.request and the instance of our API (which we get by calling the api function that was passed in) to call GET / (via .get(route)), and then saves the response into the response variable.

Next, we define some tests using it blocks to check various things on that response object. It should have a 200 status code. It should be an object. It should have the "hello world" emojis. It should return the project name as a string. It should return the environment. It should return the version. It should return an array of routes.
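
Here's a sketch of those tests; the individual assertions are representative rather than copied from the repo:

import chai from 'chai';
import chaiHttp from 'chai-http';

chai.use(chaiHttp);
const { expect } = chai;

export const root = (api, route) => {
  describe(`${route} route tests`, () => {
    describe(`GET ${route}`, () => {
      let response;

      // Fire one request and share the response across all assertions below.
      before(() =>
        chai.request(api())
          .get(route)
          .then((res) => { response = res; })
      );

      it('should have 200 status', () => {
        expect(response).to.have.status(200);
      });

      it('should return an object', () => {
        expect(response.body).to.be.an('object');
      });

      it('should return hello world emojis', () => {
        expect(response.body['👋']).to.equal('🌎');
      });

      it('should return project name', () => {
        expect(response.body.name).to.be.a('string');
      });

      it('should return environment', () => {
        expect(response.body.environment).to.equal(process.env.NODE_ENV);
      });

      it('should return version', () => {
        expect(response.body.version).to.be.a('string');
      });

      it('should have a list of endpoints', () => {
        expect(response.body.endpoints).to.be.an('array');
      });
    });
  });
};

Running yarn test now reports something like this: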

  integration
    / route tests
      GET /
        ✓ should have 200 status
        ✓ should return an object
        ✓ should return hello world emojis
        ✓ should return project name
        ✓ should return environment
        ✓ should return version
        ✓ should have a list of endpoints

  7 passing (743ms)

Conclusion

In this post we learned how to bootstrap an Express API using a modular, reusable architecture that decouples the database, the Express API, and the HTTP server.

We created a root route handler, and made it return various configuration information about the API itself.

Finally, we built a test suite that we’ll be able to easily add onto as we create more business logic, models, routes, and controllers.

Join me in the next post where we’ll go over how to build some CRUD around the concept of “notes”. Or just go look at the code yourself, right now.

Originally published at https://blog.decentlabs.io on November 6, 2019.
