Building a PostgreSQL API in JavaScript with Express and Sequelize (Part 2): migrations and controllers

Josh Frank · Published in Nerd For Tech · Jun 22, 2021 · 11 min read

It’s about time for a follow-up to the first part of my basic JavaScript web API cookbook. It’s a huge topic, and when we left off we’d only made it through setup: creating a basic Express app, provisioning our environment, and configuring Sequelize. In this installment, we’ll get to the main event: models, associations, migrations, seed data, and the controllers and routes that tie them together. Feel free to fork and clone this repository if you’d like to follow along; there are branches for before and after the instructions below so you can check your progress.

5. Create your database

Before creating models, we have to create our database using the instructions we left for Sequelize in database.config.js. The command for that should certainly ring a bell for anyone accustomed to Rails. We might as well add a shortcut for it to our “scripts” in package.json:

...
"scripts": {
"dev": "nodemon -r dotenv/config index.js",
"db:create": "sequelize-cli db:create"
},
...

Sequelize defaults to PostgreSQL’s default port, 5432; specify a different port by adding a port key to the relevant environment in your module.exports from database.config.js.
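
For example, if your local Postgres listens on a non-default port, the development entry might look like the sketch below. This is only illustrative: the exact keys and values come from your Part 1 config, and the environment variable names shown here are assumptions.

// app/config/database.config.js — sketch only; field values are illustrative
module.exports = {
  development: {
    username: process.env.DATABASE_USERNAME,   // assumed env var names
    password: process.env.DATABASE_PASSWORD,
    database: "my_express_app_development",
    host: "127.0.0.1",
    port: 5433,                                // only needed when not using 5432
    dialect: "postgres"
  },
  // test and production entries omitted
};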

Now run yarn db:create with your fingers crossed and you should see something like this:

Loaded configuration file "app/config/database.config.js".
Using environment "development".
Database my_express_app_development created.
✨ Done in 1.11s.

Qapla’! A quick peek over at Postgres confirms we’ve successfully created a database called my_express_app_development.

6. Create models and migrations

I hate abstract software tutorials, where every model is called BelongsToModel or HasManyModel and every association is called Association and so on. Let’s create Users and Posts instead: a User has many Posts and a Post belongs to a User. To store Users in the database, we’ll need to create a model with attributes, which Sequelize will map to a table and its columns. We’ll start with just two columns for now: username and email.

In your terminal, run the Sequelize equivalent of Rails’ rails g model: yarn sequelize-cli model:generate --name User --attributes username:string,email:string. Use commas only, NOT spaces, when specifying attributes after the --attributes flag! It’s very fast, so you should immediately see a message confirming you’ve just auto-generated two files: a model file called models/user.js and a migration file prefixed with a timestamp, as in 20210622153117-create-user.js:

New model was created at .../app/models/user.js .
New migration was created at .../app/migrations/20210622153117-create-user.js .
✨ Done in 0.34s.

user.js defines our model and its attributes, and also gives us a framework for defining future associations (more on that later)…

'use strict';
const { Model } = require( 'sequelize' );

module.exports = ( sequelize, DataTypes ) => {
  class User extends Model {
    /**
     * Helper method for defining associations.
     * This method is not a part of Sequelize lifecycle.
     * The `models/index` file will call this method automatically.
     */
    static associate( models ) {
      // define association here
    }
  };
  User.init( {
    username: DataTypes.STRING,
    email: DataTypes.STRING
  }, {
    sequelize,
    modelName: 'User',
  } );
  return User;
};

…but to start creating data/rows, we’ll need to run our migration, ...-create-user.js:

'use strict';
module.exports = {
  up: async (queryInterface, Sequelize) => {
    await queryInterface.createTable('Users', {
      id: {
        allowNull: false,
        autoIncrement: true,
        primaryKey: true,
        type: Sequelize.INTEGER
      },
      username: {
        type: Sequelize.STRING
      },
      email: {
        type: Sequelize.STRING
      },
      createdAt: {
        allowNull: false,
        type: Sequelize.DATE
      },
      updatedAt: {
        allowNull: false,
        type: Sequelize.DATE
      }
    });
  },
  down: async (queryInterface, Sequelize) => {
    await queryInterface.dropTable('Users');
  }
};

A Sequelize migration module .exports two functions: up for applying the migration (here, creating the table) and down for reverting it (dropping the table). Take note:

  • Both up() and down() are async functions, because they await database operations.
  • Each column passed to createTable takes an options object; feel free to add, keep, or remove familiar, self-explanatory constraints as needed: allowNull, defaultValue, unique and others (see the sketch after this list).
  • Sequelize also, of course, adds an id primary key and createdAt/updatedAt timestamps, just like ActiveRecord.
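
As a concrete (and purely illustrative) example, here’s how the username and email columns could be tightened with those constraints; the generated migration doesn’t include them by default:

// ...-create-user.js (excerpt) — illustrative constraints, not generated by default
username: {
  type: Sequelize.STRING,
  allowNull: false              // reject rows with no username
},
email: {
  type: Sequelize.STRING,
  allowNull: false,
  unique: true                  // one account per email address
},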

Before we migrate, let’s reinforce the habit of writing scripts for important tasks. It’s a good idea to write one for migrating, another for generating migrations if we ever have to update our table, and one for resetting the database so we don’t get a bunch of duplicate primary keys:

...
"scripts": {
"dev": "nodemon -r dotenv/config index.js",
"db:create": "sequelize-cli db:create",
"db:migrate": "sequelize-cli db:migrate",
"db:g:migration": "sequelize-cli migration:generate --name",
"db:reset": "sequelize-cli db:drop && sequelize-cli db:create && sequelize-cli db:migrate"
}
...

Run yarn db:migrate and Sequelize will migrate Users over to your PostgreSQL database. It’ll spit out a confirmation message like the one below:

Loaded configuration file "app/config/database.config.js".
Using environment "development".
== 20210622153117-create-user: migrating =======
== 20210622153117-create-user: migrated (0.021s)
✨ Done in 1.08s.

You can also see for yourself in your Postgres client. The psql CLI works fine for this; clients like SQLPro or Postico offer a spreadsheet-like GUI, which is a little easier than typing SELECT … FROM … repeatedly.

git add, commit and push to save your progress.

7. Create relationships and test them with seed data

Now let’s create a Post model and define the relationship between Users and Posts. There are four Sequelize relationships: hasOne, belongsTo, hasMany and belongsToMany; to keep this tutorial simple we’re creating a hasMany/belongsTo relationship. Consult the Sequelize docs for more details on relationships.

Our posts will have a title and some text content, so run yarn sequelize-cli model:create --name Post --attributes title:string,content:string; as before, this will create a model file and a corresponding migration file. But this time, before we migrate, we’re going to add a userId foreign-key column to the new migration, laying the groundwork for the belongsTo() association on Posts:

// ...-create-post.js
'use strict';
module.exports = {
  up: async (queryInterface, Sequelize) => {
    await queryInterface.createTable('Posts', {
      id: {...},
      title: {...},
      content: {...},
      createdAt: {...},
      updatedAt: {...},
      userId: {
        type: Sequelize.INTEGER,
        references: {
          model: "Users",
          key: "id",
          as: "userId"
        }
      }
    });
  },
  ...
};

In this migration file, we add a column for the userId of the User a given Post belongsTo(). The column needs a type and a references option, which tells Sequelize the referenced table ("Users"), its key and an alias.

In user.js, let’s build the hasMany() part of this relationship by filling in that associate() method. Note the alias, 'posts': it has to match the alias we’ll use later when including Posts in queries:

module.exports = ( sequelize, DataTypes ) => {
  class User extends Model {
    static associate( models ) {
      User.hasMany( models.Post, {
        foreignKey: 'userId',
        as: 'posts',
      } );
    }
  };
  ...
};

We’ll do likewise in the associate() method in post.js, creating the belongsTo() part of the relationship. Note we’re using Sequelize’s onDelete option to enable cascading deletion: a Post disappears from the database whenever the User who posted it is deleted:

module.exports = ( sequelize, DataTypes ) => {
  class Post extends Model {
    static associate( models ) {
      Post.belongsTo( models.User, {
        foreignKey: 'userId',
        onDelete: 'CASCADE',
      } );
    }
  };
  ...
};
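
One caveat: because we created the userId column ourselves in a migration, the model-level onDelete above may not end up as an actual foreign-key constraint in Postgres. If you want the database itself to cascade deletes, you can also declare it in the create-post migration. A sketch of that column, building on the one shown earlier:

// ...-create-post.js (userId column) — optional database-level cascade
userId: {
  type: Sequelize.INTEGER,
  onDelete: "CASCADE",          // Postgres drops a Post when its User is deleted
  references: {
    model: "Users",
    key: "id",
    as: "userId"
  }
}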

Now let’s create some seed data; we should test our relationships before we start creating controllers and routes. Naturally we’ll also swing by package.json to define scripts for generating a seed file and for seeding/re-seeding data:

...
"scripts": {
"dev": "nodemon -r dotenv/config index.js",
"db:create": "sequelize-cli db:create",
"db:migrate": "sequelize-cli db:migrate",
"db:g:migration": "sequelize-cli migration:generate --name",
"db:reset": "sequelize-cli db:drop && sequelize-cli db:create && sequelize-cli db:migrate",
"db:g:seed": "sequelize-cli seed:generate --name",
"db:seeds": "sequelize-cli db:seed:all",
"db:reseed": "sequelize-cli db:drop && sequelize-cli db:create && sequelize-cli db:migrate && sequelize-cli db:seed:all"
}
...

Now let’s run db:g:seed once for each model, each time with a name, as in: yarn db:g:seed userSeeds && yarn db:g:seed postSeeds. In app/seeders, you’ll see seed files generated with the names you provide and timestamps:

New seed was created at .../app/seeders/20210622174800-userSeeds.js .
✨ Done in 0.77s.
...
New seed was created at .../app/seeders/20210622174800-postSeeds.js .
✨ Done in 1.03s.

Just like migrations, a Sequelize seeder module exports two async functions: up, which inserts the seed data, and down, which removes it if/when necessary. Let’s fill these in using some Sequelize functions, which correspond to SQL commands:

// ...-userSeeds.js
'use strict';
module.exports = {
  up: async (queryInterface, Sequelize) => {
    return await queryInterface.bulkInsert( "Users", [
      { username: "Josh", email: "josh@hirejoshfrank.com", createdAt: new Date(), updatedAt: new Date() }
    ] );
  },
  down: async (queryInterface, Sequelize) => {
    return await queryInterface.bulkDelete( "Users", null, {} );
  }
};

// ...-postSeeds.js
'use strict';
module.exports = {
  up: async (queryInterface, Sequelize) => {
    return await queryInterface.bulkInsert( "Posts", [
      { title: "Josh's first post", content: "I see a ship in the harbor", userId: 1, createdAt: new Date(), updatedAt: new Date() },
      { title: "Josh's second post", content: "I can and shall obey", userId: 1, createdAt: new Date(), updatedAt: new Date() }
    ] );
  },
  down: async (queryInterface, Sequelize) => {
    return await queryInterface.bulkDelete( "Posts", null, {} );
  }
};

Sequelize gives us tons of options for querying! Here we’re bulkInserting row objects, each of which includes the entry’s createdAt and updatedAt timestamps; Sequelize kindly lets us use an ordinary new Date() for that. Note also that each Post row provides the userId of the User it belongs to.

If all has gone according to our foul plan, when you run yarn db:seeds you should see the optimistic, Rails-like message below:

Loaded configuration file "app/config/database.config.js".
Using environment "development".
== 20210622153117-create-user: migrating =======
== 20210622153117-create-user: migrated (0.016s)
== 20210622172615-create-post: migrating =======
== 20210622172615-create-post: migrated (0.011s)

If Sequelize runs without hitting its head on the allowNull: falses or any of the other constraints in our migrations, it means our relationships are working successfully! Feel free to confirm in your Postgres client with SELECT * FROM "Users" and SELECT * FROM "Posts" that your database looks as it should.
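
If you’d rather verify from Node than from SQL, a quick throwaway script works too. This is only a sketch: it assumes the models/index.js that sequelize-cli generated, the 'posts' alias we defined above, and that you run it from the project root.

// check-seeds.js — throwaway script; run with: node -r dotenv/config check-seeds.js
const { User, sequelize } = require( "./app/models" );

( async () => {
  // Pull the seeded user along with its associated posts
  const user = await User.findOne( { include: "posts" } );
  console.log( JSON.stringify( user, null, 2 ) );
  await sequelize.close();
} )();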

8. Create controllers and routes

Let’s tie this all together into a real, working API, the reason for all this fuss, so we can see those models and relationships at work over HTTP/JSON and not just in an SQL console. We’ll do that by creating controllers (and a folder in /app for them) and defining routes that point to those controllers’ actions. We’ll start with just show routes/actions for now.

Run mkdir app/controllers from your project’s root directory to create a folder for controllers. Then touch app/controllers/postsController.js and fill it with the code below:

const Post = require('../models').Post;

exports.show = ( request, response ) => {
  return Post.findByPk( request.params.postId, {} )
    .then( post => {
      if ( !post ) { response.status( 404 ).send( { error: "Post not found" } ); }
      else { response.status( 200 ).send( post ); }
    } )
    .catch( error => response.status( 400 ).send( error ) );
};

Controller exports are handler functions called whenever a user visits a route, so they take Express request and response objects as arguments. Model.findByPk() does exactly what it says on the can: it accepts a primary key (and an options object) and returns the corresponding database row. It’s all chained up with .then().catch() so we can send a 200 response with our data if the primary key in the request is valid, or a 400/404 response if it isn’t.
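
If you prefer async/await to promise chains, the same show action can be written equivalently; this is just an alternative sketch of the controller above, not a change to it:

// postsController.js — the same show action, rewritten with async/await
exports.show = async ( request, response ) => {
  try {
    const post = await Post.findByPk( request.params.postId, {} );
    if ( !post ) { return response.status( 404 ).send( { error: "Post not found" } ); }
    return response.status( 200 ).send( post );
  } catch ( error ) {
    return response.status( 400 ).send( error );
  }
};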

Next, touch app/controllers/index.js and enter just the two lines of code below; they let us require() controller actions quickly and conveniently when we start defining routes:

const posts = require( "./postsController.js" );

module.exports = { posts };

Now let’s run mkdir app/routes && touch app/routes/index.js from your project’s root directory to create a folder for routes and a routes file. As your project grows you may decide to separate routes files by model, i.e., postRoutes, userRoutes and so on. For now, though, we’ll put all our routes in one file, app/routes/index.js, containing the code below:

const postsController = require( "../controllers" ).posts;

module.exports = app => {
  // Post routes //
  app.get( "/posts/:postId", postsController.show );
};

Finally, let’s require the routes/index.js file we just created in our main index.js file, all the way down at the root of our project:

require( "dotenv" ).config();
const express = require( "express" );
const cors = require( "cors" );
const app = express();
app.use( express.json() );
app.use( express.urlencoded( { extended: true } ) );
app.use( cors( { origin: `http://localhost:${ process.env.PORT }` } ) );
require( "./app/routes" )( app );app.get( "/", ( request, response ) => response.send( "Test" ) );app.listen( process.env.PORT, () => console.log( `Listening: port ${ process.env.PORT }` ) );

Now, at long last, when we fire up our app with yarn dev, we should see the familiar Listening: port 3000 message telling us that our app is running and listening without errors. That means if you open a new terminal and run curl localhost:3000/posts/1, you’ll see the first Post from our seeds:

{"id":1,"title":"Josh's first post","content":"I see a ship in the harbor","createdAt":"2021-06-22T19:39:30.140Z","updatedAt":"2021-06-22T19:39:30.140Z","userId":1}

And in the terminal window where you’re running your app with yarn dev, a message will flash confirming that Sequelize ran an SQL command:

Executing (default): SELECT "id", "title", "content", "createdAt", "updatedAt", "userId" FROM "Posts" AS "Post" WHERE "Post"."id" = '1';

Let’s touch app/controllers/usersController.js and add a users controller, this time serializing Posts each time we show a User. We’ll do that by passing an options object with an include to User.findByPk() instead of an empty object. Then we’ll incorporate actions from that usersController in some new routes, with additions to controllers/index.js and routes/index.js:

// usersController.js
const User = require('../models').User;
const Post = require('../models').Post;

const userOptions = {
  include: [
    {
      model: Post,
      as: "posts"
    }
  ]
};

exports.show = ( request, response ) => {
  return User.findByPk( request.params.userId, userOptions )
    .then( user => {
      if ( !user ) { response.status( 404 ).send( { error: "User not found" } ); }
      else { response.status( 200 ).send( user ); }
    } )
    .catch( error => response.status( 400 ).send( error ) );
};

// controllers/index.js
const users = require( "./usersController.js" );
const posts = require( "./postsController.js" );

module.exports = { posts, users };

// routes/index.js
const postsController = require( "../controllers" ).posts;
const usersController = require( "../controllers" ).users;

module.exports = app => {
  // Post routes //
  app.get( "/posts/:postId", postsController.show );
  // User routes //
  app.get( "/users/:userId", usersController.show );
};

Now, a curl localhost:3000/users/1 terminal command will give you the User with an id of 1, along with that user’s two posts serialized within the response:

{"id":1,"username":"Josh","email":"josh@hirejoshfrank.com","createdAt":"2021-06-22T19:39:30.132Z","updatedAt":"2021-06-22T19:39:30.132Z","posts":[{"id":1,"title":"Josh's first post","content":"I see a ship in the harbor","createdAt":"2021-06-22T19:39:30.140Z","updatedAt":"2021-06-22T19:39:30.140Z","userId":1},{"id":2,"title":"Josh's second post","content":"I can and shall obey","createdAt":"2021-06-22T19:39:30.140Z","updatedAt":"2021-06-22T19:39:30.140Z","userId":1}]}%

Coming up next

We now have two simple models and a working hasMany/belongsTo relationship between them, plus routes and controllers that handle them. That makes it a basic, working web API, and that’s no small feat! But there’s still a lot more to do. I’d like to add at least two more posts to this series: a walkthrough of adding auth with bcrypt and JSON web tokens, and another on deploying your app. There’s so much more about this subject worth highlighting, so stay tuned!
