Tutorial: Setting up Node.js with a database
I could have named this tutorial “setting up Node.js with MySQL”, but I wanted to make the name a little more encompassing, because I think MySQL is a great choice for most Node.js applications.
MySQL has been around for 21 years, and so in early Node.js apps it was often overlooked in favour of new and shiny NoSQL choices such as MongoDB. However, with the benefit of hindsight, MongoDB turned out to be more hype than substance, and it is well known for a few major security and performance issues. If you don’t believe me, search Hacker News for “mongodb”.
If you would like a DB, even for NoSQL-style use cases, MySQL is a great choice. It has great support, decent performance, and there are some great npm modules for interfacing with it. Furthermore, there are MySQL adapters for pretty much everything.
In this tutorial, I will show how to set up MySQL with Node using Knex in order to create a crude app for creating and logging in users.
We will
- Install Node.js
- Install MySQL
- Create an HTTP API for writing to the database
- Create some HTML and JS to POST to the API
- Use Knex migrations to create a user database schema (a similar process to Rails migrations)
- Use a migration to set up password hashing
- Create a login route to check the hashing works
In the next tutorial we will add more functionality to the app and explore some more complex querying. Eventually we will deploy the app somewhere.
Installing Node.js
If you have completed the previous tutorials, you probably have Node installed so you can skip this section.
We will be using NVM to manage node so that we can easily install and switch between different versions.
Before installing NVM make sure you have a .bashrc
file in your home directory (even if the file is empty).
touch ~/.bashrc
To install NVM just run this command (same as in the NVM installation instructions).
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.32.1/install.sh | bash
This command pulls down a script from a remote URL and runs it.
You now have NVM! But if you run nvm ls
you will notice it isn’t found. This is because NVM adds some code to your ~/.bashrc
. This is a special file that is run every time you log in to your instance, so to get NVM running you could log out and log in again. However, you can just run the file manually using the source
command.
source ~/.bashrc
Now running nvm ls
works! But there aren’t any node versions installed! To get the latest version, just nvm install <latest version number>
.
nvm install 7
You have node installed! Check that it’s there by looking at the version
node --version
Installing MySQL
If you are using a Mac, you should install MySQL using Homebrew. If you are using another platform, just Google something like “installing MySQL on my platform” for instructions.
Install Homebrew on a Mac by running this in the terminal
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Once installed, update Homebrew and install MySQL
brew update
brew install mysql
Run the MySQL daemon
brew services start mysql
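If you want to confirm that the daemon is actually running, Homebrew can list the services it manages (optional, but handy when debugging)
brew services list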
Set a rudimentary password for MySQL locally; there is no need to worry about security because it will only be accessible from your machine (I used “password”).
mysqladmin -u root password 'password'
Once installed and running, MySQL is accessible to any app running locally on your machine.
In this tutorial, we will use Sequel Pro to manage our database. Download and install Sequel Pro and open it. The database to connect to is on localhost, so set “host” to “127.0.0.1”. The username is “root” and the password is the same as chosen in the previous step; for me that’s “password”.
Connect and you will be presented with an empty MySQL database. If the connection does not work, make sure you set your password correctly in the previous step and that the mysql
service is still running.
After successfully connecting, click Database > Add Database and name it; I chose the name “tutorial_node_database” (the MySQL naming convention is snake_case).
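If you prefer the terminal to a GUI, you can also verify the new database using the mysql client that ships with the server (an optional alternative to Sequel Pro, using the same root password)
mysql -u root -p
# then, at the mysql> prompt
SHOW DATABASES;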
Great, MySQL is ready to go!
Project setup
I have added this whole project to a public GitHub repo and I have broken down each step as a commit, so it should be easy to follow. This is just an aid; we will be creating the project from scratch, so there is no need to clone the code.
Let’s start with a fresh directory; I have named the project tutorial-node-database
. Run the following to create the file structure.
mkdir tutorial-node-database
cd tutorial-node-database
touch knexfile.js
touch index.js
touch store.js
mkdir public
touch public/index.html
touch public/app.js
# choose all defaults when prompted
npm init
We are using Knex with MySQL and creating a small API using express, so you need to install knex
and express
. We’ll also use the body-parser
module.
npm i knex mysql express body-parser --save
knex
should also be installed globally; this gives you the knex CLI that we use for migrations later.
npm i knex -g
When creating a new project I like to add a minimal .gitignore file so that I can easily check everything in without including the node_modules
folder or .DS_Store
files that macOS likes to create.
printf "node_modules\n.DS_Store" > .gitignore
Don’t worry if you don’t understand the bash code above. All it does is put node_modules
followed by a newline \n
followed by .DS_Store
into a file called .gitignore
. It’s perfectly reasonable to use an editor to do this sort of thing if that’s simpler for you.
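The resulting .gitignore file contains just these two lines
node_modules
.DS_Store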
HTTP API
We’ll start with a simple express app, serving the public directory, with a /createUser
endpoint to write some data to the database.
const express = require('express')
const bodyParser = require('body-parser')
const store = require('./store')
const app = express()
app.use(express.static('public'))
app.use(bodyParser.json())
app.post('/createUser', (req, res) => {
store
.createUser({
username: req.body.username,
password: req.body.password
})
.then(() => res.sendStatus(200))
})
app.listen(7555, () => {
console.log('Server running on http://localhost:7555')
})
Save this to index.js
. When we run this file, the server will listen on http://localhost:7555 for POST requests to /createUser
and pass those requests to the store file. The server will then respond with a 200 status code (the standard HTTP code for a successful request).
For the time being, we will mock the store in order to check that our API works.
module.exports = {
createUser ({ username, password }) {
console.log(`Add user ${username} with password ${password}`)
return Promise.resolve()
}
}
Save this to store.js
. Note that the createUser function returns a promise so that you know when it is done. If the use of promises here is a little confusing, it might be worth checking out some Promise tutorials; they are an essential concept in modern JavaScript.
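As a minimal illustration of the pattern (not part of the app), a promise lets the caller chain work that should only happen after the asynchronous operation finishes
Promise.resolve('saved')
  .then(result => console.log(result)) // runs once the promise resolves, logs "saved"
  .catch(err => console.error(err)) // runs instead if the operation failed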
Now, let’s write a simple index.html
, including app.js
. We add app.js
at the bottom of the HTML here rather than the head
so that we can easily query elements above the script, without needing to use something like jQuery.ready
or the DOMContentLoaded
event to wait for them to load.
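For comparison, if the script were loaded from the head, it would have to wait for the DOM before querying anything, roughly like this (a hypothetical alternative, not used in this tutorial)
document.addEventListener('DOMContentLoaded', () => {
  // only now is it safe to query elements defined in the body
  const CreateUser = document.querySelector('.CreateUser')
})
Putting the script at the end of the body avoids that extra wrapper.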
<!DOCTYPE html>
<html>
<head>
<title>Node database tutorial</title>
</head>
<body>
<form class="CreateUser">
<h1>Create a new user</h1>
<input type="text" class="username" placeholder="username">
<input type="password" class="password" placeholder="password">
<input type="submit" value="Create user">
</form>
<script src="/app.js"></script>
</body>
</html>
The app.js
script should look like the below and will be used to hit the server’s /createUser
endpoint.
const CreateUser = document.querySelector('.CreateUser')
CreateUser.addEventListener('submit', (e) => {
e.preventDefault()
const username = CreateUser.querySelector('.username').value
const password = CreateUser.querySelector('.password').value
post('/createUser', { username, password })
})
function post (path, data) {
return window.fetch(path, {
method: 'POST',
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
},
body: JSON.stringify(data)
})
}
Save this to public/app.js
and start your server.
node .
Your server should start on http://localhost:7555/. Go to this link and you will be able to see createUser requests in the network panel when you click “Create User”.
Check your terminal for the console.log
from store.js
.
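If you would rather test the endpoint without a browser, something like curl works too (assuming the server is running on port 7555)
curl -X POST http://localhost:7555/createUser -H 'Content-Type: application/json' -d '{"username": "alice", "password": "password"}'
You should see the same console.log appear in the server terminal.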
This is a good point to commit your code if you are using git.
# An empty first commit is nice
git commit --allow-empty -m "( ͡° ͜ʖ ͡°)"
git add .
git commit -m "Setup server with mock store"
Setting up knex
Let’s set up the knexfile.js
to connect to the database.
module.exports = {
client: 'mysql',
connection: {
user: 'root',
password: 'password',
database: 'tutorial_node_database'
}
}
I have used the default credentials to connect locally to MySQL; you may need to change this if yours are different.
For this tutorial we will need a user table with a username and password. These need to be defined in a schema. To create or modify schemas, we will use Knex migrations.
Create a new migration and name it something meaningful
knex migrate:make create_user_table
This will create a file like migrations/20170504155313_create_user_table.js
. The name of the file is the current datetime followed by the name you specified. The datetime keeps the migration files in order in the filesystem, making them easy for you to read in chronological order.
Every time we need to add, remove or modify a table or column in order to change how our data will be represented, we’ll write a new migration file. This allows us to
- Make schema changes programmatically that can be checked into git and reviewed (far better than updating schema using MySQL directly)
- Create a history of schema updates
- Use JS rather than SQL to transform values (with the ability to use NPM modules and functions internal to the project)
- Keep all the project collaborators’ database schemas up to date; they can run the latest migrations to get the latest changes
- Update staging and production databases when deploying code changes
- Roll back schema changes (a single command, shown below)
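For example, undoing the most recent batch of migrations is a single knex command; it runs each migration’s down method (explained next)
knex migrate:rollback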
The created file contains code which exports an up method and a down method. We will create our table schema in the up method (used when we migrate) and drop it in the down method (used when we roll back).
exports.up = function (knex) {
return knex.schema.createTable('user', function (t) {
t.increments('id').primary()
t.string('username').notNullable()
t.string('password').notNullable()
t.timestamps(false, true)
})
}
exports.down = function (knex) {
return knex.schema.dropTableIfExists('user')
}
The API for Knex is pretty straightforward and the docs are very simple to read. It is important to realize that Knex is actually constructing SQL from the calls you make. Although you are using a JavaScript API, a lot of queries are built with the underlying SQL in mind which can be very important when it comes to query performance and optimization.
Note that we add an id field here that will automatically increment, as well as created_at and updated_at fields using the t.timestamps method.
To run this migration
knex migrate:latest
Check Sequel Pro to see the new table with the schema specified.
Now that we have a table, we can edit store.js so that it writes data to the table.
Using Knex.js to write data
All we need our store to do is load in knex using the knexfile.js
config and then write data to the user
table whenever a createUser
request is made.
const knex = require('knex')(require('./knexfile'))
module.exports = {
createUser ({ username, password }) {
console.log(`Add user ${username} with password ${password}`)
return knex('user').insert({
username,
password
})
}
}
Restart your server and browse to http://localhost:7555/ to create a user. If you go to Sequel Pro you should see the new user show up in the database.
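If you would like to confirm the insert from code rather than Sequel Pro, a quick throwaway Knex query could look like this (a sketch, not part of the app)
const knex = require('knex')(require('./knexfile'))
knex('user').select()
  .then(users => {
    console.log(users) // array of row objects from the user table
    return knex.destroy() // close the connection pool so the script can exit
  })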
git add .
git commit -m "Setup knex"
Storing the password safely
Let’s make sure the password is encrypted in the DB before going any further. This is where the flexibility of database migrations really comes in handy.
To safely store passwords, you want to create a random salt for each new user and use crypto to encrypt the password with that salt (this is standard practice and safeguards against leaking your users’ passwords if your database is compromised).
Before the migration, let’s add the encryption logic into the store.js file
const crypto = require('crypto')
const knex = require('knex')(require('./knexfile'))
module.exports = {
saltHashPassword,
createUser ({ username, password }) {
console.log(`Add user ${username}`)
const { salt, hash } = saltHashPassword(password)
return knex('user').insert({
salt,
encrypted_password: hash,
username
})
}
}
function saltHashPassword (password) {
const salt = randomString()
const hash = crypto
.createHmac('sha512', salt)
.update(password)
return {
salt,
hash: hash.digest('hex')
}
}
function randomString () {
return crypto.randomBytes(4).toString('hex')
}
Note that crypto
is a native node module and need not be installed using npm
.
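To get a feel for what saltHashPassword returns, you could try it from a Node REPL in the project directory (the values will differ on every call because the salt is random)
const { saltHashPassword } = require('./store')
const { salt, hash } = saltHashPassword('password')
console.log(salt.length) // 8, since 4 random bytes are hex encoded
console.log(hash.length) // 128, a sha512 digest as hex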
Let’s make a migration file
knex migrate:make encrypt_user_password
For the migration, we are going to add encrypted_password
and salt
columns, then we are going to borrow the saltHashPassword
method that we exported from store.js
in order to migrate users already in the database. Finally, we will remove the password
column.
const { saltHashPassword } = require('../store')
exports.up = function up (knex) {
return knex.schema
.table('user', t => {
t.string('salt').notNullable()
t.string('encrypted_password').notNullable()
})
.then(() => knex('user'))
.then(users => Promise.all(users.map(convertPassword)))
.then(() => {
return knex.schema.table('user', t => {
t.dropColumn('password')
})
})
function convertPassword (user) {
const { salt, hash } = saltHashPassword(user.password)
return knex('user')
.where({ id: user.id })
.update({
salt,
encrypted_password: hash
})
}
}
exports.down = function down (knex) {
return knex.schema.table('user', t => {
t.dropColumn('salt')
t.dropColumn('encrypted_password')
t.string('password').notNullable()
})
}
We can’t get the original password back in the down function because it is encrypted, so we just do our best by putting the schema back as it was.
There are a ton of promises being used here, which is common when writing migrations. Using async/await, we can greatly simplify this code.
const { saltHashPassword } = require('../store')
exports.up = async function up (knex) {
await knex.schema.table('user', t => {
t.string('salt').notNullable()
t.string('encrypted_password').notNullable()
})
const users = await knex('user')
await Promise.all(users.map(convertPassword))
await knex.schema.table('user', t => {
t.dropColumn('password')
})
function convertPassword (user) {
const { salt, hash } = saltHashPassword(user.password)
return knex('user')
.where({ id: user.id })
.update({
salt,
encrypted_password: hash
})
}
}
exports.down = function down (knex) {
return knex.schema.table('user', t => {
t.dropColumn('salt')
t.dropColumn('encrypted_password')
t.string('password').notNullable()
})
}
If you want to use async/await, ensure you have the latest Node version (it only recently became supported)
nvm i 7
When you install a new node version, be aware that global packages will need to be reinstalled.
npm i knex -g
Let’s run the migration and check that it works.
knex migrate:latest
If everything runs smoothly, your existing passwords should now be encrypted.
Restarting your server and creating new users should encrypt the passwords. Go to http://localhost:7555/ and create some more users to check it works.
git add .
git commit -m "Encrypt passwords"
Add a login route
At this point, let’s install nodemon
to restart the server automatically when we make changes.
npm i -g nodemon
Now run the server using nodemon, rather than node
nodemon .
Next time we make changes, the server will restart for us. Usually you would leave this running and then run bash operations in a new terminal tab.
To authenticate a user we create a /login
route which calls the store’s authenticate
method with the username and password.
const express = require('express')
const bodyParser = require('body-parser')
const store = require('./store')
const app = express()
app.use(express.static('public'))
app.use(bodyParser.json())
app.post('/createUser', (req, res) => {
store
.createUser({
username: req.body.username,
password: req.body.password
})
.then(() => res.sendStatus(200))
})
app.post('/login', (req, res) => {
store
.authenticate({
username: req.body.username,
password: req.body.password
})
.then(({ success }) => {
if (success) res.sendStatus(200)
else res.sendStatus(401)
})
})
app.listen(7555, () => {
console.log('Server running on http://localhost:7555')
})
If the authentication is successful, we respond with 200, else we respond with 401 (unauthorized).
In the store, we need to get the user’s salt, use it to encrypt the password and then check it against the encrypted_password
in the database.
const crypto = require('crypto')
const knex = require('knex')(require('./knexfile'))
module.exports = {
createUser ({ username, password }) {
console.log(`Add user ${username}`)
const { salt, hash } = saltHashPassword({ password })
return knex('user').insert({
salt,
encrypted_password: hash,
username
})
},
authenticate ({ username, password }) {
console.log(`Authenticating user ${username}`)
return knex('user').where({ username })
.then(([user]) => {
if (!user) return { success: false }
const { hash } = saltHashPassword({
password,
salt: user.salt
})
return { success: hash === user.encrypted_password }
})
}
}
function saltHashPassword ({
password,
salt = randomString()
}) {
const hash = crypto
.createHmac('sha512', salt)
.update(password)
return {
salt,
hash: hash.digest('hex')
}
}
function randomString () {
return crypto.randomBytes(4).toString('hex')
}
We have also modified the saltHashPassword
function to accept a salt, only generating one if none is supplied.
Notice that we changed the function declaration here so that the first argument is now an object. We could have easily had the salt passed in as a second argument; however, it is good practice to pass an options object because it makes it more obvious what is happening in the function call.
Consider for instance if saltHashPassword
ends up at the bottom of the file, or is abstracted into another file. Which call is easier to understand?
// using two arguments
saltHashPassword('some password', '239ab09')
// using an options argument
saltHashPassword({
password: 'some password',
salt: '239ab09'
})
Having the argument names in the call is nice.
To call /login
, let’s add a login form to the index.html
file.
<!DOCTYPE html>
<html>
<head>
<title>Node database tutorial</title>
</head>
<body>
<form class="Login">
<h1>Login</h1>
<input type="text" class="username" placeholder="username">
<input type="password" class="password" placeholder="password">
<input type="submit" value="Login">
</form>
<form class="CreateUser">
<h1>Create account</h1>
<input type="text" class="username" placeholder="username">
<input type="password" class="password" placeholder="password">
<input type="submit" value="Create">
</form>
<script src="/app.js"></script>
</body>
</html>
and let’s add logic to the app.js
to call the login endpoint
const CreateUser = document.querySelector('.CreateUser')
CreateUser.addEventListener('submit', (e) => {
e.preventDefault()
const username = CreateUser.querySelector('.username').value
const password = CreateUser.querySelector('.password').value
post('/createUser', { username, password })
})
const Login = document.querySelector('.Login')
Login.addEventListener('submit', (e) => {
e.preventDefault()
const username = Login.querySelector('.username').value
const password = Login.querySelector('.password').value
post('/login', { username, password })
.then(({ status }) => {
if (status === 200) alert('login success')
else alert('login failed')
})
})
function post (path, data) {
return window.fetch(path, {
method: 'POST',
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
},
body: JSON.stringify(data)
})
}
By checking the status, we are now able to tell whether the user is authenticated or not.
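You can also exercise the new route from the command line if you prefer (assuming a user “alice” with password “password” was created earlier); the -i flag prints the response headers so the 200 or 401 status is visible
curl -i -X POST http://localhost:7555/login -H 'Content-Type: application/json' -d '{"username": "alice", "password": "password"}'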
Nice!
git add .
git commit -m "Login route"
Inspecting the SQL
Before wrapping up, let’s just take a look at some of the SQL queries being run under the covers here. In order to log a Knex query to the console, just add a .debug()
to the query chain.
For example, in store.js
you could add it to createUser
knex('user').insert({
salt,
encrypted_password: hash,
username
}).debug()
Now, when this method is called, you will see the SQL output to the console.
By looking at the sql key and replacing the ?
values with the values in the bindings array you can reconstruct the underlying SQL query that Knex runs to add a user.
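The debug output is an object containing, among other things, an sql string with ? placeholders and a bindings array of values. For the insert above it should look roughly like this (abbreviated, and the exact shape can vary between Knex versions)
{ sql: 'insert into `user` (`encrypted_password`, `salt`, `username`) values (?, ?, ?)',
  bindings: [ '3c9c…', '9f86…', 'alice' ] }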
Conclusion
In this tutorial we created a crude app for creating and logging in users using Node + Knex + MySQL. In the next tutorial we will add more functionality to the app and explore some more complex querying.
Eventually I hope to show how we can deploy the app to EC2 with a remote database.
If you liked this tutorial, then you might want to take a look at a couple of my earlier tutorials.