Creating and Deploying a Full Stack Web Application

Using Docker, PostgreSQL, Express, React, and NGINX

Nico Zessoules
The Startup
15 min read · Jan 17, 2021


Getting Started

Welcome! This is a step-by-step guide to creating and deploying a full stack web application using Docker, PostgreSQL, Express, React, and NGINX. To demonstrate the underlying concepts we’ll build a project named Present, whose objective is a browser user interface that displays the current database time at the press of a button. To get a better sense of what we’ll build, you can try the online demo. You can also refer to the GitHub repo if you get stuck, and please share any questions or thoughts in the comments!

This article assumes some familiarity with the core underlying technologies (Docker, PostgreSQL, Express, React, and NGINX). However, each instruction is explained step by step, so anyone is encouraged to follow along and learn. The project is best used as an educational tool and a scaffold to build your own ideas on; in that respect it is minimal and may require additional configuration for your specific needs.

Requirements

The requirements for the project are as follows:

  1. Run the project entirely within a docker-compose orchestration. For simplicity this will encapsulate all initialization, so only Docker is required to build/run the entire project with one command.
  2. Create a PostgreSQL container with specific admin credentials.
  3. Create an Express container that can query the PostgreSQL container through Docker internally, and create an HTTP GET endpoint named /now to encapsulate this functionality.
  4. Create a React container with a single button and paragraph. Pressing the button will send a request to the /now API endpoint, returning the current database time. Then the result will be displayed in the paragraph.
  5. Enable hot-reloading in the development environment for the api and ui, so we can see the code changes live without restarting the containers.
  6. Create a separate production environment and deploy it to a cloud server so the project is publicly accessible online.

For reference, here’s a screenshot of the final UI:

Structure

This project is organized into a mono-repo directory pattern, so the entire project contains multiple distinct apps (named db, api, and ui) within a single repository. To keep the two levels distinct, we’ll use the term project strictly for the repository as a whole and app for a single one of the three apps within it. Below is a crude example of the directory structure:
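
Something along these lines, where individual file names beyond those discussed in this guide are illustrative:

    present/
    ├── .env
    ├── docker-compose.yml
    ├── db/                  # PostgreSQL data directory (gitignored)
    ├── api/
    │   ├── Dockerfile
    │   ├── package.json
    │   └── src/
    │       └── index.js
    └── ui/
        ├── Dockerfile
        ├── package.json
        └── src/
            └── App.js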

Below is a diagram of the container structure with exposed port numbers:

Solid lines show container connections, and dashed lines show volumes

Prerequisites

Feel free to skip any step if the software is already installed on your computer.

Install Docker

First install Docker to power the project. This will also install docker-compose, which we’ll use for orchestration. Please follow the installation instructions.

Install psql

Next, install psql in order to test that the database is running correctly. There are many ways to install this tool, but I recommend following these installation instructions for your specific operating system.

Install Node

Lastly, install Node in order to use the npm and npx command-line tools. Since we’re running the project entirely within Docker, we won’t run the apps on the locally installed Node engine at all. For this reason it’s important to match the local version of Node (14.14) to the version used in the containers. There are two installation methods:

  1. The recommended method is using the nvm tool, a command-line tool built to easily switch between local Node versions. Please follow these installation instructions; a quick usage sketch follows this list.
  2. You may also manually install a specific version. However, this is not recommended because you would need to reinstall to change versions.
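
With nvm installed, matching the container version locally looks roughly like this:

    nvm install 14.14
    nvm use 14.14
    node --version   # should print v14.14.x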

Database Setup

I like working up from the bottom of the stack, so let’s start with the database. The goal of this section is to run a PostgreSQL Docker Container, and query the db for the current time. Also, we’ll initialize the project configuration by creating the .env and docker-compose.yml files.

Environment Variables

First let’s setup the env vars for the db. Create a file named .env in the present directory with the following content (feel free to replace the values I use with your own more secure credentials):

Docker Compose

Next, let’s configure the container within the docker-compose orchestration. As the PostgreSQL Docker Hub documentation explains, the image configures the admin credentials with the POSTGRES_USER and POSTGRES_PASSWORD env vars, so we will set these to the env vars we set above. As we continue there will be other env vars with specific names that we will map similarly, although not all share this property. Create a file named docker-compose.yml with the following content:

Here’s a quick breakdown of what this does:

  • image: postgres:13.1 Pins the PostgreSQL image to a specific version.
  • environment: — POSTGRES_USER — POSTGRES_PASSWORD ... Passes and renames specific env vars into the container, so it isn’t polluted with unneeded extras.
  • ports: — $DB_PORT:$DB_PORT Maps the container port to your local machine, using the env var we set earlier.
  • volumes: — ./db:/var/lib/postgresql/data Bind-mounts the local ./db directory to the container’s internal PostgreSQL data directory, so the data is persisted between container restarts. Without this step the database would reset to fresh-install defaults with each restart. Also, if you are following along using git, make sure to add the data directory to your .gitignore file.
  • restart: always Automatically restarts the container whenever it stops or crashes, as specified by the PostgreSQL Docker Hub documentation.
  • command: -p $DB_PORT Specifies the running port for PostgreSQL, using the env var we set earlier.

Run and Confirm

Finally, the db container is configured and ready to run; let’s start the project with the command docker-compose up. After a few minutes the resulting screen should look like this:

Let’s confirm everything is working correctly by querying the db directly. To do this we’ll use the psql tool (installed as a prerequisite) in conjunction with the NOW() PostgreSQL function. In order to query the database for the current time, open a new terminal window and run the command psql -h 0.0.0.0 -p 5432 -d postgres -U myadmin -c "SELECT NOW()".

Note: Remember to replace the -U myadmin with whatever username you set in your .env file earlier. Similarly, use your own password when prompted.

The results should look like this (with the correct time):

🎉 Great job setting up the db container! To stop the project press Ctrl + C (or run docker-compose down if you ran in --detach mode). We can also double-check that it is no longer running with the command docker container ls, or by running the psql command again to make sure the connection fails.

Note: Other useful commands for debugging are docker-compose down -v, which clears the container volumes, and docker-compose up --build, which rebuilds the images before starting.

API Setup

The goal of this section is to create an Express app with a single HTTP GET endpoint named /now. This endpoint will query the db for the present time and return the result.

Express

First let’s create an Express app by following these steps:

  1. Initialize Node App: Create a new directory named api within the present directory, then initialize Node by running npm init -y within it.
  2. Install Dependencies: We’ll need a few packages to run Express, connect to a PostgreSQL database, and allow cross-origin resource sharing. Install these by running npm i express pg cors in the api directory.
  3. Install Dev Dependencies: We’ll also need a dev dependency to allow hot-reloading. Similarly, install it by running npm i -D nodemon.
  4. Create Run Script: Add "dev": "nodemon" into the scripts section of the package.json file. Then change the main line to "main": "src/index.js".
  5. Write Source Code: Now we’re ready to write the Express app. The JavaScript below is based on the Getting Started documentation for both Express and node-postgres. Create a directory named src in the api directory, then create a file named index.js in src with the following:
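
A minimal sketch of the app; it assumes node-postgres will read its connection parameters from the PG* env vars we’ll set in a later section (the original code may differ in detail):

    // api/src/index.js
    const express = require("express");
    const cors = require("cors");
    const { Pool } = require("pg");

    // With no config object, node-postgres reads PGHOST, PGPORT, PGUSER,
    // PGPASSWORD, and PGDATABASE from the environment.
    const pool = new Pool();

    const app = express();
    app.use(cors());

    // GET /now: query the db for the current time and return the result
    app.get("/now", async (req, res) => {
      const result = await pool.query("SELECT NOW()");
      res.send(result.rows[0].now);
    });

    const port = process.env.PORT || 5000;
    app.listen(port, () => console.log(`api listening on port ${port}`));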

Environment Variables

Next let’s update the .env file. Append the following to the .env file:

Dockerfile

Let’s create a file named Dockerfile in the api directory to configure the Docker image. Copy the following into the file (based of the Node Docker Hub documentation):

Note: We only COPY the package*.json dependency files (specifically package.json and package-lock.json) because we’ll mount the source directory as a volume in the next section to enable hot-reloading.

Dockerignore

Now let’s create a .dockerignore file in the api directory to make our container more light-weight. Copy the following into the file: node_modules.

Docker Compose

As we discussed earlier, several of these env vars have specific names: NODE_ENV is defined by Node and will be used by both the api and ui, and the PG env vars are used by node-postgres as connection parameters. You may also notice the PGHOST value isn’t a traditional host (e.g. 0.0.0.0) because we are leveraging the docker-compose orchestration to communicate internally. Let’s configure the container orchestration by adding the following to the docker-compose.yml file:
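
A sketch of the new service, using the env var names established so far (PGDATABASE=postgres matches the default database we queried with psql earlier):

    # added under the existing services: key, at the same level as db
    api:
      depends_on:
        - db
      build: ./api
      environment:
        - NODE_ENV
        - PORT=$API_PORT
        - PGHOST=db
        - PGPORT=$DB_PORT
        - PGUSER=$DB_USER
        - PGPASSWORD=$DB_PASSWORD
        - PGDATABASE=postgres
      ports:
        - $API_PORT:$API_PORT
      volumes:
        - ./api:/app
        - /app/node_modules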

Note: Be sure to use the correct spacing, because .yml files are picky. The api service should be at the same level as the db service from before.

Here’s a quick breakdown of what this configuration does:

  • depends_on: — db Waits to start the api container until after the db container has started. The two services can then communicate through Docker’s internal network, which is why PGHOST can simply be set to db.
  • build: ./api Specifies the context path (including the Dockerfile and .dockerignore).
  • environment: — NODE_ENV — PORT=$API_PORT ... Passes specific env vars into the container, so it isn’t polluted with unneeded extras.
  • ports: — $API_PORT:$API_PORT Maps the container port to the local machine.
  • volumes: — ./api:/app Bind-mounts the local ./api source directory to the container’s internal /app directory. This allows the container to detect changes to files, enabling hot-reloading.
  • volumes: — /app/node_modules Mounts an anonymous volume over the node_modules directory, masking that path in the bind mount from the previous step. This ensures the container keeps the binaries built for its Linux base image rather than ones from the local OS. It also persists the content of the container’s node_modules directory, so the dependencies don’t have to be re-installed with each restart.

Run and Confirm

Finally, the Express app is configured and ready to run. Again, let’s start the project with the command docker-compose up. This time more text will pop up, but the end result should look like this:

Note: Your logs may be out of order depending on initialization time.

Let’s confirm everything is working correctly by connecting to the /now endpoint with a HTTP GET request. We do this by opening a browser and copying http://localhost:5000/now into the address bar. Your results should appear in the browser, and look like "YYYY-MM-DDTHH:MM:SS...".

🎉🎉 Hurray! We’ve set up the project’s back end, with both the db and api running and communicating through Docker.

UI Setup

The goal of this section is to create a React app, consisting of a button and paragraph. Pressing the button will send an HTTP GET request to the /now api endpoint, returning the current database time. Then the result will be displayed in the paragraph.

React App

First let’s create a React app that is capable of connecting to the api following these steps:

  1. Create React App: Create the React app by running npx create-react-app ui in the present directory. This will create the React app within a new directory named ui (it may take a few minutes to complete).
  2. Write App Component: Write the component by replacing the contents of the ui/src/App.js file with the following:
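
A sketch of the component; the fetch URL assumes the REACT_APP_API_HOST env var introduced in the next section, plus the api port from earlier:

    // ui/src/App.js
    import { useState } from "react";

    function App() {
      const [time, setTime] = useState("");

      // Ask the api's /now endpoint for the current database time.
      const getTime = async () => {
        const host = process.env.REACT_APP_API_HOST || "localhost";
        const response = await fetch(`http://${host}:5000/now`);
        setTime(await response.text());
      };

      return (
        <div>
          <button onClick={getTime}>Get Time</button>
          <p>{time}</p>
        </div>
      );
    }

    export default App;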

Environment Variables

Next let’s update the .env file with the new env vars we’ll need, where REACT_APP_API_HOST follows a specific naming convention.

Dockerfile

Now let’s create a file named Dockerfile in the ui directory with the following content:

Note: This file should be an exact copy of the one we created in the api directory with the exception of the CMD value.

Dockerignore

Now let’s create a .dockerignore file in the ui directory to make our container more light-weight. Unlike the api, we’ll also add build for when we need it in the deployment section. Copy the following into the file:

Docker Compose

Next let’s configure the ui within the orchestration by adding the following to the docker-compose.yml file:

This configuration follows the patterns established in the api, so we won’t review every line. However, there is one additional parameter, stdin_open: true, which allows the dev server to keep running once the container is started (instead of exiting with code 0).

Run and Confirm

Finally, the React app is configured and ready to run. Again, let’s start the project with the command docker-compose up. This time there will be more text, but the end result should look like this:

Confirm everything is working correctly by loading the UI and pressing the button. We do this by opening a browser and copying http://localhost:3000 into the address bar. The result should appear in the browser and look like the screenshot in the Getting Started section (after pressing the button).

🎉🎉🎉 Congratulations! We’ve now successfully set up the project’s entire development environment, running completely within Docker!

Production Deploy

The goal of this section is to create an Ubuntu virtual machine to host the project online. We’ll also need to add production configuration for the container builds, orchestration, and environment variables.

In this orchestration the db and api only have minor changes, while the ui is completely refactored. Specifically we’ll replace the single ui container (running the dev server) with a ui_build container that creates a static build and mounts it to the VM, and a ui_server container which runs NGINX to host the static build. Below is a diagram of the orchestration:

Solid lines show container connections, and dashed lines show volumes

API Script

First let’s add a new script to the api package.json file: "prod": "node .". This runs the app with node rather than nodemon, since hot-reloading should not be enabled in a production environment.

Dockerfiles

Next let’s configure the production orchestration by creating a new file named Dockerfile.prod within the api directory with the following content:

This file is based on our development Dockerfile with the following differences:

  • Tag the image as -alpine, making it more lightweight. We’ll do this with all our production images.
  • COPY all files (not just package*.json) because we don’t need hot-reloading, so there’s no need to mount the source directory as a volume.
  • RUN npm i with the --production flag so unneeded dependencies aren’t installed (in our case this excludes nodemon).
  • Change the CMD to the prod script we created above.

Similarly, let’s create another file named Dockerfile.prod in the ui directory with the following content:

Docker Compose

Let’s create a file named docker-compose.prod.yml in the present directory with the following:

Take the time to read through and understand this file. We have covered each of these parameters in previous sections, with the following exceptions:

  • build: context: and build: dockerfile: Dockerfile.prod allow us to specify a custom Dockerfile name (Dockerfile.prod).
  • The ui_server’s ports: — 80:80 are intentionally non-dynamic because 80 is the default port for HTTP. It’s possible to change the NGINX listening port by overwriting the nginx.conf, but that’s outside the scope of this project (learn more in the NGINX Docker Hub documentation).

Run and Confirm Locally

Now that we’ve setup the production orchestration, let’s run it locally with the command: docker-compose -f docker-compose.prod.yml up. Once the build is complete, confirm everything is working correctly by opening a browser and copying http://localhost into the address bar (remember the default port in our production environment is 80). The result should look and behave identically to the development environment.

Setup VM

We’re ready to setup the online VM. Specifically we will run an Ubuntu VM with Docker installed. To host the VM I’ll be using the Vultr platform so I recommend beginners follow along with Vultr. Alternatively there are many ways to do this (Digital Ocean, AWS, Google Cloud Platform, etc.) so feel free to use whatever platform you like. If you don’t have an account yet, please use one of the affiliate links below to gain some free credit:

After logging in to Vultr, click the Deploy a New Server button to get started. On the create page, use the following configuration:

  • Server: Cloud Compute.
  • Location: Optional, I selected New York.
  • Type: 64 bit OS, Ubuntu 20.10.
  • Size: 25 GB SSD, $5/month (1 CPU, 1024MB memory).
  • Hostname & Label: Optional, I wrote Present.

Click Deploy Now and wait for the server to start.

Once the VM has started, ssh into it by running ssh root@<YOUR_IP> (you can find your IP and password in the Vultr Server Details page). After connecting to the VM, install Docker by following the Docker installation instructions for Ubuntu, and install Docker Compose by following the Docker Compose installation instructions.

Next, we’ll copy the source code into the VM. The recommend tool for this git (if you have not been following along with git feel free to use my repo). However if you are opposed to using git, you can scp the entire project from your local machine.

Environment Variables

Now we have the entire project source code in the ~/present directory on the VM. The last touch is to write the production env vars in a new file (this isn’t committed to git since it shouldn’t be publicly available). Let’s create a new file named .env.prod with the following structure:
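
Under the naming used throughout this guide, the file might look like:

    NODE_ENV=production
    DB_PORT=5432
    DB_USER=myadmin
    DB_PASSWORD=mypassword
    API_PORT=5000
    REACT_APP_API_HOST=<YOUR_IP>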

Remember to replace your credentials with more secure ones, and replace <YOUR_IP> with the IP of your VM. The only excluded env var is the UI_PORT, since HTTP defaults to port 80 as discussed earlier.

Run and Confirm

Let’s run the project with our new .env and docker-compose files with the following command:

Blast off! 🚀 This will take a few minutes to start, but once the build is complete you will be able to view the project in your browser at http://<YOUR_IP>. Check it out on multiple computers, even your phone!

Conclusion

Please remember that this project is designed to be a minimal scaffold to build your ideas on, and learn about more complex orchestrations. I strongly encourage diving into Docker Hub and adding other containers into the orchestration (some good starting places are MongoDB or Redis). I hope you found this article useful and please leave a comment on how you were able to expand on the project!
