Welcome to the World of Containers: How to Set Up a Full Stack App Using Docker

Kenny B Cheng
Digital Products Tech Tales
Jun 20, 2023

So, first things first. What is Docker?

Docker is a platform that allows you to package up your application code and all of its dependencies into these self-contained mini-images that you can then pull and scale up on any machine with the Docker platform installed.

Why use it?

In a smaller, more development-focused sense, Docker makes it extremely easy to run your application on different machines and environments. It packages everything you need to run your code into an image and as long as you run it on a Docker platform, it should run the same way every time.
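
For example (a minimal sketch, unrelated to the project we are about to build): once an image exists, running it anywhere is a single command, and it behaves the same on any machine with Docker installed.

# pull the public nginx image and map container port 80 to local port 8080
docker run --rm -p 8080:80 nginx:alpine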

In a larger, architectural sense, it makes scaling applications easier. With the rise of microservices in modern tech, containers have become the standard unit for packaging services, scaling them out and moving them from platform to platform.

With that being said, let’s get started on a relatively straightforward example that maps out the basics of Docker and shows the advantages of using it.

  • List your services (e.g. front-end app, back-end API)
  • For each service, create a Dockerfile containing the instructions for building that service’s container image
  • Create a docker-compose.yml file that orchestrates all of the container services

Let’s say we want to build a full stack app that lists movies, actors and directors. This app consists of a database, a back-end and a front-end. We can use Docker to orchestrate standing up the entire stack with a couple of commands. And that complete app should then work on anyone else’s machine as well, without much fuss.

We have the following repo: three folders and a docker-compose.yml file. The api folder is the back-end, app is the front-end, and db is the database.

├── api
│ ├── node_modules
│ ├── Dockerfile
│ ├── .env
│ ├── package.json
│ ├── package-lock.json
│ ├── server.js
├── app
│ ├── node_modules
│ ├── public
│ ├── src
│ │ ├── App.js
│ ├── Dockerfile
│ ├── package.json
│ ├── package-lock.json
├── db
│ ├── Dockerfile
│ ├── init.sql
│ ├── seed.sql
├── docker-compose.yml

As you can see, there is a Dockerfile in each of our service folders, and we also have a docker-compose.yml file that orchestrates all of them together.

API

├── api
│ ├── node_modules
│ ├── Dockerfile
│ ├── .env
│ ├── package.json
│ ├── package-lock.json
│ ├── server.js

Our API is written in Node.js, so we have the node_modules folder, as well as our package management data in package.json. We then have our actual API server in server.js, as well as our Dockerfile.

This is our package.json file. It has “cors” to allow communication with the front-end, “dotenv” to easily read .env files, “express” as our API framework, and “knex” and “pg” for our database interactions.

//package.json

{
  "name": "api",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "cors": "^2.8.5",
    "dotenv": "^16.0.1",
    "express": "^4.18.1",
    "knex": "^2.2.0",
    "pg": "8.7.3"
  }
}

With your package.json filled out, you should then be able to run the following to install all of the packages.

npm i 

The server.js should look something like this:

//server.js

//Declaring our imports
const express = require('express');
const cors = require('cors');
const knex = require('knex');
require('dotenv').config();

//Grabbing keys from .env file
const {
  POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_DB, POSTGRES_HOST,
} = process.env;

//Sets up the config settings for the database connection
const config = {
  development: {
    client: 'pg',
    connection: {
      database: POSTGRES_DB,
      user: POSTGRES_USER,
      password: POSTGRES_PASSWORD,
      host: POSTGRES_HOST,
      port: '5432',
    },
  },
};
const database = knex(config.development);

//Creates the express app and adds on cors and JSON body parsing
const app = express();
app.use(cors());
app.use(express.json());

//These are the actual API endpoints
app.get('/', (req, res) => {
  res.send('APIs are working');
});

app.get('/movies', async (req, res) => {
  try {
    const movies = await database('movie')
      .leftJoin('director', 'director.id', '=', 'movie.director')
      .select(['*', 'movie.id']);
    res.send(movies);
  } catch (error) {
    res.send(error);
  }
});

app.get('/directors', async (req, res) => {
  try {
    const directors = await database('director').select('*');
    res.send(directors);
  } catch (error) {
    res.send(error);
  }
});

app.get('/actors', async (req, res) => {
  try {
    const actors = await database('actor').select('*');
    res.send(actors);
  } catch (error) {
    res.send(error);
  }
});

app.post('/movies', async (req, res) => {
  try {
    const movie = await database('movie')
      .insert(req.body)
      .returning('*');
    res.send(movie);
  } catch (error) {
    res.send(error);
  }
});

app.post('/actors', async (req, res) => {
  try {
    const actor = await database('actor')
      .insert(req.body)
      .returning('*');
    res.send(actor);
  } catch (error) {
    res.send(error);
  }
});

app.post('/directors', async (req, res) => {
  try {
    const director = await database('director')
      .insert(req.body)
      .returning('*');
    res.send(director);
  } catch (error) {
    res.send(error);
  }
});

app.listen(3080, () => {
  console.log('App has started up');
});

The .env file holds the database credentials; anything will do for local development. Note that POSTGRES_HOST is set to postgres, which matches the name of the database service we will define in docker-compose.yml, so the API can reach the database over Docker's internal network.

POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=postgres
POSTGRES_HOST=postgres
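
If you want a quick sanity check before any containers are involved, you can run the API directly; the root route doesn't touch the database, so it responds even without Postgres running:

node server.js
# in another terminal
curl http://localhost:3080/
# -> APIs are working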

Last thing for the API is the Dockerfile, which should look something like this

# This declares we want to build from a node image
FROM node:alpine

# This declares the working directory of the container we are building
WORKDIR /usr/src/app

# We want to copy over our npm dependencies
COPY package*.json ./

# We want to then install those dependencies
RUN npm install

# Next we want to copy over everything in our api folder
COPY . .

# We want to expose a port to make it hittable
EXPOSE 3080

# We then want to actually run our server.js,
# so that it's up and running as an API server
CMD [ "node", "server.js" ]

That should take care of the API side of things.
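
Before moving on, you can sanity-check this image on its own (the movies-api tag below is just an arbitrary name for this example):

# from the root of the repo
docker build -t movies-api ./api
docker run --rm -p 3080:3080 movies-api

The database endpoints will fail until Postgres is reachable, but the container should start and the root route should respond on http://localhost:3080/.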

Next up, let’s look at the front-end. This is the app folder.

App

├── app
│ ├── node_modules
│ ├── public
│ ├── src
│ │ ├── App.js
│ │ ├── App.css
│ ├── Dockerfile
│ ├── package.json
│ ├── package-lock.json

Most of it is boilerplate created by running the following on the command line

npx create-react-app app

Similarly to our API, the main points of interest are the package.json file, the app itself (a pretty straightforward App.js file), and the Dockerfile.

The package.json file should be very similar to the boilerplate, with the addition of the “axios” and “@mui/material” packages. Axios is an HTTP request library and MUI gives us out-of-the-box components. So, something like:

//package.json

{
  "name": "app",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@emotion/react": "^11.10.0",
    "@emotion/styled": "^11.10.0",
    "@mui/material": "^5.10.1",
    "@testing-library/jest-dom": "^5.16.5",
    "@testing-library/react": "^13.3.0",
    "@testing-library/user-event": "^13.5.0",
    "axios": "^0.27.2",
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-scripts": "5.0.1",
    "web-vitals": "^2.1.4"
  },
  ...

You can install them with

npm i axios
npm i @mui/material @emotion/react @emotion/styled

App.js should look like so

//App.js

import './App.css';
import TextField from '@mui/material/TextField';
import Button from '@mui/material/Button';
import ButtonGroup from '@mui/material/ButtonGroup';
import React, {useEffect, useState} from 'react';
import Card from "@mui/material/Card";
import { CardContent, Typography } from '@mui/material';
import axios from 'axios';

const cardStyle = {
maxWidth:375,
minWidth:300
}

function App() {

const [category, setCategory] = useState("movies");
const [movies, setMovies] = useState([]);
const [postMovie, setPostMovie] = useState({});
const [directors, setDirectors] = useState([]);
const [postDirector, setPostDirector] = useState({});
const [actors, setActors] = useState([]);
const [postActor, setPostActor] = useState({});
const [isLoading, setIsLoading] = useState(true);

useEffect(() => {
axios.get(`http://localhost:3080/movies`).then(res=>{
setMovies(res.data);
setIsLoading(false);
console.log(movies);
})
},[category])

const chooseMovies = async () => {
setCategory("movies");
axios.get(`http://localhost:3080/movies`).then(res=>{
setMovies(res.data);
setIsLoading(false);
})
}

const postMovies = async () => {
await axios.post(`http://localhost:3080/movies`, postMovie);
const {data} = await axios.get(`http://localhost:3080/movies`);
setMovies(data);
}

const chooseDirectors = async () => {
setCategory("directors");
axios.get(`http://localhost:3080/directors`).then(res=>{
setDirectors(res.data);
setIsLoading(false);
})
}

const postDirectors = async () => {
await axios.post(`http://localhost:3080/directors`, postDirector);
const {data} = await axios.get(`http://localhost:3080/directors`);
setDirectors(data);
}

const chooseActors = async () => {
setCategory("actors");
axios.get(`http://localhost:3080/actors`).then(res=>{
setActors(res.data);
setIsLoading(false);
})
}

const postActors = async () => {
await axios.post(`http://localhost:3080/actors`, postActor);
const {data} = await axios.get(`http://localhost:3080/actors`);
setActors(data);
}


const renderPost = () => {
if (category === 'movies')
return (
<div>
<TextField style={{padding:10}} label='title' variant='filled' onChange={event=>setPostMovie({...postMovie,title:event.target.value})}/>
<TextField style={{padding:10}} label='year' variant='filled' onChange={event=>setPostMovie({...postMovie,year:event.target.value})}/>
<Button
style={{padding:20}}
variant='contained'
onClick={()=>{
postMovies()
}}>Submit</Button>
</div>
)
else if (category === 'actors')
return (
<div>
<TextField style={{padding:10}} label='first_name' variant='filled' onChange={event=>setPostActor({...postActor,first_name:event.target.value})} />
<TextField style={{padding:10}} label='last_name' variant='filled' onChange={event=>setPostActor({...postActor,last_name:event.target.value})}/>
<Button
style={{padding:20}}
variant='contained'
onClick={()=>{
postActors()
}}
>Submit</Button>
</div>
)
else if (category === 'directors')
return (
<div>
<TextField style={{padding:10}} label='first_name' variant='filled' onChange={event=>setPostDirector({...postDirector,first_name:event.target.value})}/>
<TextField style={{padding:10}} label='last_name' variant='filled' onChange={event=>setPostDirector({...postDirector,last_name:event.target.value})}/>
<Button
style={{padding:20}}
variant='contained'
onClick={()=>{
postDirectors()
}}
>Submit</Button>
</div>
)
}

const renderComp = () => {
if (category === 'movies')
return (
<div className="center-p">
{movies && movies.map(movie => {
return (
<Card key={movie.id} sx={cardStyle} className="card">
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
Movie Title: {movie.title}
</Typography>
</CardContent>
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
Year: {movie.year}
</Typography>
</CardContent>
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
Director: {`${movie.first_name} ${movie.last_name}`}
</Typography>
</CardContent>
</Card>
)
})}
</div>
)
else if (category === 'actors')
return (
<div className="center-p">
{actors && actors.map(actor => {
return (
<Card key={actor.id} sx={cardStyle} className="card">
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
First Name: {actor.first_name}
</Typography>
</CardContent>
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
Last Name: {actor.last_name}
</Typography>
</CardContent>
</Card>
)
})}
</div>
)
else if (category === 'directors')
return (
<div className="center-p">
{directors && directors.map(director => {
return (
<Card key={director.id} sx={cardStyle} className="card">
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
First Name: {director.first_name}
</Typography>
</CardContent>
<CardContent>
<Typography sx={{fontSize:14}} gutterBottom>
Last Name: {director.last_name}
</Typography>
</CardContent>
</Card>
)
})}
</div>
)
}

return (
<div className="App">
<header className="App-header">
<ButtonGroup variant="contained">
<Button onClick={()=>chooseMovies()}>Movies</Button>
<Button onClick={()=>chooseDirectors()}>Directors</Button>
<Button onClick={()=>chooseActors()}>Actors</Button>
</ButtonGroup>
</header>
{renderPost()}
<div>
{category}
</div>
{renderComp()}
</div>
);
}

export default App;
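
One thing worth calling out: the API URL is hard-coded to http://localhost:3080 in every call. A common refinement (not part of the code above) is to read it from a Create React App environment variable, for example:

// Hypothetical refinement: read the API base URL from a build-time env var,
// falling back to the value used in this article.
const API_URL = process.env.REACT_APP_API_URL || 'http://localhost:3080';

// ...and then use it in the calls:
axios.get(`${API_URL}/movies`).then(res => {
  setMovies(res.data);
  setIsLoading(false);
});

Since react-scripts bakes REACT_APP_-prefixed variables in at build time, you would pass REACT_APP_API_URL when the image is built rather than at runtime.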

App.css is set up like so.

.App {
  text-align: center;
}

.center-p {
  display: flex;
  flex-direction: column;
  align-items: center;
  justify-content: center;
}

.card {
  margin: 5vh;
}

And now for the Dockerfile. This one is quite similar to our API Dockerfile.

FROM node:alpine

WORKDIR /usr/src/app

COPY package*.json .

RUN npm install

COPY . .

EXPOSE 3000

CMD ["npm", "start"]

With the Dockerfile set up for the app folder, we can move on to our final piece. The database.

DB

├── db
│ ├── Dockerfile
│ ├── init.sql
│ ├── seed.sql

The db folder’s three parts are the init.sql, the seed.sql and once again, the Dockerfile.

The init.sql file initializes all the tables for the database: the movie, actor and director tables, plus an actor_movie join table. The seed.sql file then populates those tables with dummy seed data.

They look like so. First, init.sql:

-- init.sql

CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

CREATE TABLE "movie"(
  "id" UUID NOT NULL DEFAULT uuid_generate_v4(),
  "title" VARCHAR(255) NOT NULL,
  "director" UUID NULL,
  "year" INTEGER NOT NULL
);
ALTER TABLE
  "movie" ADD PRIMARY KEY("id");

CREATE TABLE "director"(
  "id" UUID NOT NULL DEFAULT uuid_generate_v4(),
  "first_name" VARCHAR(255) NOT NULL,
  "last_name" VARCHAR(255) NOT NULL
);
ALTER TABLE
  "director" ADD PRIMARY KEY("id");


CREATE TABLE "actor"(
  "id" UUID NOT NULL DEFAULT uuid_generate_v4(),
  "first_name" VARCHAR(255) NOT NULL,
  "last_name" VARCHAR(255) NOT NULL
);
ALTER TABLE
  "actor" ADD PRIMARY KEY("id");


CREATE TABLE "actor_movie"(
  "actor_id" UUID NOT NULL,
  "movie_id" UUID NOT NULL,
  UNIQUE ("actor_id", "movie_id")
);

ALTER TABLE
  "movie" ADD CONSTRAINT "director_foreign" FOREIGN KEY("director") REFERENCES "director"("id");
ALTER TABLE
  "actor_movie" ADD CONSTRAINT "actor_foreign" FOREIGN KEY("actor_id") REFERENCES "actor"("id");
ALTER TABLE
  "actor_movie" ADD CONSTRAINT "movie_foreign" FOREIGN KEY("movie_id") REFERENCES "movie"("id");

And then seed.sql. The \gset lines are psql meta-commands: each SELECT generates a UUID and \gset stores it in a variable (d1, m1 and so on), so the same id can be reused in later inserts to wire up the foreign keys.

SELECT uuid_generate_v4() AS d1 \gset
SELECT uuid_generate_v4() AS d2 \gset
SELECT uuid_generate_v4() AS d3 \gset
SELECT uuid_generate_v4() AS d4 \gset


insert into "director" (id, first_name, last_name)
values
(:'d1','Denis', 'Villenueve'),
(:'d2','Halina', 'Reijn'),
(:'d3','David', 'Fincher'),
(:'d4','Harald', 'Zwart');

SELECT uuid_generate_v4() AS m1 \gset
SELECT uuid_generate_v4() AS m2 \gset
SELECT uuid_generate_v4() AS m3 \gset
SELECT uuid_generate_v4() AS m4 \gset

insert into "movie" (id, title, director, year)
values
(:'m1', 'Dune', :'d1', 2021),
(:'m2', 'Bodies Bodies Bodies', :'d2', 2022),
(:'m3', 'The Social Network', :'d3', 2014),
(:'m4', 'Agent Cody Banks', :'d4', 2003);

SELECT uuid_generate_v4() AS a1 \gset
SELECT uuid_generate_v4() AS a2 \gset
SELECT uuid_generate_v4() AS a3 \gset
SELECT uuid_generate_v4() AS a4 \gset

insert into "actor" (id, first_name, last_name)
values
(:'a1', 'Timothy','Chalamet'),
(:'a2', 'Rachel', 'Sennot'),
(:'a3', 'Andrew', 'Garfield'),
(:'a4', 'Frankie', 'Muniz');

insert into "actor_movie" (actor_id, movie_id)
values
(:'a1',:'m1'),
(:'a2',:'m2'),
(:'a3',:'m3'),
(:'a4',:'m4');
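
With the tables created and seeded, the knex query behind the /movies endpoint boils down to roughly this SQL (a simplified illustration):

SELECT movie.*, director.first_name, director.last_name
FROM movie
LEFT JOIN director ON director.id = movie.director;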

And then the Dockerfile

FROM postgres:latest

ADD init.sql /docker-entrypoint-initdb.d
ADD seed.sql /docker-entrypoint-initdb.d

RUN chmod a+r /docker-entrypoint-initdb.d/*
EXPOSE 5432

The first line grabs a Postgres image and builds your container from there. The next two lines copy init.sql and seed.sql into the image's docker-entrypoint-initdb.d folder, which the Postgres image runs automatically when the database is first initialized, and the chmod adds read permissions so the entrypoint can access the files. The last line exposes the port to allow access to the database.

With all of the services set up, the last piece of the puzzle is…

docker-compose.yml

This file basically orchestrates the building and setting up of all the containers we were previously working on. This file allows us to stand up the database, the back-end and the front-end with a couple of commands.

It looks like so

version: '3.8'
services:
  postgres:
    build: ./db
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=postgres
    ports:
      - '5432:5432'
  api:
    build: ./api
    ports:
      - "3080:3080"
  app:
    build: ./app
    container_name: app
    ports:
      - "3000:3000"
    stdin_open: true

In here, we define all of the services of our application: the postgres database, the api and the app.

Each service has a build value. For postgres it is ./db, which tells Compose to go into the db folder, find the Dockerfile there and use it to build that service's container image. The same goes for api and app.

The postgres service also has an environment section, which sets the database user, password and database name.

Every service also maps a port to a corresponding port on localhost, in HOST:CONTAINER order. So, for example, "3080:3080" takes port 3080 inside the container, where the API is listening, and forwards it to port 3080 on your local machine. This is what lets us reach everything in the containers through localhost.
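
One optional refinement that is not in the file above: Compose starts services in parallel by default, so the api container can come up before Postgres is ready to accept connections. Adding depends_on to the api service at least makes the startup order explicit:

  api:
    build: ./api
    depends_on:
      - postgres   # start the database container before the API
    ports:
      - "3080:3080"

Note that depends_on only controls start order, not readiness; for a small project like this, simply restarting the API container (or retrying the request) is usually enough.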

With the docker-compose file done, we can now start the whole thing up.

Running the docker-compose file

Make sure you’re in the root of your repo where you can see the docker-compose.yml. Make sure your Docker engine is running.

You should now be able to run these three commands to spin up your containers.

First up is

docker compose down

Since this should be the first time running it, there will be nothing to stop, but this command stops and removes any containers previously started from this compose file.

Once they are down you can run

docker compose build

This builds the Docker images for all three services.
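
If you change a Dockerfile or a dependency and want to be sure nothing stale is reused, you can force a full rebuild:

docker compose build --no-cache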

Once that is finished, we should be able to stand up our containers with

docker compose up

The services start up and stream their logs to your terminal.
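
If you would rather get your terminal back, you can run everything detached and check on it with the usual Compose commands:

docker compose up -d
docker compose ps
docker compose logs -f api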

You should now be able to visit the address below and see the website fully functioning, with API calls to the back-end and a database that saves everything.

http://localhost:3000/
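
You can also hit the API directly to confirm the back-end and database are wired up, for example (the POST body just needs the first_name and last_name columns from the director table):

curl http://localhost:3080/movies
curl -X POST http://localhost:3080/directors \
  -H "Content-Type: application/json" \
  -d '{"first_name": "Greta", "last_name": "Gerwig"}'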

So that’s pretty much it. Docker is a very powerful tool to have in modern application development and hopefully this was a good introduction to what it can do and how to use it.

If you want to learn more about Docker, here are some great resources.

  • Downloading Docker
