How to make a movie recommender: connecting services using Docker

Docker is a great tool for deploying and testing systems while keeping things clean. The code for this project is here; the code for this tutorial is in the same repository.

Hello Docker

So what is Docker? Docker itself is a platform, but what we care about are Docker containers. A Docker container is a lightweight execution environment for running code in a “sealed” way: in effect, a small fake computer living inside another computer, running code without having to share the host’s configuration.

This means we can run our backend, database, model and frontend as containers without configuration or dependency conflicts, because each service runs in an independent container.

To install Docker on your computer, I recommend following this tutorial if you are using a Linux distribution. If you are using Windows, I would recommend this tutorial, and for Mac this one.

How to make a Docker container for our services

To create a Docker container, one must first create a Docker image, which is the blueprint for the container. The great thing about Docker is that you can use images created by other people or companies to make things easier.

Let’s start by creating our backend container. In the folder where our backend code lives, we have to create a file named Dockerfile (the default name for an image definition). The code for this image is the following:

FROM tiangolo/uvicorn-gunicorn-fastapi:python3.7
COPY . .
RUN pip install -r requirements.txt
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]

The first line is the base image, the image our container is built from; in this case we are using the official FastAPI image. Then we copy the files from our backend directory into the container’s directory, which is two files: our main.py and requirements.txt. Next we use pip to install the dependencies from our requirements file inside the container. The fourth line exposes port 8000 to the network, which will allow our container to communicate with the rest of the system. Finally, we run our backend code.
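One optional refinement: because COPY . . copies everything in the folder into the image, a .dockerignore file next to the Dockerfile (a standard Docker feature; this particular file is a suggestion, not part of the original project) keeps caches and local artifacts out of the build context:

__pycache__/
*.pyc
venv/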

Now we have to build our image. In a terminal, inside the folder where our Dockerfile exists, run the following command:

docker build -t tutorial-backend .

(Here tutorial-backend is the name, or tag, of the image; you can use whatever you want.)

And to run the container, publishing port 8000 so the service is reachable from the host (EXPOSE alone only documents the port, it does not publish it):

docker run -p 8000:8000 tutorial-backend

Let’s look at our frontend container. Since it is a Svelte application, it is different from our backend, but after looking at the code you will see how easy Docker is.

FROM node:12-alpine
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 5000
ENV HOST=0.0.0.0
RUN npm run build
CMD [ "npm", "start" ]
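Building and running the frontend image follows the same pattern as the backend (the tag tutorial-frontend is just an example; the port mapping matches the EXPOSE 5000 above):

docker build -t tutorial-frontend .
docker run -p 5000:5000 tutorial-frontend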

Communicating containers

It is great that we don’t have to worry about configuring all our services on one machine, but they still have to talk to each other. Like almost everything in programming, there is a tool for that, and that tool is called Docker Compose. If you are using a Linux machine, you need to install Docker Compose separately (you can use this tutorial); for everyone else it is already installed.

Docker Compose allows us to run multiple containers that can talk to each other. All we have to do is describe our containers, how they communicate, and other variables. To start, one must create a file named docker-compose.yml. Here is all the code we need to run everything together:

version: "3"
services:
  tensorflow-servings:
    image: tensorflow/serving:latest
    ports:
      - 8501:8501
    environment:
      - MODEL_NAME=movie_model
    volumes:
      - ./ai-model/model:/models/movie_model
    depends_on: [mongo]
  mongo:
    image: "mongo"
    container_name: "movieDB"
    environment:
      - MONGO_INITDB_DATABASE=movieRecommenderDB
    volumes:
      - ./mongo-volume:/data/db
    ports:
      - 27017:27017
  backend:
    build:
      context: backend/
      dockerfile: Dockerfile
    image: movie-backend
    ports:
      - 8000:8000
    depends_on: ["mongo"]
    environment:
      - MONGOHOST=mongo
      - TF_SERVING_HOST=tensorflow-servings
  frontend:
    build:
      context: frontend/
      dockerfile: Dockerfile
    image: movie-frontend
    ports:
      - 5000:5000
    depends_on: ["backend"]

The first line defines the Compose file format version; then we look at what is inside the services key. The first service is the TensorFlow Serving service; we start by giving our service a name, in this case tensorflow-servings. Next comes the image the service uses; here it is the official TensorFlow Serving image. Then we expose the ports, followed by the environment variables. The volumes key defines a shared folder; in this case we share the folder containing our trained model so TensorFlow Serving can load it. Finally, depends_on tells Docker Compose which containers need to start before this particular container.
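The MONGOHOST and TF_SERVING_HOST variables are how the backend finds the other services: inside the Compose network, each service is reachable by its service name. Here is a minimal sketch of how the backend might build its connection strings from those variables (the helper functions and localhost fallbacks are illustrative, not taken from the original repository):

```python
import os

# Inside Docker Compose, services resolve each other by service name;
# outside Compose, fall back to localhost for local development.
def mongo_uri() -> str:
    host = os.environ.get("MONGOHOST", "localhost")
    return f"mongodb://{host}:27017/movieRecommenderDB"

def tf_serving_url() -> str:
    host = os.environ.get("TF_SERVING_HOST", "localhost")
    # TensorFlow Serving's REST predict endpoint for the model
    return f"http://{host}:8501/v1/models/movie_model:predict"
```

With the compose file above, these resolve to mongodb://mongo:27017/... and http://tensorflow-servings:8501/... inside the network.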

Finally, to run everything, use the following command in the folder where the docker-compose.yml lives:

docker-compose up
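A few other Docker Compose commands are handy during development (these are standard flags, not specific to this project):

docker-compose up -d (start everything in the background)
docker-compose logs -f backend (follow one service’s logs)
docker-compose down (stop and remove the containers)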

Now, if you open a web browser and go to http://localhost:5000, you will be able to use your application and have fun watching movies.

Analytics Vidhya

Analytics Vidhya is a community of Analytics and Data Science professionals. We are building the next-gen data science ecosystem https://www.analyticsvidhya.com

Written by Juan Domingo Ortuzar
