Deploy ML model using Docker and Google App Engine
This article focuses on deploying a Machine Learning model using Docker containers, Streamlit, and Google App Engine. If you are interested in deploying your model using Google Cloud Functions, please check out my article here.
If you are interested in jumping straight to the code, please visit here.
Google App Engine
Google Cloud Platform is very user friendly, and deploying your app with App Engine is as simple as clicking a button: a single command, gcloud app deploy, does it all. The app scales automatically, and you only pay for what you use.
Streamlit
Building web apps with Streamlit is easy and doesn't require much web development framework code. With just a few lines of code, you can develop an app in minutes. In this article, our model's inference code will interact with Streamlit as a web application.
Docker
Docker helps to wrap your app and all its dependencies inside a container which you can then run on any environment. We will use Docker to containerize our app and deploy to Google App Engine.
Here are some prerequisites before you begin:
- A GCP account. If you don’t have one, please sign up here; new accounts come with $300 in free credit.
- Install the Google Cloud SDK to interact with Google Cloud from your local machine.
- Install Docker from here.
- Install Streamlit from here.
- A trained ML model.
For brevity, I use a pre-trained ML model that takes an image as input and outputs a caption for that image. Similar code to train an image captioning model can be found here.
Step 1 : Run Streamlit App locally
Create a folder on your local machine and clone all the required files from here. We will test our Streamlit application locally before deploying it to App Engine.
Let’s take a look at app.py
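The original embedded gist is not reproduced here, but a minimal sketch of app.py might look like the following. Note that the tokenizer/model file names and the generate_caption() helper are placeholders; substitute your own inference code.

```python
# Sketch of app.py: a minimal Streamlit front end for an image-captioning model.
# File names and generate_caption() are placeholders for your own inference code.
import pickle

import streamlit as st
from PIL import Image

@st.cache_resource  # load heavy artifacts once per server process
def load_artifacts():
    with open("tokenizer.pkl", "rb") as f:
        tokenizer = pickle.load(f)
    model = ...  # e.g. load your trained captioning model here
    return tokenizer, model

st.title("Image Captioning")
st.header("Upload an image to generate a caption")

uploaded = st.file_uploader("Choose an image", type=["jpg", "jpeg", "png"])
if uploaded is not None:
    image = Image.open(uploaded)
    st.image(image, use_column_width=True)
    tokenizer, model = load_artifacts()
    caption = generate_caption(model, tokenizer, image)  # your inference code
    st.write(caption)
```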
We have used Streamlit's widgets to create a title, headers, an image uploader, and so on. You can add more widgets as necessary.
We have loaded the tokenizer file and added the model's inference code in app.py. Run streamlit run app.py from your terminal, which will open the Streamlit web app in your browser.
Step 2 : Dockerize Streamlit App and test locally
We need a Dockerfile which contains instructions to build a container. Let’s take a look at our Dockerfile.
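Since the embedded file is not shown here, a minimal Dockerfile along these lines (base image, file names, and port are illustrative) would work:

```dockerfile
# Sketch of a Dockerfile for the Streamlit app; names and versions are illustrative
FROM python:3.9-slim

WORKDIR /app

# Install Python dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the app code, model, and tokenizer into the image
COPY . .

# App Engine expects the container to listen on port 8080
EXPOSE 8080
CMD ["streamlit", "run", "app.py", "--server.port=8080", "--server.address=0.0.0.0"]
```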
The instructions are self-explanatory. Next, we need to build the Docker image from the Dockerfile. Run docker build -f Dockerfile -t [docker_image_name] . from your terminal. This can take a while, as Docker executes each instruction in the Dockerfile in order.
Once the build is complete, you can run the docker image using this command docker run -p 9999:8080 -ti [docker_image_name]
This command runs the Docker image and maps port 9999 on your local machine to container port 8080. The Streamlit web app, served from the container (a running instance of the image), will then be available in your browser at http://localhost:9999/.
As a note, you can also list all Docker containers (running and stopped) with the command docker ps -a
If everything works until this step without any errors on your local machine, we can proceed to the last step which is to deploy our app on App Engine.
Step 3 : Deploy Streamlit app to Google App Engine
We need an app.yaml file, which contains the configuration settings for App Engine. You can find guidelines here and here.
You can leave the resources at their default settings if your model is simple. Since we use a Transformer here, inference requires more memory.
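For a containerized app on the App Engine flexible environment, the app.yaml could look roughly like this (the resource values are illustrative; tune them to your model's memory needs):

```yaml
# Sketch of app.yaml for a custom (Dockerfile-based) runtime; values are illustrative
runtime: custom
env: flex

resources:
  cpu: 2
  memory_gb: 8
  disk_size_gb: 20
```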
You can deploy the app using the command gcloud app deploy
This command builds the Docker image, pushes it to Google Container Registry, and deploys the container to App Engine. This can take a while to run. Once it completes, you can view the app at the destination URL.
Every time you run gcloud app deploy, it creates a new version, as you can see here. Remember to STOP serving once you have finished playing with the app.
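You can manage versions from the command line as well; for example (VERSION_ID is a placeholder taken from the list output):

```shell
# List all deployed versions of the app
gcloud app versions list

# Stop serving an old version to avoid ongoing charges
gcloud app versions stop VERSION_ID
```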
References -
Streamlit app widgets code inspired by Daniel Bourke's video on YouTube — https://www.youtube.com/watch?v=fw6NMQrYc6w&t=3867s
I am still exploring and learning deployment strategies. Feel free to share your suggestions in the comments.