Optimize a Docker image for production when the build needs extra build tools and system packages
I wanted to share, from my own experience, how to optimize a Docker image for production. But first, if you have not read my earlier story, I invite you to take a look at how to create a multi-stage build.
Explanation time!
That being said, let's take a TypeScript project as an example. Imagine the project needs to authenticate users, and you use the famous bcrypt package to hash passwords.
So far, you have used the following image to build it, and everything was fine until you tried to redeploy it and got errors about missing packages, binaries, and so on during the production stage.
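The original embed is not reproduced here, but a single-stage Dockerfile for such a project might look like the sketch below (file names and npm scripts are assumptions, not from the original). The failure comes from bcrypt shipping native bindings: when npm has to compile them, it needs build tools that the Alpine base image does not include.

```dockerfile
# Hypothetical single-stage build for a TypeScript app using bcrypt.
# (Sketch only; paths and scripts are assumed, adjust to your project.)
FROM node:alpine

WORKDIR /app
COPY package*.json ./
# bcrypt ships native bindings; if no prebuilt binary matches,
# npm compiles them here, which requires build tools.
RUN npm install
COPY . .
RUN npm run build

CMD ["node", "dist/index.js"]
```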
The easy fix would be to install the requirements in the production stage, since the packages are not present there:
```dockerfile
RUN apk --no-cache add --virtual builds-deps build-base python
```
But if you do that and publish your image, your future images will grow from 100 MB to 300 MB, simply because this command pulls 200 MB of extra data into your image! That's a lot!
Result
So, to avoid this in every other situation, I recommend you create a builder stage for production. That way you have a clean stage to compile the TypeScript files, another to build the production dependencies, and finally an image containing just the production files.
Here is an example of the Dockerfile I use in CI and production.
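The original Dockerfile embed is not shown here; a minimal sketch of such a multi-stage build, assuming an npm-based TypeScript project that compiles to `dist/` (stage names and scripts are my assumptions), could look like this:

```dockerfile
# 1. Builder stage: compile TypeScript and native addons such as bcrypt.
FROM node:alpine AS builder
WORKDIR /app
# Build tools live only in this stage, so they never reach the final image.
# (Newer Alpine releases package python3; the article's command used python.)
RUN apk --no-cache add --virtual builds-deps build-base python3
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# 2. Dependencies stage: install only the production dependencies.
FROM node:alpine AS deps
WORKDIR /app
RUN apk --no-cache add --virtual builds-deps build-base python3
COPY package*.json ./
RUN npm ci --only=production

# 3. Final stage: just the compiled files and production node_modules.
FROM node:alpine
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY --from=builder /app/dist ./dist
CMD ["node", "dist/index.js"]
```

Because the final `FROM node:alpine` stage only copies artifacts out of the earlier stages, the build tools and their ~200 MB of data are discarded with the intermediate layers.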
And that’s it. Your image will stay low 😎