Experimenting with Docker for local development and a small EC2 deployment - Part 1
This week I worked on a side project, aiming to Dockerize a Rails app for local development, swap in Webpack for building its assets, and write scripts that would build and deploy production containers to EC2.
This started from reading more about Kubernetes, and realizing it’d probably be good to get more practice with container fundamentals first. And the best way to learn is by experimenting! So this might not be a great way to run a production system, but it was a great way to learn more about Docker, Docker Compose, Webpack, and the AWS CLI, and a particularly great way to figure out how to fit all the pieces together.
I used the open-source Student Insights project (demo) that the awesome Somerville Code For America fellows have been working on this past year. It helps teachers and administrators at a local K-8 school run more effective team meetings by presenting historical context about the student, and letting them track interventions to see how the student is progressing. If you want to learn more, you can watch an awesome talk by Uri and Alex. :)
Concretely, the goals were:
- Dockerize Rails for local development, to make setup simpler.
- Replace the asset pipeline with Webpack, to test coordinating multiple containers locally.
- Build and deploy production containers to EC2, automating as much as possible including provisioning resources.
In the process I learned quite a bit, so here’s a summary. I’ll start with local development today, then end with a sneak peek at how the rest of this week turned out, and share more about that next week.
Dockerize Rails for local development
Alex and Mari had already written awesome documentation for how to set up the Student Insights project locally. But I had just upgraded to El Capitan, and so even basic things like installing a new version of Ruby involved updating Homebrew, and that didn’t just work out of the box. I got that working, but then ran into problems building Nokogiri and then the Postgres gem, and thought Docker might come in useful here.
I already had VirtualBox, and some experience with Docker, so after installing some packages, it only took a minimal Dockerfile to get Rails running. I based it on the official Rails image on Docker Hub, which was enticingly simple.
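The first version looked something like this (a minimal sketch; the exact base image tag and paths here are illustrative, since the official Rails image installs Ruby and common build dependencies for you):

```dockerfile
FROM rails:4.2

# Naive approach: copy the whole project in, then install gems.
WORKDIR /usr/src/app
COPY . /usr/src/app
RUN bundle install

EXPOSE 3000
CMD ["rails", "server", "-b", "0.0.0.0"]
```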
I ended up modifying this a little bit to remove things I didn’t need like the MySQL client, and also to do a few other steps like run bundle install. One quirk that came up is that it takes a while to build the container image when you naively copy the Rails project into the container and then run bundle install afterward. This is because Docker caches an immutable layer after each instruction in the Dockerfile, so that it can avoid re-running unchanged steps on each build. But a COPY instruction whose contents have changed invalidates every step after it, so any change to the project code meant that Docker needed to re-run all the later steps in order to correctly build the container image. And that meant running bundle install on each code change.
After reading around a bit, I found that a quick re-ordering to COPY just the Gemfile and Gemfile.lock first means that only changes to those files invalidate the expensive bundle install step. This is what I ended up with:
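A sketch of the reordered Dockerfile (again, the base image tag and paths are illustrative):

```dockerfile
FROM rails:4.2
WORKDIR /usr/src/app

# Copy only the gem manifests first, so the bundle install layer
# is cached and only re-runs when the Gemfile or lockfile changes.
COPY Gemfile Gemfile.lock /usr/src/app/
RUN bundle install

# Now copy the rest of the project; code changes only invalidate
# the layers from this line down.
COPY . /usr/src/app

EXPOSE 3000
CMD ["rails", "server", "-b", "0.0.0.0"]
```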
This sped things up quite a bit!
The next problem I ran into was that starting Rails required a Postgres database, and there wasn’t one running yet. There’s an official Postgres container image on Docker Hub, so it was simple to spin that up. And after reading up a bit I figured out how to open ports between containers and how to work with volumes to share files.
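On the command line, wiring the two containers together looked roughly like this (container and image names here are illustrative; this uses the `--link` mechanism that was current at the time):

```shell
# Start the official Postgres container, named so we can link to it.
docker run -d --name insights-db postgres

# Start the Rails container, linked to the database, with the project
# directory mounted as a volume so code edits show up without a rebuild.
docker run -it --rm \
  --link insights-db:postgres \
  -v "$PWD":/usr/src/app \
  -p 3000:3000 \
  student-insights rails server -b 0.0.0.0
```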
But what I really wanted was a one-line command to make everything work together. At Twitter, I used the awesome Galley project (which Joan and Pete open-sourced) to do this, and I’d definitely recommend it. For this project I wanted to try something new, and was curious to try out Docker Compose (formerly fig), which helps solve just this problem for local development. I ended up with this docker-compose.yml file, which essentially takes what I was typing on the command line to mount volumes and open ports, and describes it in YAML form:
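A sketch of that docker-compose.yml, in the version-1 format Compose used at the time (service names and paths are illustrative):

```yaml
db:
  image: postgres
rails:
  build: .
  command: rails server -b 0.0.0.0
  volumes:
    - .:/usr/src/app
  ports:
    - "3000:3000"
  links:
    - db
```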
With this, running the Student Insights app was as easy as pulling the repository and running docker-compose up. The only remaining bit was the initial seed of the database, which is straightforward to do by using docker-compose run rails bash to get a shell and then running rake tasks like usual.
I ran into a final snag when trying to make this seeding operation into a one-line command. It turns out that the docker-compose.yml file sensibly treats whatever happens within the container as a black box. And so here, if the Rails process started up faster than the Postgres process, it might try to connect to port 5432 on the Postgres container before there was anything listening. It looks as if a few folks have run into this (see here, and linked issues), and the official Docker direction is that this is still a user-land problem for now. I looked at adding a script to wait for Postgres to come up, but this seemed more complicated and didn’t work right away. Since this really only affects the first-time setup, I ended up working around it and noting it in the README, suggesting folks run this as a two-step operation.
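The first-time setup I suggested looks roughly like this (the exact rake task names are illustrative, not from the project's README):

```shell
# Step one: bring up Postgres on its own and seed the database.
docker-compose up -d db
docker-compose run rails rake db:setup db:seed

# Step two: once seeded, bring everything up together.
docker-compose up
```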
That’s all for today! Next week I’ll share what I did to replace the asset pipeline with Webpack, as well as the part that was most interesting to me: experimenting with building and running production containers in EC2.
For a sneak peek of what’s ahead, check out this pull request. Keep in mind that the goal here is to experiment with a minimal deployment and learn from it, so if you have other thoughts on how to approach it differently, or can help point me to other places to learn, please let me know!