Airflow to Google App Engine and Back Again

At Bluecore, we’ve been using Airflow to simplify some of our most complex data workflows. Over the past year we’ve worked out a few of the more complicated kinks involving Airflow Operator bugs and implementation decisions. Engineers across the team have benefited from quicker iteration and more stable pipelines by creating their own flexible Docker images and controlling their own deployment, execution, and testing. But even as Airflow usage grew across the team, we realized we still had a large problem: legacy code!

Most of our application still lives on Google App Engine Standard. Moving forward, new projects are slated to run on Kubernetes, and a few existing ones are planned for migration. But a significant portion of our codebase still couldn’t be scheduled through Airflow directed acyclic graphs (DAGs). We realized we needed a way to write DAGs that could execute code on Google App Engine in order to make Airflow a useful tool for every engineer at Bluecore.

Nothing Good Ever Came Easy

At first glance, this seems like an easy problem to solve: we could just have Airflow hit the Google App Engine application with an HTTP request. We’d write some code in App Engine that would handle requests from Airflow, route the requests to existing functionality in App Engine, and return a useful response.
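The App Engine side of that first attempt can be sketched in plain Python. Everything here — the task names, the JSON shape, the handler — is illustrative rather than Bluecore’s actual API; it just shows the routing idea: Airflow names a task in its request, and the handler dispatches it to existing code and returns a response.

```python
import json

# Stand-ins for existing App Engine functionality we want to expose to Airflow.
def recompute_stats():
    return {"rows_processed": 1000}

def sync_catalog():
    return {"synced": True}

# Map the task name in the request to the function that implements it.
ROUTES = {
    "recompute_stats": recompute_stats,
    "sync_catalog": sync_catalog,
}

def handle_airflow_request(body: str) -> str:
    """Dispatch a JSON request like {"task": "recompute_stats"}."""
    payload = json.loads(body)
    task_name = payload.get("task")
    if task_name not in ROUTES:
        return json.dumps({"status": "error", "detail": "unknown task"})
    result = ROUTES[task_name]()
    return json.dumps({"status": "ok", "result": result})
```

The catch, as described below, is that this response has to come back within the request deadline.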

And this is what we did! Until we realized it didn’t work. Google App Engine instances spawned via automatic scaling have a 60-second deadline for HTTP requests. While this worked for a few tasks we wanted to trigger from Airflow, much of the existing functionality took more than 60 seconds to execute. If we instead threw the execution of these tasks onto Task Queues, we lost the ability to monitor task execution or relay any kind of return value. We wouldn’t know whether a task had executed at all, whether it had succeeded or failed, or what it had returned! Obviously, this wasn’t a workable solution for 99% of our desired workflows.

Our asynchronous setup would look like this:

Implementing Asynchronous Calls

Naming the solution was fairly easy: we needed to execute longer-running tasks asynchronously on App Engine while still monitoring their progress and allowing return values. Implementing it posed more of a challenge.

Luckily, Airflow itself provided a model for how this could be done. In general, individual tasks in Airflow do not communicate with one another: Task B does not know or use any information from Task A, aside from potential Trigger Rules. But there is a workaround. For DAGs where you need to pass information between tasks, you can use Airflow’s XComs. XComs let individual tasks in a DAG write information to a shared database, making that information available to all tasks in that DAG. We realized that if we used XComs, and wrote to this shared database from tasks being executed asynchronously in App Engine, we would be able to track task execution and read return values.
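A toy model of that mechanism, using an in-memory SQLite table in place of the Airflow database. Airflow’s real interface is `xcom_push`/`xcom_pull` on a task instance; the helpers below are simplified stand-ins that only illustrate the shared-database idea underneath.

```python
import json
import sqlite3

# In-memory stand-in for the shared Airflow database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE xcom (dag_id TEXT, task_id TEXT, key TEXT, value TEXT)"
)

def xcom_push(dag_id, task_id, key, value):
    """Write one value to the shared table, keyed by DAG and task."""
    conn.execute(
        "INSERT INTO xcom VALUES (?, ?, ?, ?)",
        (dag_id, task_id, key, json.dumps(value)),
    )

def xcom_pull(dag_id, task_id, key):
    """Read a value written by another task (or another process entirely)."""
    row = conn.execute(
        "SELECT value FROM xcom WHERE dag_id=? AND task_id=? AND key=?",
        (dag_id, task_id, key),
    ).fetchone()
    return json.loads(row[0]) if row else None

# Task A pushes a value; Task B — or code running in App Engine — pulls it.
xcom_push("example_dag", "task_a", "return_value", {"count": 42})
print(xcom_pull("example_dag", "task_a", "return_value"))  # {'count': 42}
```

The key observation for us was that nothing requires the writer to be an Airflow task: anything that can reach the database can fill in that row.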

We had to make a few tweaks to our local instance of Airflow to make this happen:

  1. We brought our Airflow database into Google Cloud Platform by hosting it on CloudSQL. This step allowed us to read and write to the Airflow database from Kubernetes, which hosts our Airflow instance, and from Google App Engine, where we wanted to execute code.
  2. We wrote a wrapper around the existing Airflow request handler living in our Google App Engine code. This wrapper would write the progress and result of an individual task being executed in App Engine to the new Airflow database. Then, Airflow, instead of waiting for an HTTP response like it did previously, would poll the Airflow database.
  3. We created an App Engine Airflow Operator to be used in any Airflow DAG that wants to execute code in App Engine. This abstracted the XCom complexity away from the average Airflow developer.
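Step 2 — the wrapper around the App Engine request handler — can be sketched as follows. The shared store is a plain dict here; in our setup it would be the XCom table in the CloudSQL-hosted Airflow database, and all names are hypothetical.

```python
import traceback

# Stand-in for the shared Airflow database that Airflow will poll.
task_status = {}

def run_with_status(run_id, func, *args, **kwargs):
    """Execute func, recording its progress and result under run_id."""
    task_status[run_id] = {"state": "running"}
    try:
        result = func(*args, **kwargs)
    except Exception:
        # Record the failure so Airflow can mark the task failed.
        task_status[run_id] = {
            "state": "failed",
            "error": traceback.format_exc(),
        }
        raise
    task_status[run_id] = {"state": "success", "result": result}
    return result
```

Because the status row is written as a side effect of execution, the App Engine task itself never has to hold an HTTP connection open past the 60-second deadline.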

Putting all of this together, hitting Google App Engine from Airflow now looks like:
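In code, the operator’s side of that flow boils down to trigger-then-poll. This sketch injects the trigger and status-fetch steps as callables so the control flow is visible without App Engine or a live Airflow database; the function and parameter names are hypothetical.

```python
import time

def run_app_engine_task(trigger, fetch_status, poke_interval=1.0, timeout=60.0):
    """Trigger the task, then poll fetch_status() until a terminal state."""
    trigger()  # e.g. an HTTP POST to the App Engine handler
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()  # e.g. read the status row from CloudSQL
        if status and status["state"] == "success":
            return status.get("result")  # surfaced to the DAG as an XCom
        if status and status["state"] == "failed":
            raise RuntimeError("App Engine task failed: %s" % status.get("error"))
        time.sleep(poke_interval)
    raise TimeoutError("task did not finish before the timeout")
```

Wrapping this loop in an operator is what lets a DAG author treat a long-running App Engine task like any other Airflow task: it succeeds, fails, or times out, and its return value is available downstream.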


With the new and improved (read: working) App Engine Airflow Operator, we can now leverage the full functionality of our App Engine code from Airflow without excessive code replication or dependency wrangling. A significant portion of our workflows that will live in App Engine for the foreseeable future can now be executed via Airflow. For now, we don’t see any limits to what we can do at Bluecore using Airflow!

Interested in working with us? Check out our careers page here: