Making Airflow Pods Use a Private Google Cloud SQL Connection

Alexa Griffith
Apr 28, 2020

Bluecore’s Data Science team uses Airflow for our model workflows. Until recently, our Airflow pods ran a Cloud SQL proxy as a sidecar container to handle connections to the Airflow metadata database, which we query for things like task state and XComs. Google recently added support for connecting to Cloud SQL privately over a VPC, so we decided to remove the proxy and use a private connection instead, which is more secure and saves resources. As a result, Cloud SQL connections are no longer made through localhost, and we no longer create a sidecar container just for them.

[Figure: the web pod with the Cloud SQL proxy sidecar (top left) versus the web pod using the private IP (bottom right)]

XComs

XComs allow “cross-communication” of information between task instances. You can either “push” or “pull” an XCom.

from airflow.operators.python_operator import PythonOperator

def get_result_one(**kwargs):
    # code to calculate some result
    result = ...
    kwargs['ti'].xcom_push(key='some_key', value=result)

operator_1 = PythonOperator(
    task_id='task1',
    python_callable=get_result_one,
    provide_context=True,
    dag=dag,  # assumes a DAG object defined earlier in the file
)

def get_result_two(task_ids, **kwargs):
    task1_result = kwargs['ti'].xcom_pull(key='some_key',
                                          task_ids=task_ids)
    # do something with the result

operator_2 = PythonOperator(
    task_id='task2',
    op_args=['task1'],
    python_callable=get_result_two,
    provide_context=True,
    dag=dag,
)

operator_2.set_upstream(operator_1)

In this simple example, we have two Python operators. The first pushes its result as an XCom, and the second pulls that result and uses it in its own code. XComs are useful for splitting up work, pulling results from other tasks, and sharing state.

Cloud SQL proxy

Now that we know why the Google Cloud SQL connection is important for Airflow features such as XComs, let’s get into how we implemented the connection using a private IP. In our Airflow configuration, we defined a Google Cloud SQL proxy image, project, database instance, and region. Our database credentials are saved as a Kubernetes secret and mounted via YAML. Our web, scheduler, and worker pods each spin up a Cloud SQL proxy container defined in the YAML.

For example, in the deployment YAML, we set up a Kubernetes web pod containing the webserver, GitHub synchronization, and Cloud SQL proxy containers.

# deployment YAML (abridged)
containers:
  - name: webserver
    # webserver container information
  - name: github-sync
    # GitHub synchronization container information
  - name: cloudsql-proxy
    image: cloudsql-proxy-image
    command: ["/cloud_sql_proxy",
              "-instances=our-instance",
              "-credential_file=/path/to/cred/file"]
    volumeMounts:
      - name: airflow-cloudsql-instance-credentials
        mountPath: /path/to/secret
        readOnly: true
    resources:
      requests:
        memory: "100Mi"
        cpu: "100m"

Notice that in the pseudo-code of the deployment YAML, we allocated 100 millicores, or 10% of a core, to the Cloud SQL proxy; this is the smallest allocation we make to a container. Bluecore is charged by the number of core hours used, so with 100m reserved, we pay for this container whether or not it is ever making a Cloud SQL connection. Luckily, we can avoid that cost by getting rid of the proxy.
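To put a rough number on it (the pod count here is hypothetical; yours will vary with how many pods carry the sidecar):

# Back-of-the-envelope: core-hours reserved by idle proxy sidecars.
proxy_cpu_cores = 0.1    # the 100m request from the YAML above
pods_with_proxy = 3      # e.g. web, scheduler, worker (hypothetical count)
hours_per_month = 24 * 30

core_hours = proxy_cpu_cores * pods_with_proxy * hours_per_month
print(f"{core_hours:.0f} core-hours/month reserved for the proxy alone")  # 216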

Configuring a private IP with Cloud SQL

In the “SQL” tab of the Google Cloud Platform console, you can see all the information about your Cloud SQL instances. Since our proxy was already running against an existing instance, the initial setup steps weren’t necessary. We enabled the private IP connection in our Cloud SQL instance settings, created and downloaded the SSL certificates, and added them as a Kubernetes secret. Next, we need to:

  • add the private host to the Airflow config
  • mount the SSL secrets in each of the pods that used the proxy
  • replace any environment variables that use localhost
  • remove the proxy configs that are no longer needed

Our YAML contained an environment variable that set our SQLAlchemy connection string. We needed to change the hard-coded localhost to our private IP, and since we have different environments, we made the IP configurable.

- name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
  value: "mysql://$(AIRFLOW_MYSQL_USERNAME):$(AIRFLOW_MYSQL_PASSWORD)@127.0.0.1/{{ airflow.db.name }}"

The value changed to

value: "mysql://$(AIRFLOW_MYSQL_USERNAME):$(AIRFLOW_MYSQL_PASSWORD)@{{ airflow.db.host }}/{{ airflow.db.name }}"
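To make the change concrete, here is the shape of the connection URI before and after (the host, credentials, and database name below are made-up):

from sqlalchemy.engine.url import make_url

old = make_url("mysql://airflow:pw@127.0.0.1/airflow_db")  # via the proxy on localhost
new = make_url("mysql://airflow:pw@10.0.0.5/airflow_db")   # hypothetical private IP
print(old.host, "->", new.host)  # 127.0.0.1 -> 10.0.0.5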

In our Airflow repo’s settings.py, I added support for SSL configs when making the SQL connection with the create_engine method.

engine_args['connect_args'] = {'ssl': {'ca': '/path/to/ca',
                                       'cert': '/path/to/cert',
                                       'key': '/path/to/key'}}
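For context, here is roughly where that lives; this is a simplified sketch of Airflow’s configure_orm in settings.py, with placeholder cert paths:

from airflow.configuration import conf
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

SQL_ALCHEMY_CONN = conf.get('core', 'SQL_ALCHEMY_CONN')

def configure_orm():
    global engine, Session
    engine_args = {}
    # pass the SSL certs through to the MySQL driver
    engine_args['connect_args'] = {'ssl': {'ca': '/path/to/ca',
                                           'cert': '/path/to/cert',
                                           'key': '/path/to/key'}}
    engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
    Session = scoped_session(sessionmaker(autocommit=False,
                                          autoflush=False,
                                          bind=engine))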

Once the private IP and SSL secrets are added and the proxy configs are removed, it is time to deploy!

Note: You can also make use of the private host in your operators. Bluecore wrote its own Kubernetes Job Operator, which previously spun up a Cloud SQL proxy container as well, and we have written our own XCom helper methods. For these, we went through the same process to implement the private IP: updating the variables and adding the SSL parameters.

Debugging

We deployed to our QA environment to test our new connection out. I found it helpful to open a shell in the pod using

kubectl exec -it {pod_name} bash 

Once inside the pod, we need to check that the secret volumes were mounted at the paths set in the YAML file and then try to run a query over the new connection.
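A quick, minimal way to check the mounts (cert paths are placeholders for wherever your secret is mounted):

import os

for path in ('/path/to/ca', '/path/to/cert', '/path/to/key'):
    print(path, 'exists' if os.path.exists(path) else 'MISSING')

With the files in place, try a query through the new connection: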

import os
from airflow.models import DagRun
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

e = create_engine(os.environ['AIRFLOW__CORE__SQL_ALCHEMY_CONN'],
                  connect_args={'ssl': {'ca': '/path/to/ca', 'cert': '/path/to/cert', 'key': '/path/to/key'}})
s = scoped_session(sessionmaker(autocommit=False, autoflush=False, bind=e))
s.query(DagRun).all()

This test proves that everything is set up correctly in the pod. If there is a connection error, make sure that all SQL connections are using the SSL parameters in the code.

For our Celery result backend, we use SQLAlchemy. There are other options for the result backend, such as Redis and Cassandra, which may not have the same issues we ran into with SQLAlchemy. In our Celery config file, we added the SSL parameters under the key database_engine_options.

'database_engine_options': {
    'connect_args': {
        'ssl': {'ca': '/path/to/ca',
                'cert': '/path/to/cert',
                'key': '/path/to/key'}
    }
}
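For reference, here is a sketch of how that entry can sit in a custom Celery config module that Airflow’s celery_config_options setting points to; the surrounding structure is our assumption, and only database_engine_options comes from the snippet above:

from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG

# Start from Airflow's defaults and add SSL for the SQLAlchemy result backend.
CELERY_CONFIG = {
    **DEFAULT_CELERY_CONFIG,
    'database_engine_options': {
        'connect_args': {
            'ssl': {'ca': '/path/to/ca',
                    'cert': '/path/to/cert',
                    'key': '/path/to/key'}
        }
    },
}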

Connection errors persisted after adding the SSL parameters. It turns out the Celery code didn’t pass this connection’s SSL parameters through to the result backend. The good news is that we filed an issue and it has been resolved, so the fix should be in the next Celery version! For other result backend databases, the Celery documentation describes how to add SSL parameters.

Summary

  • Leveraging the Google Cloud SQL private IP could save money and resources
  • To configure the private connection, check the SQL database settings in GCP and update the Airflow configuration
  • Confirm everything works as expected by logging into the pod and checking the logs
  • If you use the SQL connection for the Celery result backend, be aware that the SSL parameters will only take effect in the next Celery version

Read more about what we do at Bluecore!
