Tip #18: Load and use JSON data in your workflow from GCS

Following up on the article about writing and reading JSON files in cloud storage buckets, we saw that we could access the data of a JSON file and use it in our workflow. Let’s have a look at a concrete use of this mechanism.

Today, we’ll take advantage of this mechanism to avoid hard-coding the URLs of the APIs we call from our workflow, which makes the workflow more portable across environments.

Let’s group the logic for reading and loading the JSON data into a reusable subworkflow:

read_env_from_gcs:
    params: [bucket, object]
    steps:
    - read_from_gcs:
        call: http.get
        args:
            url: ${"https://storage.googleapis.com/download/storage/v1/b/" + bucket + "/o/" + object}
            auth:
                type: OAuth2
            query:
                alt: media
        result: env_file_json_content
    - return_content:
        return: ${env_file_json_content.body}

You call this subworkflow with two parameters: the bucket name, and the name of the object or file you want to load. (Note that the object name is inserted into the request path, so if it contains slashes, they must be URL-encoded.)

Now let’s use it from the main workflow, which lives in the same workflow definition file. We need a first step that calls the subworkflow to load a specific file from a specific bucket. The step below stores the content of the JSON data in the env_details variable.

main:
    params: [input]
    steps:
    - load_env_details:
        call: read_env_from_gcs
        args:
            bucket: workflow_environment_info
            object: env-info.json
        result: env_details

Imagine the JSON file contains a JSON object with a SERVICE_URL key, pointing at the URL of a service.
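For example, the env-info.json file could look like this (the Cloud Run-style URL is just a made-up placeholder):

{
    "SERVICE_URL": "https://my-service-abcd1234-ew.a.run.app"
}

You can then call that service with the expression ${env_details.SERVICE_URL}, as shown below.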

    - call_service:
        call: http.get
        args:
            url: ${env_details.SERVICE_URL}
        result: service_result
    - return_result:
        return: ${service_result.body}

This is great for avoiding hard-coded values in your workflow definitions. However, for truly environment-specific deployments, this is not yet ideal: you would have to point to a different file in the bucket, or use a different bucket altogether, and that information is itself hard-coded in the definition where you call the subworkflow. But if you follow naming conventions for project and bucket names that map to your environments (e.g. PROD_bucket vs. DEV_bucket, or PROD-env-info.json vs. DEV-env-info.json), this can work!
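Another option, sketched below, is to pass the bucket and object names as runtime arguments when you create an execution, instead of hard-coding them in the definition. This sketch assumes the execution is launched with a JSON argument like {"bucket": "workflow_environment_info", "object": "env-info.json"}:

main:
    params: [input]
    steps:
    - load_env_details:
        call: read_env_from_gcs
        args:
            # the bucket and object names now come from the runtime
            # arguments passed when the execution is created
            bucket: ${input.bucket}
            object: ${input.object}
        result: env_details

Either way, some environment-specific detail still has to live outside the workflow definition itself.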

Let’s wait for support for environment variables in Workflows!

Originally published at https://glaforge.appspot.com.
