Executing bash scripts with a webhook in Google Cloud

Anshu Rao
Google Cloud - Community
4 min read · Apr 27, 2020

I recently came across an interesting question about automation in GCP: How can I trigger a bash script off a webhook? It’s a seemingly common automation challenge, but the solution wasn’t completely obvious.

This gets especially tricky when you have a more complex bash script with certain binary dependencies. A really good example of this problem is running a script to scan your Google Cloud organization for publicly available functions, a challenge that Thomas Ruble tackled in his Medium article. It's an interesting read; check it out if you haven't already!

Note: This bash script is from Thomas Ruble’s Medium article on scanning public cloud functions.
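The original script isn't reproduced here, but as a rough sketch of what such a scanner could look like, the following loops over every project in an organization and flags functions that are invocable by `allUsers`. This is not Thomas Ruble's original code; the gcloud flags, the resource-name parsing, and the `ORGANIZATION` variable are best-effort assumptions.

```shell
#!/bin/bash
# Sketch of a public-function scanner (illustrative, not the original).
set -euo pipefail

# ORGANIZATION is supplied by the caller, e.g. as an environment variable.
for project in $(gcloud projects list \
    --filter="parent.id=${ORGANIZATION}" \
    --format="value(projectId)"); do
  # Function names come back as projects/P/locations/REGION/functions/NAME.
  for fn in $(gcloud functions list --project "${project}" \
      --format="value(name)"); do
    region=$(cut -d/ -f4 <<< "${fn}")
    name=$(cut -d/ -f6 <<< "${fn}")
    # An IAM binding for allUsers means the function is publicly invocable.
    if gcloud functions get-iam-policy "${name}" --project "${project}" \
        --region "${region}" --format=json | grep -q '"allUsers"'; then
      echo "PUBLIC: ${project}/${region}/${name}"
    fi
  done
done
```

Note that the script leans entirely on the `gcloud` CLI — exactly the kind of binary dependency that makes this hard to run in a managed runtime.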

Now what if I wanted to take Thomas’ scripted solution and trigger its execution using a webhook? Essentially the end-to-end solution would be able to send an HTTP request with the necessary script arguments in the payload, execute the script, and return its results in a json response. This will open the door for more automation, service-to-service integrations, and pipeline creation.

Now that we better understand the challenge we’re aiming to solve, let’s jump right in!

Can we run bash scripts in Cloud Functions?

At first glance, Google Cloud Functions seems like the perfect fit for this use case. If you’re not familiar, Google Cloud Functions is a lightweight, serverless platform for event-driven functions with a fully-managed runtime environment in which your code is executed. We can setup a Cloud Function to respond to an HTTP trigger to run our code. Currently Cloud Functions supports the following runtimes:

  • Node.js
  • Python
  • Go

It seems like we could just hand our script to a Cloud Function and we'd be good to go, but as you can see, bash is not an available runtime. The next logical step seems to be to open a subprocess in a Cloud Function in order to execute our bash commands. Although this approach would work, the downside is that the system packages available in the runtime environment are limited by design, so we can't run any bash commands that depend on custom binaries.
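To make the limitation concrete, here's a sketch of the subprocess approach inside a Python Cloud Function (the function name and command are illustrative). It works for binaries already present in the managed runtime image, but fails the moment the script needs a tool the image doesn't ship:

```python
import subprocess

def run_bash(request):
    """HTTP Cloud Function that shells out to bash.

    This only works for binaries already installed in the managed
    runtime; a custom dependency like the gcloud CLI would make
    subprocess raise FileNotFoundError here.
    """
    result = subprocess.run(
        ["bash", "-c", "echo hello from bash"],
        capture_output=True, text=True)
    return result.stdout
```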

What about Cloud Build?

Google Cloud Build is a continuous build, test, and deploy service that executes on Google infrastructure. With Cloud Build, you can define custom build steps in order to execute your build, with each step taking place in a docker container. By writing your own build step, you can package your dependencies in a single container and run your shell script. The Cloud Build documentation has a good example of this use case.

But what about the webhook? Although it's possible to invoke a Cloud Build pipeline with a call to the Cloud Build API, it's not a good fit for our use case. Cloud Build is clunky for running a one-off script, since its semantics are tied to source code and artifact production. The API call also doesn't offer a clean way to pass parameters to our bash script. In any case, Cloud Build isn't the best answer to our problem. This is where Cloud Run comes in and saves the day!

Cloud Run

Google’s Cloud Run service is another one of Google Cloud’s serverless offerings, and is the best fit solution for our challenge. Cloud Run is a fully-managed compute platform that runs your stateless containers that speak HTTP. Simply provide Cloud Run with your container and Cloud Run will listen for requests or events on port 8080. Read up on Cloud Run concepts and tutorials in the Cloud Run documentation.

Bash scripts in Cloud Run

Creating a web server to listen to HTTP requests

Our first step is to set up a simple web server to listen to the HTTP requests we receive on port 8080. Since our application will be containerized, we can write our server in our favorite language. In this example, I’ve elected to create a lightweight Python server using Flask. Using Flask, we can create a route for our application to handle HTTP POST methods.

Next, we can add logic in the method we just created to parse and clean the request body for any arguments our bash script requires. We can expose these arguments to the script as environment variables via os.environ, which the bash script can then read.

Note: the Flask server must listen on the PORT environment variable as specified by the container contract.
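Putting these pieces together, a minimal Flask server might look like the sketch below. The script path `./scan.sh`, the `organization` payload field, and the `ORGANIZATION` variable name are illustrative assumptions, not taken from the original article.

```python
import os
import subprocess

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def run_script():
    # Parse and clean the request body for the script's arguments.
    body = request.get_json(silent=True) or {}
    org_id = str(body.get("organization", "")).strip()

    # Pass the argument to the bash script as an environment variable.
    env = dict(os.environ, ORGANIZATION=org_id)

    # Execute the script and capture its output.
    result = subprocess.run(
        ["bash", "./scan.sh"], env=env, capture_output=True, text=True)

    # Return the script's results as a JSON response.
    return jsonify({
        "returncode": result.returncode,
        "stdout": result.stdout,
        "stderr": result.stderr,
    })

# In the container, serve on the port given by the PORT environment
# variable (per the container contract), e.g. with gunicorn:
#   gunicorn --bind :$PORT app:app
```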

Containerizing and Deploying the Application

Now that we've written our web server which executes the bash script, our next step is to containerize our application. Remember that Cloud Run manages stateless containers, so we need to create a Dockerfile with instructions on how the container should be built. Here we can include the binaries we need in the container image so that they're available to the bash script. Since Cloud Run accepts container images as the deployment unit, we can add any executables or system libraries to the image and use them in our application. In other words, we can package all of our bash script's dependencies and ship everything off to Cloud Run. In this case, we'll include the Google Cloud SDK in our environment.
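As a sketch, the Dockerfile might look like the following. The base image, file names, SDK install method, and gunicorn entrypoint are illustrative choices, not from the original article.

```dockerfile
# Start from a slim Python image and add the Google Cloud SDK,
# which the bash script depends on.
FROM python:3.8-slim

RUN apt-get update && apt-get install -y curl bash \
    && curl -sSL https://sdk.cloud.google.com | bash -s -- --disable-prompts \
    && rm -rf /var/lib/apt/lists/*
ENV PATH="/root/google-cloud-sdk/bin:${PATH}"

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Honor the container contract: serve on $PORT (Cloud Run defaults to 8080).
CMD exec gunicorn --bind :${PORT:-8080} --workers 1 --threads 8 app:app
```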

Now that we've containerized our application, we can build our container, push it to GCR, and deploy it to Cloud Run. We'll also configure the service to prevent unauthenticated access, so that we can control who can invoke it using IAM policies. These two gcloud commands take care of these steps for us:

gcloud builds submit --tag gcr.io/$PROJECT_ID/$IMAGE
gcloud run deploy --image gcr.io/$PROJECT_ID/$IMAGE --platform managed --no-allow-unauthenticated --region us-central1 ${SERVICE_NAME}
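Once deployed, the webhook can be invoked with an authenticated POST request, for example using an identity token from gcloud. The payload field here is a placeholder matching whatever arguments your script expects:

```shell
# Grab the service URL, then call it with an identity token that
# satisfies the --no-allow-unauthenticated IAM check.
URL=$(gcloud run services describe ${SERVICE_NAME} \
    --platform managed --region us-central1 --format "value(status.url)")

curl -X POST "${URL}" \
    -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
    -H "Content-Type: application/json" \
    -d '{"organization": "123456789"}'
```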

That about wraps it up! We've now successfully set up a webhook to execute our function scanner.
