Deploy Your Cloud Functions with Bluemix DevOps

Collaborate on GitHub, and automatically deploy when code hits master

--

Editor’s note: This article is for experienced Bluemix users who are collaborating with colleagues using GitHub. Some knowledge of DevOps services is assumed.

The serverless revolution is well and truly here, but the tools and practices around it are still maturing. Today I’ll share how I’m using a stage in my Bluemix Continuous Delivery Pipeline to handle deploying my IBM Cloud Functions actions when the repository where the actions live changes. This automation brings my Cloud Functions into line with the rest of my applications, by deploying them automatically and repeatably whenever changes make it into the master branch.

Prep

There’s some sample code you can grab from GitHub to try this example yourself. Fork the repo and clone it to your computer: https://github.com/ibm-watson-data-lab/deployable-cloud-function
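For example (the URL below assumes you’ve forked the repo to your own GitHub account; substitute your username):

    git clone https://github.com/YOUR_USERNAME/deployable-cloud-function.git
    cd deployable-cloud-function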

To deploy this from my laptop, I use the Cloud Functions plugin for the bx command line tool (see the OpenWhisk CLI docs for more detailed setup instructions). To install the plugin:
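At the time of writing, the install command looks something like this (check the docs linked above in case the plugin name or repository has changed):

    bx plugin install Cloud-Functions -r Bluemix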

Before I deploy the action, I create a package to keep my actions in. This helps keep things tidy as the number of actions grows.
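A minimal sketch, assuming a package named demo (the name is arbitrary; pick your own):

    bx wsk package create demo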

To prepare for deployment, the JavaScript code is zipped up and the zip file is used to create the action. For a single JavaScript file there’s no need to zip it, but it’s needed once an application involves more than one file, so I’ve included this step from the start. The command here updates the action, but IBM Cloud Functions knows to create it instead if the action doesn’t exist yet. That means this same pair of commands works both for the first deployment and for every later update when the code changes.
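Here’s roughly what that pair of commands looks like; the file, package, and action names are placeholders, so adjust them to match your own setup:

    # zip the action code (one index.js here, but more files can be added later)
    zip -r action.zip index.js
    # create or update the action from the zip; --kind names the runtime and is
    # required when deploying from a zip file
    bx wsk action update demo/greeting action.zip --kind nodejs:8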

Check your action is working correctly by invoking it from the command line like this:
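Using the placeholder names from above, that’s:

    bx wsk action invoke demo/greeting --blocking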

Hopefully you’ll see some output including a response field whose result greets you. If not, I’d recommend re-checking the steps above until you see a healthy outcome.

Automate deployment

The flaw in this plan is that deployments from a local machine can be difficult to repeat, whether by you or by other people on the team. If an access token or some database credentials are required, for example, it’s bad form to share them by committing the strings to the source control system. Instead, those values can be configured in a continuous deployment tool that’s set up to deploy with them whenever new code is pushed to the repo.

Set up the toolchain and repository integration

From the Bluemix Dashboard, choose DevOps from the left-hand menu, then choose Create A Toolchain. Look for the Build Your Own Toolchain option at the bottom, give the new toolchain a name, and click “Create”. There are a few pieces needed here, so I’ll show you each one in turn.

First, choose Add a Tool. Since the idea is to deploy when code is pushed to git, the first thing to add is a GitHub integration. (There is support for git repositories in other locations; adjust as appropriate.) This will prompt you to authorise a GitHub account, and then to give the URL of the repo you forked earlier. Once this is done, an entry for your GitHub repo (and another for its issues, if you ticked that box) will appear in the toolchain, and we’ll then add the deployment piece that will respond to changes on our GitHub repo.

The GitHub tool integration was added to the Delivery Pipeline, monitoring the repository for changes.

Configure the deployment

Next, choose Add a Tool again to put a new entry into the pipeline, and then choose Delivery Pipeline for the tool to add.

Finding the Delivery Pipeline tool integration.

As with the other components, give this new tool a name (mine is called “Pipeline” because naming things is hard) and create it.

Once the delivery pipeline tool is in place, click on it and then go ahead and “Add Stage”. The three tabs at the top (Input, Environment Properties, and Jobs) are how the deployment activity itself is set up. Here’s a quick overview:

  • Input: Simply contains which repo and branch to use, and whether to run the deployment manually. These default settings are usually correct, but change them as appropriate for your use case.
  • Environment properties: Variables we can use in our setup scripts. It’s possible to add both plain text and “secret” fields (displayed as asterisks, like a password field). Set the values you will need here, such as database credentials, access tokens, and so on. This must include a Bluemix API key, which can be generated with bx iam api-key-create keyname (see the example after this list). Mine is in a variable called (very originally) APIKEY.
  • Jobs: Where the real work gets done. We’ll use a single job for this example. Choose ADD JOB, job type Deploy and then keep the default Cloud Foundry type. Check that the account, organization, and space information looks correct, then we’ll get to the interesting bit: the deploy script.
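For reference, generating the API key from your own machine looks like this (the key name is arbitrary; the command prints the key value, which you can then paste into a secret field):

    bx iam api-key-create cloud-functions-deploy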

The basic idea is that we need to grab the Cloud Functions plugin for the bx command (find out more about the Bluemix Commandline Tool in the docs), then log in with the API key we configured earlier and target the desired Bluemix organization and space. My script looks like this:
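(The API endpoint, organization, space, and action names below are examples from my setup; substitute your own values.)

    # install the Cloud Functions plugin so that bx wsk is available
    bx plugin install Cloud-Functions -r Bluemix
    # log in non-interactively using the API key from the environment properties
    bx login -a api.ng.bluemix.net --apikey $APIKEY
    bx target -o myorg -s dev
    # smoke test: the built-in echo action confirms Cloud Functions is reachable
    bx wsk action invoke /whisk.system/utils/echo -p message hello --blocking
    # zip the code and create/update the action
    zip -r action.zip index.js
    bx wsk action update demo/greeting action.zip --kind nodejs:8
    # confirm the action is listed as expected
    bx wsk action list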

After adding the Cloud Functions plugin to the Bluemix Commandline Tool, this script logs us in using the API key we created when configuring the deployment tool. Invoking the built-in /whisk.system/utils/echo action shows some output in the logs if everything is configured correctly to work with Cloud Functions, or causes a (hopefully helpful and informative) error if that’s not the case. The action update command does the actual deployment, taking the newly zipped file and deploying it as an action. The final call to action list simply confirms that the action is there as expected.

Check that everything works as expected by pressing the green “play” button on this task. If it does, you’re all set!

The delivery pipeline deploys the updated Cloud Function every time a change is pushed to the GitHub repository.

Deploy on git push

This should be the easy part if everything worked when the play button was pressed earlier. Test that the “wiring” to GitHub is working by making a change in the repository (on this one occasion, it is acceptable to commit straight to master!). Then watch to see whether the stage runs and pushes the new code live.
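For example (an empty commit is enough to trigger the pipeline):

    git commit --allow-empty -m "Test the deployment pipeline"
    git push origin master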

And there you have it — automated deployment of your IBM Cloud Functions when you push to their repository!

--

Lorna Mitchell
Center for Open Source Data and AI Technologies

Polyglot programmer, technology addict, open source fanatic and incurable blogger (see http://lornajane.net)