Notifications on Azure Machine Learning Pipelines with Logic Apps

Christian Schultz · Published in Geek Culture · 5 min read · Apr 28, 2021

Microsoft Azure Machine Learning is a solid offering from Microsoft for improving how organizations operate Machine Learning. It is the Microsoft equivalent of Kubeflow, and it allows easier governance, deployment and maintenance of Machine Learning workloads. In short, it makes it easier for an organization to properly employ MLOps. However, the current incarnation of Azure Machine Learning is not very well documented, presumably because it is under intense development with frequent releases of new features. A particularly interesting new feature, currently in preview, is the integration between Azure Logic Apps and Azure Machine Learning. This short article demonstrates how to send a message to a Microsoft Teams channel if a pipeline run fails. You can find more information here: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-use-event-grid

Designing a Logic App

Note that I had to manually enable the “Microsoft.Web” and “Microsoft.EventGrid” resource providers under my subscription, which is an MSDN subscription I use for testing. Your mileage may vary depending on your specific subscription configuration. You can enable resource providers under the “Resource providers” tab of your subscription details.
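If you prefer to register the providers from code instead of the portal, a minimal sketch with the Azure Python management SDK could look like the following (assuming azure-identity and azure-mgmt-resource are installed and the subscription ID is a placeholder you fill in):

# Sketch: register the resource providers needed for the Event Grid integration.
# Assumes DefaultAzureCredential can authenticate (e.g. after `az login`).
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

for namespace in ("Microsoft.Web", "Microsoft.EventGrid"):
    client.providers.register(namespace)
    print(f"Requested registration of {namespace}")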

First, go to your workspace’s “Events” menu:

The “Events” menu. From here you will create a Logic App instance. For fine-tuned control you can add an “Event subscription” from the button and manually set up an Event Grid subscription. For our purposes here, it is sufficient to let Azure fill in the blanks.

Click on the Logic App icon. A Logic App is a simple cloud-based event handler that comes with a nice GUI. It consumes an event emitted by an Azure service, processes it and triggers some action such as sending an email, uploading a file or similar. In our case, the Logic App will catch an Azure Machine Learning event of the type Microsoft.MachineLearningServices.RunStatusChanged, parse the information provided in this event and send a message in Microsoft Teams. Clicking the Logic Apps icon from our ML workspace brings you to the Logic Apps Designer. First you have to define the message broker. Here we will use Event Grid. Fortunately, the Logic Apps Designer defaults to this, so there is no need to set one up ourselves. We simply sign in as seen in the screenshot below.

Connecting to an Event Broker. The Designer is clever enough to provision an Event Grid subscription for us, so there is little customization here. For fine-tuned control, we would have to manually add an Event Subscription as mentioned above.
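Should you want that fine-tuned control, the subscription can also be created programmatically. Below is a sketch using the azure-mgmt-eventgrid package; the resource IDs, the subscription name and the webhook endpoint URL are placeholders, and the exact model classes may differ between SDK versions.

# Sketch: manually create an Event Grid subscription on the ML workspace that
# forwards RunStatusChanged events to a webhook (e.g. the Logic App's HTTP trigger URL).
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    EventSubscription,
    EventSubscriptionFilter,
    WebHookEventSubscriptionDestination,
)

subscription_id = "<your-subscription-id>"  # placeholder
workspace_scope = (
    "/subscriptions/<your-subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.MachineLearningServices/workspaces/<workspace-name>"
)

client = EventGridManagementClient(DefaultAzureCredential(), subscription_id)
client.event_subscriptions.begin_create_or_update(
    scope=workspace_scope,
    event_subscription_name="ml-run-status-changed",  # placeholder name
    event_subscription_info=EventSubscription(
        destination=WebHookEventSubscriptionDestination(
            endpoint_url="<logic-app-trigger-url>"  # placeholder
        ),
        filter=EventSubscriptionFilter(
            included_event_types=["Microsoft.MachineLearningServices.RunStatusChanged"]
        ),
    ),
).result()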

Signing in brings us to the important part: designing the logic for how to treat the events sent by Azure Machine Learning. The first thing we have to do is configure what type of events we want to process. This can be seen in the screenshot below. We want to target the resource type Workspaces since we are working with Azure Machine Learning, and the event type RunStatusChanged since we want to trigger an action on a failed pipeline. You can find the “Resource Name” under the “Properties” panel of your Machine Learning workspace, where it is listed as “Resource ID”.

Designing the logic flow step 1: Defining the type of events to process.

Before we refine the logic further, it is a good idea to inspect a RunStatusChanged event. The JSON serialization of such an event can be seen below.

{
  "headers": {
    "Connection": "close",
    "Accept-Encoding": "gzip,deflate",
    "Host": "prod-47.northeurope.logic.azure.com",
    "aeg-subscription-name": "XXX",
    "aeg-delivery-count": "0",
    "aeg-data-version": "2",
    "aeg-metadata-version": "1",
    "aeg-event-type": "Notification",
    "Content-Length": "983",
    "Content-Type": "application/json; charset=utf-8"
  },
  "body": {
    "topic": "/subscriptions/XXX/resourceGroups/Logic_app_test/providers/Microsoft.MachineLearningServices/workspaces/my_workspace",
    "eventType": "Microsoft.MachineLearningServices.RunStatusChanged",
    "subject": "experiments/a4badd35-1c73-4322-822a-a097442cedd4/runs/2a7b95bb-9a26-442e-8de7-93f451c8019f",
    "id": "1280a0fd-90dc-52d2-b26a-8baf5e268b26",
    "data": {
      "runStatus": "Completed",
      "experimentId": "a4badd35-1c73-4322-822a-a097442cedd4",
      "experimentName": "My-pipeline",
      "runId": "2a7b95bb-9a26-442e-8de7-93f451c8019f",
      "runType": "azureml.PipelineRun",
      "runTags": {
        "azureml.pipelineid": "47a6a7c1-3adb-4ad1-930f-b85d9e740ba7",
        "azureml.pipelineComponent": "pipelinerun"
      },
      "runProperties": {
        "azureml.runsource": "azureml.PipelineRun",
        "runSource": "Unavailable",
        "runType": "Schedule",
        "azureml.parameters": "{}",
        "azureml.pipelineid": "47a6a7c1-3adb-4ad1-930f-b85d9e740ba7"
      }
    },
    "dataVersion": "2",
    "metadataVersion": "1",
    "eventTime": "2021-04-28T04:33:45.5390469Z"
  }
}

The important part here is the body. The “data” field in the body contains the information we need. Specifically, we need to parse the event JSON and trigger an action when three criteria are fulfilled: runStatus is Failed, experimentName is My-pipeline since we only want to trigger on a specific pipeline, and runType is azureml.PipelineRun. Without this last condition, our Logic App would be triggered multiple times, since each individual step in a pipeline also emits an event. Note that the example provided above comes from a scheduled run, as seen in the runType field under runProperties. We could thus further specify only to trigger on scheduled runs and not on one-off runs.
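To make the condition concrete, here is the same filtering logic expressed as plain Python against the event body shown above. This is only an illustration of the three criteria; the actual check is configured in the Logic Apps Designer in the following steps.

# Illustration only: the three criteria applied to the deserialized event body.
def should_notify(event_body: dict) -> bool:
    data = event_body["data"]
    return (
        data["runStatus"] == "Failed"
        and data["experimentName"] == "My-pipeline"
        and data["runType"] == "azureml.PipelineRun"
    )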

The next step is to choose a “Parse JSON” operation from the “Data Operations” category. We want to work directly on the data field, so we define this in the content field. We also need to specify the expected schema; for convenience I have pasted it here:

{
  "properties": {
    "experimentId": {
      "type": "string"
    },
    "experimentName": {
      "type": "string"
    },
    "runId": {
      "type": "string"
    },
    "runProperties": {
      "properties": {
        "_azureml.ComputeTargetType": {
          "type": "string"
        },
        "azureml.moduleid": {
          "type": "string"
        },
        "azureml.nodeid": {
          "type": "string"
        },
        "azureml.pipelinerunid": {
          "type": "string"
        },
        "azureml.runsource": {
          "type": "string"
        },
        "computeTargetType": {
          "type": "string"
        },
        "contentSnapshotId": {
          "type": "string"
        },
        "stepType": {
          "type": "string"
        }
      },
      "type": "object"
    },
    "runStatus": {
      "type": "string"
    },
    "runTags": {
      "properties": {
        "azureml.nodeid": {
          "type": "string"
        },
        "azureml.pipeline": {
          "type": "string"
        },
        "azureml.pipelineComponent": {
          "type": "string"
        },
        "azureml.pipelinerunid": {
          "type": "string"
        }
      },
      "type": "object"
    },
    "runType": {
      "type": "string"
    }
  },
  "type": "object"
}

We paste the above JSON schema for the data field into the Parse JSON operation as seen in the picture below:

Working on the data field in the JSON serialized event. We could also work on the body, and extract the data field from that, but why bother? Note that the name of the content may be “Event data” instead of “Data object” as shown in the picture.
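Since the rest of the flow relies on this schema, it can be worth validating it locally against a captured event before moving on. Here is a minimal sketch with the jsonschema package, assuming the event above is saved to event.json and the schema to data_schema.json (both file names are placeholders):

# Local sanity check: validate the "data" field of a captured event
# against the schema pasted above.
import json
from jsonschema import validate

with open("event.json") as f:
    event = json.load(f)
with open("data_schema.json") as f:
    schema = json.load(f)

validate(instance=event["body"]["data"], schema=schema)
print("data field matches the schema")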

For all subsequent steps, since we specified the JSON schema we expect, the fields of the data object are available as dynamic content. As our use case is a rather simple if-condition, we use the Condition operation from the Control category. Additionally, we want to post a message to Microsoft Teams when the condition evaluates to True. Below is what it will look like:

Posting a Teams message when the three criteria are met. You will need to provide your login details for Teams. The message will be sent by the Flow bot, but your name will appear in the message text as well.
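As an aside, if you ever need to send the same notification outside Logic Apps, a sketch along these lines would work. It posts to a hypothetical Teams incoming webhook with the requests package rather than using the Flow bot connector shown above; the webhook URL is a placeholder.

# Sketch: post the notification via a Teams incoming webhook instead of
# the Logic Apps Teams connector.
import requests

def notify_teams(data: dict) -> None:
    webhook_url = "<your-teams-incoming-webhook-url>"  # placeholder
    message = {
        "text": (
            f"Pipeline run {data['runId']} in experiment "
            f"{data['experimentName']} finished with status {data['runStatus']}."
        )
    }
    requests.post(webhook_url, json=message, timeout=10)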

And that’s it! You could of course also send a Slack message or an email. Logic Apps comes with many integrations and is quite flexible and intuitive, and thanks to the GUI provided by the Designer you do not need to be an Azure expert.
