PagerDuty + Slack + Azure Integration
An illustrated walkthrough of how to correctly set up the integration between these three services.
Our tech ecosystem involves PagerDuty for alerting, Slack for communicating, and Azure for logs. We wanted to integrate all of these so we could have clear, transparent alerting for our various apps that we host in Azure.
We thought we had this integration all set up, but we were observing a strange anomaly. Slack would not notify us when an Azure alert was triggered; it would only notify us when the alert had been acknowledged, resolved, etc. Furthermore, when we tried to view the specific alert in PagerDuty, the whole site would freeze or crash.
We finally figured out the cause of these issues: the large JSON payload associated with the alert from Azure.
Note: this occurs only with Azure alerts configured with the “Log Search” signal type.
To restate the problems we were seeing:
- Slack would not notify us of Triggered alerts
- The PagerDuty app would crash regularly while trying to view the alerts
To resolve these issues, I found that we needed to configure a Custom JSON payload for our Azure Webhooks, which are used by PagerDuty. To do that, I went into the configuration for our Azure alerts.
Azure Configuration
As you can see below, our PagerDuty webhook was already set up, and the condition is a Custom Log Search. The following steps are only required if the alert is a Custom Log Search type.
Then, to actually customize the JSON payload, I checked the box labeled “Include custom JSON payload for webhook”. To specify the JSON output, I referred to the Azure documentation on how to use predefined variables in the text of the JSON. You can use variables such as `#searchresultcount` to include details about this specific alert in the text.
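As a rough sketch, a custom payload might look like the following. The field name `text` is just the one we chose for our own payload (PagerDuty will read it later); only the `#`-prefixed placeholders are Azure's predefined variables, which Azure substitutes when the webhook fires:

```json
{
  "text": "#alertrulename fired with #searchresultcount results",
  "IncludeSearchResults": true
}
```

Keeping this payload small is what avoids the oversized-payload problems described above.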
To ensure that my custom JSON looked correct, I clicked the “View Sample Webhook Payload” button.
After configuring my alert in Azure, I tested my alert in Slack:
But wait… there’s a new problem! The summary is missing from the Slack alert.
PagerDuty Configuration
To configure the summary of an alert in Slack, I had to go into PagerDuty to configure my service to read from the correct JSON fields. So I went to the Event Rules tab of my service:
Inside the Event Rules, I configured a new rule with the following criteria:
In the following screenshots, I show more detail on what values need to be configured to populate the Slack alert’s summary. I needed to create a rule that fired only if the `text` field of Custom Details existed. (`text` should be replaced with whatever field in the JSON object you want to extract the Summary from.)

Then I clicked “Do these things” and went to the Advanced tab. I checked the box “Add or replace the summary field” and filled it in as shown below. This specifies that the alert should replace the Summary field with the custom field named `text`.
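To make the rule’s behavior concrete, here is a small sketch (in Python, purely illustrative; the event shape and helper name are hypothetical, not PagerDuty’s actual implementation) of the logic the event rule applies: if the `text` field exists in the event’s custom details, the summary is replaced with its value.

```python
def apply_summary_rule(event: dict) -> dict:
    """Illustrative stand-in for the event rule: replace the summary
    with custom_details["text"] when that field exists."""
    details = event.get("custom_details", {})
    if "text" in details:  # rule condition: the "text" field exists
        event["summary"] = details["text"]  # "Add or replace the summary field"
    return event

# Example event, shaped like the custom JSON we configured in Azure:
alert = {
    "summary": "Azure Monitor alert",
    "custom_details": {"text": "MyRule fired with 5 results"},
}
apply_summary_rule(alert)
```

An event without a `text` field would fall through the condition and keep its original summary, which is why the rule is safe to apply to the whole service.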
I then tested my alert one more time — success! The alert shows up as Triggered in Slack, with the summary that I configured in Azure:
One last thing: if you don’t see the Event Rules tab in your PagerDuty service configuration, you may need to edit the service so that it can create alerts AND incidents. The Event Rules tab will not show up if the service is configured differently.
Hopefully this can help anyone facing a similar issue to correctly configure their alerts. Happy configuring!