Slack alert from Alicloud log service using Function compute

Rohit Tiwari
SCMP — Inside the Wonton
Apr 6, 2020

In this blog, I will walk through all the steps needed to integrate the log monitoring capability of Alicloud’s Log Service (SLS) with Slack to create alert notifications using Function Compute.

A brief introduction to the setup

In SCMP, we use Alicloud’s Container Service for Kubernetes (ACK) to host our products, Prometheus and Grafana to monitor the cluster and applications, and Alicloud’s Logtail to forward logs to SLS for log monitoring.

Since Prometheus and Grafana also run on the same Kubernetes cluster as our applications, we wanted a solution that offers additional monitoring of the applications without relying on them.

As a solution, we came up with a logger program that runs in the K8s cluster, checks application service reachability at a fixed interval using HTTP requests, and logs the response in JSON format.

Here’s what it looks like:

{
  "level": "info",
  "hostname": "production-app-tech-lift.product-web.svc.cluster.local",
  "statuscode": 200,
  "metric_value": 1,
  "status": "200 OK",
  "time": 1583915935,
  "message": "Host is reachable"
}

We use Alicloud’s Logtail to send logs to SLS (refer to the Logtail documentation) and a log search query to return the metric value, which we then use to build an alerting dashboard.

Let’s dig deeper into SLS…

Under SLS, we created the Project k8s-prod-log and added the Logstore Kube-tools-cluster-logger to process and store the JSON logs.

Next, we used the following Logtail configuration to parse the JSON data.

{
  "inputs": [
    {
      "detail": {
        "IncludeLabel": {
          "io.kubernetes.pod.namespace": "kube-tools",
          "io.kubernetes.container.name": "logger"
        },
        "ExcludeLabel": {}
      },
      "type": "service_docker_stdout"
    }
  ],
  "processors": [
    {
      "detail": {
        "KeepSource": true,
        "NoMatchError": true,
        "Anchors": [
          {
            "FieldType": "json",
            "ExpondJson": true,
            "FieldName": "log"
          }
        ],
        "SourceKey": "content",
        "NoKeyError": true
      },
      "type": "processor_anchor"
    }
  ]
}

The above code can be pasted into the Logtail configuration’s Plug-in Config section as shown below.

Once the Logstore is ready, we can view the logs in the Log Management console as it starts parsing incoming logs.

Before we proceed with the log search query, we need to add indexing for the required data by assigning a meaningful name to the variable that stores the value of metric_value. Unindexed data is stored in a single key named content.

For indexing, we need to update the Index Attributes for the required fields as shown below:

Now we have to write the log search query that returns the metric value. The logic behind the metric value is simple: if the response code equals 200, the metric value is set to 1; if the response code is not 200, the metric value is set to 0.

For example, if the host is not reachable, then the metric value is 0:

{
  "level": "info",
  "hostname": "production-app-tech-lift.product-web.svc.cluster.local",
  "metric_value": 0,
  "time": 1584087935,
  "message": " Response is nil."
}

Consider the metric value a flag: if it is 1 (true), the host is reachable; otherwise it is not. We will use this metric value to check the connectivity status and create an alert based on that condition.

Log Query for metric value:

*|select log_metric_value as mv where log_hostname like 'production-app-tech-lift.product-web.svc.cluster.local'

We created a chart by adding the log query under Graph, and added the chart to a new dashboard as shown below.

Chart added under dashboard:

In this example, I will create an alert notification for this chart.

I hope now you have a little background about our setup.

So, without further ado, let’s get started…

Create a Slack App Webhook

1. Visit https://api.slack.com/apps and create a new app by clicking Create New App

2. The App is named “Alicloud Monitoring” in this example. Select a workspace and click “Create App”

3. Add the Incoming Webhooks feature to the app

4. Activate Incoming Webhooks by clicking the toggle switch

5. Click Add New Webhook to Workspace

6. Select the Channel to post the message to.

7. Get the Webhook URL by clicking on the Copy button

The Slack configuration is done. Copy the webhook URL; it will be used in the steps below.

Create Function Compute Code

1. Visit Function Compute on the Alicloud console.

2. Click Service-Function to create a new function

3. Click Create-Function, select HTTP Function as the function type, and click Next

4. Now, we will configure the HTTP Function. In this example, we use technology-slack-notification as the service name, notification-callback-event as the function name, and nodejs6 as the runtime.

5. Next, we will configure the Trigger. In this example, we name the trigger slack-notification-http, set authorization to anonymous, and set the HTTP method to POST as shown below.

6. Now, we will add the code in the in-line Edit page of the Service Function

7. Copy and Paste the below code

Note: Modify the path to match your Slack webhook callback URL.

8. Once created, the Trigger will expose the HTTP callback URL that we will later use to call the Function Compute code.

9. The Function Compute code is now created, and the next step is to configure SLS to trigger the service.

Link up Log Service with Function Compute

1. Let us visit the SLS dashboard and create a new alert for one of our sample dashboards under SLS.

2. Go to the dashboard and click Create Alert from the menu in the top right corner of the chart.

3. Fill in Alert name, Chart name, Frequency, Trigger condition as shown below.

Note: Update the trigger condition based on the metric value.

4. On clicking Next, we move to the Notification tab and select Webhook-custom to trigger Function Compute

5. Add the HTTP callback URL of the Function Compute trigger under Request URL, set POST as the Request Method, application/json as the Request Header content type, and a custom Request Content as below:

{
  "message": "<Your Message>",
  "alertState": "ALERT",
  "title": "<ADD Dashboard link and Title>"
}

6. Click Submit

7. Finally, our alert is set up, and we can trigger a test notification.

Happy monitoring!!!

Credits: Chris Ng, Gordon Tang
