Reading and Writing to Azure Log Analytics
Python functions for interacting with Azure Log Analytics API
Working in DevOps or SRE occasionally involves reading or writing event logs and tracking events over time. In some cases, these events can trigger an alert on either a system’s state or anomalies within it.
For example, one of my recent projects had a requirement to develop an application that tracked files and folders on an Azure Storage Account as the files were moved around. The subsequently triggered events had to be visualized and monitored. Azure Log Analytics is a perfect tool to use in this case, given that it provides Azure Workbooks and Shared Dashboards for visualizations, and creates alerts & notifications via Azure Monitor.
Based on that experience, I will walk you through a demo using Azure Log Analytics and Python so that you can get started using it for your own DevOps and SRE needs.
Getting Started with Azure Log Analytics
To get started, you will need to do the following:
1. Create an Azure Log Analytics Workspace. I called mine `LogAnalyticsDemoWorkspace`.
2. Create a Service Principal (SP) with `Monitoring Reader` RBAC (role-based access control) on your Azure Log Analytics Workspace. My SP is called `loganalyticsdemospfake`. Make sure to take note of the `appId` and `password` when you create the SP.
3. From your Azure Log Analytics Workspace, go to Advanced Settings and take note of the `Workspace ID` and `Primary Key`.
4. Given that the code provided below was tested against Python 3.7, you will need the following Python 3 modules installed as they are used in the code:
```python
import requests
import hashlib
import hmac
import base64
import logging
import urllib3
import json
import datetime
```
Writing Data to Azure Log Analytics
Let’s establish a use case to help explain the next few steps: our task is to record, at a regular interval, whether a specific URL is reachable.
To start, you can use the following Python code snippet to write into Azure Log Analytics using a POST method. The `build_signature` function creates an authorization header, which the `post_data` function then uses to authenticate and send a JSON payload into Azure Log Analytics. Both functions are adapted from Azure’s documentation to run under Python 3. For our use case, I’ll be writing this sample JSON payload into Azure Log Analytics every hour (or whenever I run the script):
```json
{
    "status": true,
    "url": "https://www.slalom.com",
    "rt_avg": 7.125,
    "response_code": 200
}
```
The following Python snippet can be used to generate that payload in Log Analytics Workspace:
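The embedded snippet is not reproduced here, but based on Azure’s HTTP Data Collector API documentation it likely resembled the sketch below. The function names `build_signature` and `post_data` and the `demoURLMonitor` table name follow this post’s conventions; the placeholder credential values are my own and must be replaced before running:

```python
import base64
import datetime
import hashlib
import hmac
import json

import requests


def build_signature(customer_id, shared_key, date, content_length,
                    method, content_type, resource):
    # The Data Collector API authenticates with an HMAC-SHA256 signature
    # over the request line, keyed with the workspace's (base64) Primary Key.
    x_headers = 'x-ms-date:' + date
    string_to_hash = f"{method}\n{content_length}\n{content_type}\n{x_headers}\n{resource}"
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode('utf-8'),
                 digestmod=hashlib.sha256).digest()
    ).decode('utf-8')
    return f"SharedKey {customer_id}:{encoded_hash}"


def post_data(customer_id, shared_key, body, log_type):
    # POST a JSON payload; log_type becomes the table name, suffixed with _CL.
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    signature = build_signature(customer_id, shared_key, rfc1123date,
                                len(body), method, content_type, resource)
    uri = (f"https://{customer_id}.ods.opinsights.azure.com"
           f"{resource}?api-version=2016-04-01")
    headers = {
        'content-type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date,
    }
    response = requests.post(uri, data=body, headers=headers)
    return response.status_code


if __name__ == '__main__':
    # Placeholder values -- replace with your own Workspace ID and Primary Key.
    azure_log_customer_id = '<Workspace ID>'
    azure_log_shared_key = '<Primary Key>'
    payload = json.dumps({
        'status': True,
        'url': 'https://www.slalom.com',
        'rt_avg': 7.125,
        'response_code': 200,
    })
    # Uncomment once real credentials are in place:
    # post_data(azure_log_customer_id, azure_log_shared_key, payload, 'demoURLMonitor')
```

Note that the payload is serialized with `json.dumps` before posting, and the `Log-Type` header (here `demoURLMonitor`) is what determines the custom table name in the workspace.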
Run the script a few times after you replace `url`, `azure_log_customer_id`, and `azure_log_shared_key` with any URL and the `Workspace ID` and `Primary Key` that you noted earlier from Advanced Settings in your Workspace.
It can take roughly 20 minutes for data to appear in Log Analytics, depending on the size of the data and how often you write it. To view the generated data, go to Logs in Azure Log Analytics and run the following Kusto query:
```
demoURLMonitor_CL
| project TimeGenerated, status_b, url_s, rt_avg_d, response_code_d
| sort by TimeGenerated desc
```
You should notice a few things about the query and its results:
- Table names will always have the `_CL` suffix.
- `TimeGenerated` is of type `Date/Time` and is automatically added when the record is written into Azure Log Analytics.
- Take note of the data types in the payload: `status` is a boolean, `url` is a string, and `rt_avg` and `response_code` are decimals. In Azure Log Analytics, each column name is suffixed according to its data type: `_b` for boolean, `_s` for string, and `_d` for double. The column types are auto-generated during the first ingestion. A word of caution here: ensure your data types are consistent in order to avoid inadvertently splitting your data into multiple columns. For example, if you send `status` as an integer value, a new column called `status_d` would be created.
Now that you have the data available you can generate different visualizations or set a custom Azure Monitor alert.
Reading Data From Azure Log Analytics
Besides writing into Azure Log Analytics, what if you wanted to read that data in order to perform other logic in your code?
Reading from Azure Log Analytics via the API is slightly different from writing, as you can see in this Python code:
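The embedded reading snippet is not reproduced here; the following is a sketch of what its `get_token` and `get_data` functions might look like, assuming the standard OAuth2 client-credentials flow against `login.microsoftonline.com` and the v1 query endpoint at `api.loganalytics.io`. The `query_url` helper is my own name, not from the original post:

```python
import requests

API_BASE = 'https://api.loganalytics.io/v1'


def query_url(workspace_id):
    # Build the v1 query endpoint for a given Log Analytics workspace.
    return f"{API_BASE}/workspaces/{workspace_id}/query"


def get_token(tenant, sp_id, sp_secret):
    # OAuth2 client-credentials flow: exchange the Service Principal's
    # appId/password for a bearer token scoped to the Log Analytics API.
    login_url = f"https://login.microsoftonline.com/{tenant}/oauth2/token"
    payload = {
        'grant_type': 'client_credentials',
        'client_id': sp_id,
        'client_secret': sp_secret,
        'resource': 'https://api.loganalytics.io',
    }
    response = requests.post(login_url, data=payload)
    response.raise_for_status()
    return response.json()['access_token']


def get_data(query, token, azure_log_customer_id):
    # Run a Kusto query against the workspace and return the parsed JSON body.
    headers = {
        'Authorization': f"Bearer {token}",
        'Content-Type': 'application/json',
    }
    response = requests.post(query_url(azure_log_customer_id),
                             json={'query': query}, headers=headers)
    response.raise_for_status()
    return response.json()
```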
You will need to add in your own IDs and passwords, and your Service Principal will need, at minimum, `Monitoring Reader` RBAC on the Azure Log Analytics Workspace via IAM.
```python
tenant = 'tenantID'
sp_id = 'service principal app ID'
sp_secret = 'service principal password'
azure_log_customer_id = 'Workspace ID'

query = "demoURLMonitor_CL | project TimeGenerated, status_b, url_s, rt_avg_d, response_code_d | sort by TimeGenerated desc"

sp_token = get_token(tenant, sp_id=sp_id, sp_secret=sp_secret)
data = get_data(query=query, token=sp_token, azure_log_customer_id=azure_log_customer_id)
print(data)
```
This should enable us to parse the returned data and perform additional actions on it.
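The query API returns results in a tables/columns/rows structure rather than as flat records. A small helper like the one below (the name `rows_as_dicts` is hypothetical, not from the original post) can flatten each row into a dict keyed by column name; the sample data is illustrative only:

```python
def rows_as_dicts(result):
    # The query API returns {'tables': [{'columns': [...], 'rows': [...]}]};
    # zip each row with the column names to get a list of plain dicts.
    parsed = []
    for table in result.get('tables', []):
        names = [col['name'] for col in table['columns']]
        for row in table['rows']:
            parsed.append(dict(zip(names, row)))
    return parsed


# A trimmed-down example of the response shape (values are illustrative):
sample = {
    'tables': [{
        'name': 'PrimaryResult',
        'columns': [{'name': 'url_s', 'type': 'string'},
                    {'name': 'rt_avg_d', 'type': 'real'}],
        'rows': [['https://www.slalom.com', 7.125]],
    }]
}
print(rows_as_dicts(sample))
```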
Parting Thoughts
Now that we know how to read and write data to Azure Log Analytics via API, I hope that you will feel more comfortable incorporating Azure Log Analytics into your infrastructure, especially if you are using Python.
I personally found it challenging to locate this information for Python 3 when getting started on my project, and I hope that reading this post saves you time in getting the most out of Azure Log Analytics quickly and efficiently.
Further Reading
Azure Log Analytics:
- https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/get-started-portal
- https://docs.microsoft.com/en-us/azure/azure-monitor/learn/quick-create-workspace
- https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
Kusto Query Language: