Reading and Writing to Azure Log Analytics

Python functions for interacting with Azure Log Analytics API

Mustafa El-Hilo
Sep 29, 2020 · 5 min read

Working in DevOps or SRE occasionally involves reading or writing event logs and tracking events over time. In some cases, these events can trigger an alert on either a system’s state or anomalies within it.

For example, one of my recent projects had a requirement to develop an application that tracked files and folders on an Azure Storage Account as the files were moved around. The subsequently triggered events had to be visualized and monitored. Azure Log Analytics is a perfect tool to use in this case, given that it provides Azure Workbooks and Shared Dashboards for visualizations, and creates alerts & notifications via Azure Monitor.

Based on that experience, I will walk you through a demo using Azure Log Analytics and Python so that you can get started using it for your own DevOps and SRE needs.

Getting Started with Azure Log Analytics

To get started, you will need to do the following:

  1. Create an Azure Log Analytics Workspace. I called mine LogAnalyticsDemoWorkspace.

  2. In the Workspace, open Advanced Settings and take note of the Workspace ID and Primary Key; you will need both later to authenticate your requests.

  3. The code provided below was tested against Python 3.7, so you will need the following Python 3 modules installed as they are used in the code:

import requests   # third-party; pip install requests
import hashlib
import hmac
import base64
import logging
import urllib3    # third-party; installed alongside requests
import json
import datetime

Writing Data to Azure Log Analytics

Let’s establish a use case to help explain the next few steps: our task is to record whether a specific URL is reachable at a regular interval.

To start, you can use the following Python code snippet to write into Azure Log Analytics using a POST method. The build_signature function is used to create an authorization header which is then used by the post_data function to authenticate and send a JSON payload into Azure Log Analytics.
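A sketch of what these two functions can look like, based on the Python 3 sample in Azure’s HTTP Data Collector API documentation (error handling kept minimal; the .ods.opinsights.azure.com endpoint and api-version are the documented defaults):

import base64
import datetime
import hashlib
import hmac
import requests

def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    # Build the SharedKey authorization header required by the Data Collector API.
    x_headers = 'x-ms-date:' + date
    string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
    bytes_to_hash = bytes(string_to_hash, encoding="utf-8")
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()
    ).decode()
    return "SharedKey {}:{}".format(customer_id, encoded_hash)

def post_data(customer_id, shared_key, body, log_type):
    # Send a JSON payload to the workspace; log_type becomes the custom log
    # table name (with _CL appended).
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    signature = build_signature(customer_id, shared_key, rfc1123date, len(body), method, content_type, resource)
    uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'
    headers = {
        'content-type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date
    }
    response = requests.post(uri, data=body, headers=headers)
    if 200 <= response.status_code <= 299:
        print('Accepted')
    else:
        print('Response code: {}'.format(response.status_code))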

Both these functions are adapted from Azure’s documentation to run using Python 3. For our use case, I’ll be writing this sample JSON payload into Azure Log Analytics every hour (or whenever I run the script):

{
"status":"success",
"url","https://www.slalom.com",
"rt_avg":"7.125",
"response_code":"200"
}

The following Python snippet can be used to generate that payload and write it into the Log Analytics Workspace:
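This is a sketch of that script. How the payload is built here (a single requests.get, with its elapsed time standing in for rt_avg) is an assumption for illustration; the log type demoURLMonitor is what produces the demoURLMonitor_CL table queried later.

import json
import requests

url = 'https://www.slalom.com'           # any URL you want to monitor
azure_log_customer_id = 'Workspace ID'   # from Advanced Settings
azure_log_shared_key = 'Primary Key'     # from Advanced Settings

# Probe the URL and record the outcome.
try:
    response = requests.get(url, timeout=10)
    payload = {
        'status': response.ok,
        'url': url,
        'rt_avg': response.elapsed.total_seconds(),
        'response_code': response.status_code
    }
except requests.RequestException:
    payload = {'status': False, 'url': url, 'rt_avg': 0, 'response_code': 0}

post_data(azure_log_customer_id, azure_log_shared_key, json.dumps(payload), 'demoURLMonitor')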

Run the script a few times after you replace url, azure_log_customer_id, and azure_log_shared_key with any URL you like and the Workspace ID and Primary Key you noted from Advanced Settings in your Workspace earlier.

It can take roughly 20 minutes for data to appear in Log Analytics, depending on the size of the data and how often you write it. To view the generated data, go to Logs in Azure Log Analytics and run the following Kusto query:

demoURLMonitor_CL
| project TimeGenerated, status_b, url_s, rt_avg_d, response_code_d
| sort by TimeGenerated desc

You should notice a few things about the query and its results:

  1. Custom log table names will always have the _CL suffix.

  2. Field names get a suffix based on the data type Log Analytics inferred for them, such as _s for strings, _d for doubles, and _b for booleans, which is why the query projects status_b, url_s, rt_avg_d, and response_code_d.

Now that you have the data available, you can build different visualizations or set up a custom Azure Monitor alert.


Reading Data From Azure Log Analytics

Besides writing into Azure Log Analytics, what if you wanted to read that data in order to perform other logic in your code?

Reading from Azure Log Analytics via the API is slightly different from writing to it, as you can see in the Python code below:
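A minimal sketch of get_token and get_data could look like the following, using the Azure AD client-credentials flow against login.microsoftonline.com and the Log Analytics query API at api.loganalytics.io; error handling is kept to raise_for_status:

import requests

def get_token(tenant, sp_id, sp_secret):
    # Obtain a bearer token for the Log Analytics API via the
    # client-credentials (service principal) flow.
    login_url = 'https://login.microsoftonline.com/{}/oauth2/token'.format(tenant)
    payload = {
        'grant_type': 'client_credentials',
        'client_id': sp_id,
        'client_secret': sp_secret,
        'resource': 'https://api.loganalytics.io'
    }
    response = requests.post(login_url, data=payload)
    response.raise_for_status()
    return response.json()['access_token']

def get_data(query, token, azure_log_customer_id):
    # Run a Kusto query against the workspace through the query API.
    query_url = 'https://api.loganalytics.io/v1/workspaces/{}/query'.format(azure_log_customer_id)
    headers = {
        'Authorization': 'Bearer {}'.format(token),
        'Content-Type': 'application/json'
    }
    response = requests.post(query_url, json={'query': query}, headers=headers)
    response.raise_for_status()
    return response.json()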

You will need to fill in your own IDs and secrets, and your Service Principal will need, at minimum, the Monitoring Reader RBAC role on the Azure Log Analytics Workspace, assigned via IAM.

tenant = 'tenantID'
sp_id = 'service principal app ID'
sp_secret = 'service principal password'
azure_log_customer_id = 'Workspace ID'
query = "demoURLMonitor_CL | project TimeGenerated, status_b, url_s, rt_avg_d, response_code_d | sort by TimeGenerated desc"

sp_token = get_token(tenant, sp_id=sp_id, sp_secret=sp_secret)
data = get_data(query=query, token=sp_token, azure_log_customer_id=azure_log_customer_id)
print(data)

This should enable us to parse the returned data and perform additional actions on it.
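For example, assuming get_data returns the parsed JSON body as sketched above, the query API groups results into tables, each carrying its column names separately from its rows, so one way to turn the rows into records is:

# Zip each table's column names with its rows to get dict-like records.
for table in data['tables']:
    columns = [col['name'] for col in table['columns']]
    for row in table['rows']:
        print(dict(zip(columns, row)))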

Parting Thoughts

Now that we know how to read and write data to Azure Log Analytics via API, I hope that you will feel more comfortable incorporating Azure Log Analytics into your infrastructure, especially if you are using Python.

I personally found it challenging to locate this information for Python 3 when getting started on my project, and I hope this post saves you time in getting the most out of Azure Log Analytics quickly and efficiently.

Further Reading

Azure Log Analytics:

Kusto Query Language:


Thanks to Augusto Kiniama Rosa and Zahid Zaman

Mustafa El-Hilo

Written by

Senior Engineer — Cloud, DevOps and Security (CDS) — Slalom_Build

Slalom Build

The Build Blog is a collection of perspectives and viewpoints on the craft of building digital products today, written by the technologists that build them. By builders, for builders.
