Reading and Writing to Azure Log Analytics

Python functions for interacting with Azure Log Analytics API

Mustafa El-Hilo
Slalom Build
Sep 29, 2020 · 5 min read


Working in DevOps or SRE occasionally involves reading or writing event logs and tracking events over time. In some cases, these events can trigger an alert on either a system’s state or anomalies within it.

For example, one of my recent projects had a requirement to develop an application that tracked files and folders on an Azure Storage Account as they were moved around. The resulting events had to be visualized and monitored. Azure Log Analytics is a perfect tool for this case, given that it provides Azure Workbooks and Shared Dashboards for visualizations and drives alerts and notifications via Azure Monitor.

Based on that experience, I will walk you through a demo using Azure Log Analytics and Python so that you can get started with using it for your own DevOps and SRE needs.

Getting Started with Azure Log Analytics

To get started, you will need to do the following:

  1. Create an Azure Log Analytics workspace. I called mine LogAnalyticsDemoWorkspace.
  2. Create a Service Principal (SP) with the Monitoring Reader RBAC (role-based access control) role on your Azure Log Analytics workspace. My SP is called loganalyticsdemospfake. Make sure to take note of the appId and password when you create the SP.
  3. From your Azure Log Analytics workspace, go to Advanced Settings and take note of the Workspace ID and Primary Key (redacted under the black boxes in the screenshot below).
Azure Log Analytics Advanced Settings

  4. Given that the code provided below was tested against Python 3.7, you will need the following Python 3 modules installed, as they are used in the code:
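The only third-party dependency in the snippets that follow is requests; the remaining imports (json, datetime, hashlib, hmac, and base64) ship with the standard library. Assuming a fresh Python 3.7 environment:

```shell
pip install requests
```
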

Writing Data to Azure Log Analytics

Let’s establish a use case to help explain the next few steps; our task is to record whether a specific URL is reachable at a regular interval.

To start, you can use the following Python code snippet to write into Azure Log Analytics using a POST method. The build_signature function is used to create an authorization header which is then used by the post_data function to authenticate and send a JSON payload into Azure Log Analytics.
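Here is a sketch of those two functions, adapted from the HTTP Data Collector API sample in Azure's documentation and updated for Python 3 (the variable names are mine):

```python
import base64
import datetime
import hashlib
import hmac
import json

import requests


def build_signature(customer_id, shared_key, date, content_length,
                    method, content_type, resource):
    """Build the SharedKey authorization header for the Data Collector API."""
    x_headers = 'x-ms-date:' + date
    string_to_hash = '\n'.join(
        [method, str(content_length), content_type, x_headers, resource])
    bytes_to_hash = string_to_hash.encode('utf-8')
    decoded_key = base64.b64decode(shared_key)  # the workspace Primary Key is base64
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, bytes_to_hash,
                 digestmod=hashlib.sha256).digest()).decode('utf-8')
    return 'SharedKey {}:{}'.format(customer_id, encoded_hash)


def post_data(customer_id, shared_key, body, log_type):
    """POST a JSON string into Log Analytics as a custom log of type log_type."""
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.datetime.utcnow().strftime(
        '%a, %d %b %Y %H:%M:%S GMT')
    signature = build_signature(customer_id, shared_key, rfc1123date,
                                len(body), method, content_type, resource)
    uri = ('https://' + customer_id + '.ods.opinsights.azure.com'
           + resource + '?api-version=2016-04-01')
    headers = {
        'Content-Type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,      # becomes the <log_type>_CL table in the workspace
        'x-ms-date': rfc1123date,
    }
    response = requests.post(uri, data=body, headers=headers)
    return response.status_code   # 200 means the payload was accepted
```
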

Both these functions are adapted from Azure’s documentation to run using Python 3. For our use case, I’ll be writing this sample JSON payload into Azure Log Analytics every hour (or whenever I run the script):
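The exact values below are illustrative; the field names and types match the columns discussed later (status is a boolean, url a string, rt_avg and response_code decimals):

```json
[
    {
        "url": "https://www.slalombuild.com",
        "status": true,
        "rt_avg": 0.167,
        "response_code": 200
    }
]
```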

The following Python snippet can be used to generate that payload in Log Analytics Workspace:
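A minimal sketch of such a generator, assuming the requests library; probe_url and make_record are illustrative helper names of my own, and UrlMonitor is a hypothetical log type for this demo:

```python
import json
import time

import requests


def make_record(url, status, response_code, rt_avg):
    """Build one log record; keep types stable so Log Analytics columns don't split."""
    return {
        'url': url,                             # -> url_s (string)
        'status': bool(status),                 # -> status_b (boolean)
        'response_code': float(response_code),  # -> response_code_d (double)
        'rt_avg': round(float(rt_avg), 3),      # -> rt_avg_d (double)
    }


def probe_url(url, attempts=3, timeout=5):
    """Check whether url is reachable and average the response time."""
    times = []
    status, response_code = False, 0
    for _ in range(attempts):
        start = time.monotonic()
        try:
            resp = requests.get(url, timeout=timeout)
            status, response_code = resp.ok, resp.status_code
        except requests.RequestException:
            status = False
        times.append(time.monotonic() - start)
    return make_record(url, status, response_code, sum(times) / len(times))


# Example wiring (requires network access and the post_data helper):
#   body = json.dumps([probe_url('https://www.slalombuild.com')])
#   post_data(azure_log_customer_id, azure_log_shared_key, body, 'UrlMonitor')
```
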

Run the script a few times after you replace url, azure_log_customer_id, and azure_log_shared_key with any URL of your choosing, plus the Workspace ID and Primary Key that you noted from Advanced Settings in your Workspace earlier.

It can take roughly 20 minutes for the first data to appear in Log Analytics, depending on the size of the data and how often you write it. To view the generated data, go to Logs in Azure Log Analytics and run the following Kusto query:
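Assuming the custom log type was named UrlMonitor (so the table is UrlMonitor_CL; substitute your own log type name), a query along these lines lists the most recent records:

```
UrlMonitor_CL
| project TimeGenerated, url_s, status_b, rt_avg_d, response_code_d
| order by TimeGenerated desc
```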

You should notice a few things about the query and its results:

  1. Table names will always have _CL suffix.
  2. TimeGenerated is of type Date/Time and automatically added when the record is written into Azure Log Analytics.
  3. Take note of the data types in the payload; status is a boolean, url is a string, and rt_avg and response_code are decimals. In Azure Log Analytics, each column name is suffixed according to its data type: _b for boolean, _s for string, and _d for double. The column types are auto-generated during the first ingestion. A word of caution here: ensure your data types stay consistent, or your data may inadvertently split across multiple columns. For example, if you send status as an integer value, a new column called status_d would be created.

Now that you have the data available, you can generate different visualizations or set up custom Azure Monitor alerts.


Reading Data From Azure Log Analytics

Besides writing into Azure Log Analytics, what if you wanted to read that data in order to perform other logic in your code?

Reading from Azure Log Analytics via API is slightly different than writing as you can see in this Python code:
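A sketch of one way to do this, assuming a service principal and the Log Analytics query REST API; the tenant, client, and workspace IDs are placeholders you supply, and the function names are my own:

```python
import requests

LOGIN_URL = 'https://login.microsoftonline.com/{tenant_id}/oauth2/token'
QUERY_URL = 'https://api.loganalytics.io/v1/workspaces/{workspace_id}/query'


def get_access_token(tenant_id, client_id, client_secret):
    """Exchange service principal credentials for an AAD bearer token."""
    resp = requests.post(
        LOGIN_URL.format(tenant_id=tenant_id),
        data={
            'grant_type': 'client_credentials',
            'client_id': client_id,          # the SP appId
            'client_secret': client_secret,  # the SP password
            'resource': 'https://api.loganalytics.io',
        },
    )
    resp.raise_for_status()
    return resp.json()['access_token']


def query_log_analytics(workspace_id, token, query):
    """Run a Kusto query against the workspace and return the result tables."""
    resp = requests.post(
        QUERY_URL.format(workspace_id=workspace_id),
        headers={'Authorization': 'Bearer ' + token},
        json={'query': query},
    )
    resp.raise_for_status()
    return resp.json()['tables']


# Example wiring (requires network access and real credentials):
#   token = get_access_token(tenant_id, client_id, client_secret)
#   tables = query_log_analytics(workspace_id, token, 'UrlMonitor_CL | take 10')
```
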

You will need to add in your own IDs and passwords, and your Service Principal will need, at minimum, the Monitoring Reader RBAC role on the Azure Log Analytics Workspace, assigned via IAM.

This should enable us to parse the returned data and perform additional actions on it.

Parting Thoughts

Now that we know how to read and write data to Azure Log Analytics via API, I hope that you will feel more comfortable incorporating Azure Log Analytics into your infrastructure, especially if you are using Python.

I personally found it challenging to locate this information for Python 3 when getting started on my project, and I hope that reading this post saves you time in getting the most out of Azure Log Analytics quickly and efficiently.

Further Reading

Azure Log Analytics:

Kusto Query Language:
