Using Azure Databricks to Analyze Application Logs from Log Analytics (Container Insight)

Eason · Published in Geek Culture · 5 min read · Jul 28, 2021


In a past story, I shared how to implement logging for a containerized Java Spring Boot app and feed the logs into Log Analytics (Container Insights). This story is part 2, in which I will share how to perform log analytics from Azure Databricks. Below is a high-level architecture diagram of the environment setup.

  1. Log Analytics (Container Insights) stores the logs of the various containers running on AKS, and administrators or developers can run log queries from the workspace. This part of the configuration was covered in the past story.
  2. The Log Analytics workspace exports log data into Blob Storage on an hourly basis.

2.1 Azure Data Factory reads the source data from the Log Analytics storage container (am-containerlog).

2.2 Azure Data Factory sinks the data into a separate storage container (log-analytics) for advanced analytics purposes.

3. Azure Databricks connects to Blob Storage as a data source and performs advanced analytics on the log data.
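To make step 3 concrete, here is a minimal sketch of parsing one record from an exported ContainerLog file. The export writes JSON Lines; the field names below (TimeGenerated, ContainerID, LogEntry) follow the ContainerLog table schema, and the sample record itself is invented for illustration, not taken from a real export.

```python
import json

# Illustrative sample of one exported ContainerLog line (not real data).
sample_line = json.dumps({
    "TimeGenerated": "2021-07-28T10:15:00.000Z",
    "ContainerID": "abc123",
    "LogEntry": '{"level":"ERROR","message":"order service failed"}',
})

def parse_container_log(line: str) -> dict:
    """Parse one exported log line and unwrap the nested app log entry."""
    record = json.loads(line)
    # Spring Boot apps that log JSON end up with a JSON string inside
    # LogEntry; fall back to the raw text for plain-text log lines.
    try:
        app_log = json.loads(record["LogEntry"])
    except (json.JSONDecodeError, TypeError):
        app_log = {"message": record["LogEntry"]}
    return {
        "time": record["TimeGenerated"],
        "container_id": record["ContainerID"],
        "level": app_log.get("level"),
        "message": app_log.get("message"),
    }

parsed = parse_container_log(sample_line)
print(parsed["level"])  # ERROR
```

In a Databricks notebook the same unwrapping would typically be done with Spark's `from_json` over the whole container, but the record-level logic is the same.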

Let’s kick-start. Log Analytics workspaces now have the capability to export data to Blob Storage or Event Hubs on a regular or near-real-time basis. In this story, I selected the cold path: exporting data from the Log Analytics workspace to Blob Storage on an hourly basis. Export…
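For reference, the hourly export lands in the table's container under a partitioned blob path. The helper below sketches that path; the exact layout (a `WorkspaceResourceId=` segment plus `y=/m=/d=/h=/m=00` partitions ending in `PT1H.json`) is my assumption about how data export partitions blobs, so verify it against your own storage account before relying on it, and the workspace resource ID shown is hypothetical.

```python
from datetime import datetime, timezone

def export_blob_prefix(table_container: str, workspace_resource_id: str,
                       hour: datetime) -> str:
    """Build the assumed blob path for one hourly export of a table."""
    return (
        f"{table_container}/WorkspaceResourceId={workspace_resource_id.lower()}/"
        f"y={hour:%Y}/m={hour:%m}/d={hour:%d}/h={hour:%H}/m=00/PT1H.json"
    )

# Hypothetical workspace resource ID, for illustration only.
ws_id = ("/subscriptions/00000000-0000-0000-0000-000000000000/"
         "resourcegroups/rg-demo/providers/microsoft.operationalinsights/"
         "workspaces/law-demo")
prefix = export_blob_prefix("am-containerlog", ws_id,
                            datetime(2021, 7, 28, 10, tzinfo=timezone.utc))
print(prefix)
```

Having a deterministic path per hour is what makes the hourly Data Factory copy in steps 2.1 and 2.2 straightforward to parameterize.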
