Lean Dashboards — the Serverless Way

How to build serverless dashboards using Azure Application Insights, Logic Apps and Google Data Studio

Benny Bauer
A Cloud Guru
5 min read · Nov 19, 2017


Wescover is an early-stage startup, so it's essential that we measure our growth and execution and continuously improve. These metrics should be analyzed and presented to the whole company in an accessible dashboard.

The Analytics Pipeline

A simple representation of a data analytics pipeline looks like this:

Analytics Pipeline
  1. Report metrics from the Application
  2. Store them in a scalable data store
  3. Analyze with a BI tool
  4. Visualize on a dashboard

As an early-stage startup, we have to be efficient: putting in the minimal effort needed to deliver the requested functionality. This effort should preferably lay the foundation for further expansion, but it should not become a bottomless endeavor of building the perfect solution.

With this in mind, we set out to build a dashboard in a lean manner. As a serverless fanboy, my approach is to first see which services can be reused and "glued" together before reinventing the wheel myself. This is the account of how we implemented a simple analytics pipeline at Wescover.

Steps 1 & 2: Report and Store

Wescover runs on Azure cloud services, so it was fairly easy to use the Application Insights service for these steps. We used the Application Insights SDK to report custom metrics, and these were stored in the managed service without additional hassle.
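Under the hood, the SDK posts JSON "envelopes" to Application Insights' public ingestion endpoint. Here is a minimal Python sketch of such an envelope; the instrumentation key and event name are placeholders, and in production you'd use the official SDK rather than hand-rolling this:

```python
# Sketch of the telemetry "envelope" the Application Insights SDK sends
# for a custom event. Placeholder values throughout; use the official SDK
# (e.g. the applicationinsights package) in real code.
import json
from datetime import datetime, timezone

INGEST_URL = "https://dc.services.visualstudio.com/v2/track"
INSTRUMENTATION_KEY = "00000000-0000-0000-0000-000000000000"  # placeholder

def make_event_envelope(name, properties=None):
    """Build the JSON envelope for a single custom event."""
    return {
        "name": "Microsoft.ApplicationInsights.Event",
        "time": datetime.now(timezone.utc).isoformat(),
        "iKey": INSTRUMENTATION_KEY,
        "data": {
            "baseType": "EventData",
            "baseData": {"ver": 2, "name": name, "properties": properties or {}},
        },
    }

envelope = make_event_envelope("ItemClaimed", {"source": "web"})  # hypothetical event
payload = json.dumps(envelope)
# urllib.request.urlopen(INGEST_URL, data=payload.encode()) would send it.
```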

One thing to note, though: Application Insights' data retention period is limited (90 days). This is good enough when the analytics are digested based on the last weeks' metrics. If a longer period is needed, you can configure continuous export to Azure's storage service and build an ETL pipeline.

Step 3: Analyze

Next, we had to analyze the data. Luckily, Application Insights supports a powerful SQL-like query language that you can run from its Analytics web tool or via a REST API.

Its rich syntax enables the execution of complex queries. Below is an example of such a query, which fetches the weekly totals for each custom event over the last two weeks:
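A representative sketch of such a query in the Analytics query language (the exact query we run differs; `customEvents` is the standard table for custom events, while the aggregation shape here is illustrative):

```
customEvents
| where timestamp > ago(14d)
| summarize total = count() by name, week = startofweek(timestamp)
| order by week desc, name asc
```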

Don’t be scared; learning the powerful features of the language is worth every minute 🔨

The result would look something like this:

Query results example (numbers ain’t real, yet 😜 )
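The same query can also be run programmatically over the REST API's v1 `query` endpoint, with nothing but the standard library. A hedged sketch; the app ID and API key below are placeholders you generate under the resource's "API Access" blade:

```python
# Sketch: building (but not sending) a GET request against the
# Application Insights REST API's query endpoint. Credentials are placeholders.
import json
import urllib.parse
import urllib.request

APP_ID = "your-app-id"    # placeholder
API_KEY = "your-api-key"  # placeholder

def build_query_request(query: str) -> urllib.request.Request:
    """Build the authenticated GET request for an Analytics query."""
    url = (
        "https://api.applicationinsights.io/v1/apps/"
        f"{APP_ID}/query?query={urllib.parse.quote(query)}"
    )
    return urllib.request.Request(url, headers={"x-api-key": API_KEY})

req = build_query_request("customEvents | summarize count() by name")
# rows = json.load(urllib.request.urlopen(req))  # would execute the query
```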

Step 4: Visualize

Now that we were able to query our data and analyze it, we wanted to display it on a nice dashboard for the whole team to see. We found that Google’s Data Studio is a simple yet sufficient solution for our needs. It offers a good variety of charts, it’s simple to use and customize, and it provides Google’s G Suite access control out of the box.

Oh Oh…

Sounds great, but we had one (major) problem: Data Studio cannot pull data from Azure’s Application Insights. It connects to various Google services (BigQuery, Cloud SQL, Sheets, Analytics, DCM, etc.) and provides community connectors to third parties (Facebook Insights, Twitter and more), but not to Azure. Surprise surprise.

How To Bridge This Gap?

As mentioned above, Data Studio can use Google Sheets as a data source. If only we could push our analytics there 🤔

Easy! We can use a serverless scheduled job to query the analytics from Application Insights and push them into a Google Sheet! Azure Functions is a good candidate, but for this scenario Azure Logic Apps is a perfect match. So we built a Logic App with a Recurrence trigger that runs every hour and does the following:

  1. Query Application Insights via an Application Insights connector
  2. Put the results into a Google Sheet via a Google Sheet connector
Analytics Logic App
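For reference, a rough skeleton of such a workflow definition, assuming the standard 2016-06-01 workflow schema; the action names and `inputs` bodies are placeholders, as the real connector inputs are generated by the designer:

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "triggers": {
      "Recurrence": {
        "type": "Recurrence",
        "recurrence": { "frequency": "Hour", "interval": 1 }
      }
    },
    "actions": {
      "Run_Analytics_query": {
        "type": "ApiConnection",
        "inputs": { "placeholder": "Application Insights connector call" }
      },
      "Insert_rows": {
        "type": "ApiConnection",
        "runAfter": { "Run_Analytics_query": [ "Succeeded" ] },
        "inputs": { "placeholder": "Google Sheets connector call" }
      }
    },
    "outputs": {}
  }
}
```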

As simple as that. No servers were spun up, no code was written, and no third-party APIs had to be investigated.

By the way, remember the data retention limitation I mentioned before? Once we put the query results into the Google Sheet, they remain there indefinitely. So the dashboard displays the full analytics history, not just the last 90 days, but everything since the very beginning.

TL;DR

We’ve built a full analytics pipeline with not much more than “glueing” between managed services — and we did it quickly.

Our lean analytics pipeline

The Wescover application reports custom metrics to the Application Insights store. A Logic App analyzes the data with an Analytics query and puts the results in a Google Sheet. Google Data Studio then displays the analytics in a beautiful dashboard. 🤓

Bonus: Logic Apps Tips & Tricks

A few neat tips for Logic Apps:

  • It’s useful to create an object for the query with the Compose action. This makes it easier to maintain and to reference from later actions in the flow.
  • Use Scopes to group actions and separate them from other logical parts. This keeps the app organized and comes in handy when you want specific error handling for particular parts of it. In fact, our real Logic App runs several scopes every time it’s triggered, each with a different query:
Executing multiple queries in parallel
  • A Logic App is in fact a JSON configuration, so you can store it in your repository and deploy it via the Azure CLI.
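A minimal sketch of that last tip, assuming the Azure CLI with its `logic` extension; the resource group and workflow names are placeholders, and the definition here is deliberately empty:

```shell
# Keep the workflow definition in source control (placeholder definition here).
cat > workflow.json <<'EOF'
{ "definition": { "triggers": {}, "actions": {} } }
EOF

# Validate the JSON locally before deploying (uses Python's stdlib).
python3 -m json.tool workflow.json > /dev/null && echo "valid"

# Deploy (requires the Azure CLI 'logic' extension; names are placeholders):
# az logic workflow create \
#   --resource-group my-rg \
#   --name analytics-logic-app \
#   --definition workflow.json
```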

Cheers — hope you found this handy! 👏
