Automation using Azure Logic Apps

Abdulkhader Sakivelu
DataPebbles
Dec 8, 2023 · 7 min read

Introduction

This blog is written to demonstrate the simplicity of Logic Apps in the Azure cloud.

“Azure Logic Apps is a cloud platform where you can create and run automated workflows with little to no code” is the official definition from Microsoft, and it’s quite accurate. It is a powerful tool with a wide range of built-in functions and a simple-to-use visual designer. Other tools and technologies offer similar functionality, but if your organization is invested in the Azure cloud, then this might be the best choice due to its simplicity and its integration with Azure DevOps.

Logic Apps can be used to create workflows by linking together various services with no code, using built-in functions to automate scenarios like those listed below:

  • Schedule and send email notifications using Office 365 when a specific event happens, for example, a new file is uploaded.
  • Route and process customer orders across on-premises systems and cloud services.
  • Move uploaded files from an SFTP or FTP server to Azure Storage.
  • Monitor tweets, analyze the sentiment, and create alerts or tasks for items that need review.
source: Microsoft

Business use case

Consider a scenario where you collect data from an auditor, a sales agent, or any other end user for processing. Often this is someone who isn’t tech-savvy and prefers to send the data over email. A designated person then uploads this data into the system.

A process like this is prone to human error and delays, and can be easily automated using Logic Apps. All you would have to do is create a Microsoft form (since we are talking Azure) and configure a workflow in Logic Apps to process each response, collect the data, move it to an Azure blob container, and kick off your data processing pipeline. It may sound simple, but a small automation can go a long way.

Let's walk through the process of creating such a workflow.

Create a Microsoft form

I have created a simple form that takes two inputs.

A simple Microsoft form

The goal is to move the data collected from the form to a container in an Azure storage account. You can add multiple questions to the form and use the inputs to create complex workflows, but that is beyond our objective here.

Create a workflow

Navigate to Logic Apps in the Azure portal and create a new workflow. I have opted for the ‘Consumption’ plan type as it is sufficient for the present use case.

Once the deployment is complete, go to the resource page of the workflow and open it for editing. The workflow editor has two versions (the generally available designer and the preview designer). If you are redirected to the generally available designer, choose the ‘Blank Logic App’ option. If you end up in the preview designer, you can switch to the old one by selecting the corresponding option (I will mostly be using the preview designer, largely out of personal preference).

Link the Microsoft form

Create a parameter for the unique identifier of your Microsoft form (you can find it in the http address of the form).

Note: Parameter creation only works in the generally available designer (at the time of writing this blog).

Please switch to the ‘Preview designer’ to follow along. However, the steps below should be the same in either version, just with a different UI.

We are going to add a trigger that runs every time a new submission is made to the form. You can search for the MS Forms trigger and use the parameter you created for the form ID after opting to enter a custom value.
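In the workflow definition language, referencing that parameter as a custom value is a one-line expression (a sketch, assuming you named the parameter ‘FormId’; adjust to whatever name you chose):

```json
"@parameters('FormId')"
```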

While working with the designer, you have two options for filling in the fields: writing a custom function to get the required details, or using dynamic content (which writes the functions for you automatically).

Note: For connecting to the MS form, I had previously configured an API connection and authenticated it using an account that has access to the form.

Collect response

Now add the next step, “Get response details”, which collects the information about the response submitted via the trigger. I am using the below function to get the ‘responseId’ for each execution.
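An expression along these lines is commonly used to pull the response ID out of the trigger output (the exact property path is an assumption based on the Forms connector’s typical payload and may differ in your connector version):

```json
"@triggerOutputs()?['body/resourceData/responseId']"
```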

To make our life easier for the next part, let's save the workflow and make a submission to the form. The workflow gets triggered and our steps should run. From the resource page of the workflow, go to the run details and copy the raw output of the second step into a text editor.

Format the response details

Go back to the editor and add the next step, which parses the output of the second step as JSON. The form’s upload answer is an array of JSON objects, one per uploaded file; you can see this structure in the raw output you copied.

The input for the ‘Parse JSON’ task can be any previous step’s output, or a particular element of it. In this scenario, we will use the output of the ‘upload data’ question from the form. You can pick it from the automatic suggestions, or write your own code to get the data using the ‘body’ function. In the schema box, paste the JSON array with the file details from the raw output and the designer will generate the schema.
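For illustration, the generated schema looks roughly like the sketch below. The property names here (‘name’, ‘link’, ‘id’, ‘size’) come from a sample upload answer and may differ in your raw output, so always paste your own sample rather than copying this:

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "name": { "type": "string" },
      "link": { "type": "string" },
      "id":   { "type": "string" },
      "size": { "type": "integer" }
    }
  }
}
```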

Traverse through the response

You have all the details needed in the form of an array of JSON objects, and you can iterate through the array using a for-each loop. You can find the looping constructs by searching for ‘control’. The input for the loop is the output of the ‘Parse JSON’ task.
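If you prefer to type the expression rather than pick it from the suggestions, the loop input is simply the parsed body (assuming the previous action is named ‘Parse_JSON’ in your workflow):

```json
"@body('Parse_JSON')"
```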

Fetch attachments

To fetch the file uploaded from the SharePoint location associated with the Microsoft form, search for the operation ‘Get file content using path’.

The ‘Site Address’ can be the address of your SharePoint home page, and for the path of the file, we will use the ‘items’ function to get the link from the current iteration.
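Typed as an expression, the file path field would look like this (assuming the loop is named ‘For_each’ and the parsed file objects expose a ‘link’ property, as in the sample output above):

```json
"@items('For_each')?['link']"
```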

Write file to a container

The next task is to write the data to a container as a blob. You can use the ‘Create block blob (V2)’ operation to do so.

For the name of the blob, we can use its original name, as below, or build a new name using the built-in string manipulation functions. The content of the blob will be the file fetched in the previous step (you can choose it from the suggestions).
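Put together, the blob action’s inputs look roughly like this sketch (‘/uploads’ is a hypothetical container name, and the action name ‘Get_file_content_using_path’ is assumed from the previous step):

```json
{
  "folderPath": "/uploads",
  "name": "@{items('For_each')?['name']}",
  "content": "@{body('Get_file_content_using_path')}"
}
```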

Note: The API connection being used to connect to the storage account should be authorized with the account key.

Notification

Finish up by adding a task to notify via email every time a submission is made and the blob is created. You should have access to the mailbox address you will be using to send the email. For experimentation, you can use your personal Outlook mailbox.

The contents of the email can also be easily configured to pull details from the particular run using ‘Dynamic content’.

Finished workflow

We now have a complete workflow, ready to run every time a submission is made to the form.

Workflow

Conclusion

This is just a small example to demonstrate the functionality of Logic Apps; however, even in production scenarios the above design will be more or less the same, with a few added steps.

Logic Apps has a wide range of connectors that can be used to link various services together. For example, you can trigger a Databricks notebook or a Machine Learning pipeline that processes the file uploaded above. Another advantage is that the entire workflow can be exported as an ARM template and used in release pipelines by parameterizing the code.

You can find information on how to set up an Azure DevOps pipeline here.

Did you find this article helpful? Clap, share, and if you have any questions or suggestions about this article, please contact me at abdulkhader.sakivelu@datapebbles.com .
