Using Logic Apps to orchestrate a complex video-processing flow

Logic Apps is the Azure-resident sibling of Power Automate (which is part of the Power Platform). In a nutshell, it runs orchestration processes through a visual designer. It fits perfectly with the idea of citizen development, the concept of empowering non-developers to create software, including the kind of thing that usually used to be a script of some sort: much of what you build with Logic Apps (or Power Automate) could just as well have been a bash script or something along those lines.

However, a script comes without all the integrated concepts, such as Managed Identities, or the predefined connectors that reach well beyond the Microsoft ecosystem. (In fact, with hundreds of connectors available, most popular services should be covered.) A script is also less easy to digest and adapt, though that may be a matter of personal preference and habit.

That’s enough preface on what Logic Apps (LA) is and why it’s worth looking into. Let me show you how to combine this low-code tool with pro-code (i.e. all code) to achieve an end-to-end flow that is certainly demanding. Knowing a little about Azure services helps, but you’ll get the gist even if you don’t know the services mentioned.

What the LA process flow does in a nutshell:

I hope you appreciate that I included non-Microsoft services, like Dropbox instead of OneDrive and Slack instead of Teams, to underscore that Logic Apps is very well integrated into the Microsoft cloud yet not at all limited to it.

Let’s get started with creating the trigger and some variables.

Warming up before we get busy

In the next step, we acquire a valid bearer token (i.e. an access token) for accessing the Dropbox API. (Logic Apps has a connector for Dropbox, but the service we are about to use next does not, so we go straight to the API.) How the Dropbox API works is a tad too much for the scope of this walk-through, so please read up on it on the Dropbox Developer Portal. What I did was use my old pal Postman to study the API and then carry over all the needed headers, payload, etc. to the Logic Apps HTTP action.
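To make the HTTP action less abstract, here is a minimal Python sketch of the same request. It follows Dropbox’s OAuth 2 refresh-token flow; the credential values are obviously placeholders, and `build_token_request` is just a helper name I made up.

```python
# Sketch of the HTTP action that exchanges a Dropbox refresh token for a
# short-lived bearer token (Dropbox OAuth 2 token flow).
import urllib.parse

TOKEN_URL = "https://api.dropboxapi.com/oauth2/token"

def build_token_request(refresh_token: str, app_key: str, app_secret: str):
    """Return the URL and URL-encoded form body for the token request."""
    body = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": app_key,
        "client_secret": app_secret,
    })
    return TOKEN_URL, body

url, body = build_token_request("my-refresh-token", "my-app-key", "my-app-secret")
# POST `body` to `url` with Content-Type: application/x-www-form-urlencoded;
# the JSON response carries the bearer token under "access_token".
```

In the Logic App, the same four form fields go into the HTTP action’s body, exactly as carried over from Postman.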

You have to parse the output of that call, and the easiest way to generate a schema for it is to take advantage of the “Use sample payload to generate schema” option. Postman gives you all the details you need; it’s a copy-and-paste job, really.
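In plain code, the “Parse JSON” action boils down to this: take the raw response body and pull out the fields the later steps need. The sample payload below mirrors the shape of a Dropbox token response; treat the exact field set as illustrative.

```python
# What the "Parse JSON" action does, expressed in code: parse the response
# body and extract the bearer token for use in subsequent API calls.
import json

sample_payload = '{"access_token": "sl.abc123", "token_type": "bearer", "expires_in": 14400}'

parsed = json.loads(sample_payload)
bearer_token = parsed["access_token"]
auth_header = {"Authorization": f"Bearer {bearer_token}"}
```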

And now we are ready to start an Azure Data Factory pipeline job, like this.

Read up on how exactly this works in this article. (As well as how to overcome the content-type setting mentioned in the action step comment.)
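For a mental model of what the Data Factory action does under the hood, here is a sketch of the ADF REST API’s createRun endpoint that it corresponds to. The subscription, resource group, factory, and pipeline names are placeholders.

```python
# Sketch: the URL the Data Factory "create a pipeline run" call boils down to.
ARM = "https://management.azure.com"
API_VERSION = "2018-06-01"

def create_run_url(subscription_id: str, resource_group: str,
                   factory: str, pipeline: str) -> str:
    return (
        f"{ARM}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory}"
        f"/pipelines/{pipeline}/createRun?api-version={API_VERSION}"
    )

url = create_run_url("sub-id", "my-rg", "my-factory", "copy-from-dropbox")
# POST to `url` with a bearer token for management.azure.com;
# the response body contains the pipeline "runId" used for status polling.
```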

In an “Until…” loop we wait for the Data Factory pipeline job to finish by periodically checking the job status, leaving the loop as soon as the job state reaches one of the expected values.
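The “Until…” loop, sketched as code: poll the run status until it reaches a terminal state, with a delay between checks. `get_run_status` stands in for the status request the Logic App makes on each iteration; the function and parameter names are mine.

```python
# Polling loop equivalent to the Logic Apps "Until..." action.
import time

TERMINAL_STATES = {"Succeeded", "Failed", "Cancelled"}

def wait_for_pipeline(get_run_status, poll_seconds=30, max_polls=120):
    """Return the final run status, or raise if the run never finishes."""
    for _ in range(max_polls):
        status = get_run_status()
        if status in TERMINAL_STATES:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("pipeline run did not finish in time")

# Example with a fake status source that finishes on the third check:
statuses = iter(["Queued", "InProgress", "Succeeded"])
final = wait_for_pipeline(lambda: next(statuses), poll_seconds=0)
# final == "Succeeded"
```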

Now we can delete the Dropbox file (assuming it is no longer needed on Dropbox) and create a SAS URI, which will come in handy shortly.
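A SAS URI is nothing more than the blob URL with the SAS token appended as a query string, so composing it is plain string assembly. The account, container, blob, and token values below are placeholders; in the flow, the token comes from the storage action.

```python
# Composing a SAS URI: blob URL + "?" + SAS token (query-string form).
def sas_uri(account: str, container: str, blob: str, sas_token: str) -> str:
    return f"https://{account}.blob.core.windows.net/{container}/{blob}?{sas_token}"

uri = sas_uri("mystorageacct", "videos", "clip.mp4", "sv=2021-08-06&sig=abc")
```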

From here, the flow branches off into two parallel streams.

In one (on the right side) a public URL is composed, based on the fact that the file is written into a public Blob Storage container. That URL then gets posted to Slack and is ready for sharing with anybody. (As mentioned earlier, this step is optional. It just goes to show how easily your initial Dropbox file can be “published” out of Logic Apps. The file could just as well sit in a private container, for your eyes only, forever.)

On the left side of the parallel branch we check whether the file is an MP4 or not. Imagine a scenario where you upload MOV files (or whatever) from your phone, for example. If it’s not an MP4, a transcoding job is kicked off using an Azure Function. The conversion engine used is Azure Media Services, a highly specialized service for tasks like this and far more complex ones on top.
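The branch condition, sketched in code: decide from the file extension whether a transcoding job is needed. In the Logic App this is a Condition action on the file name; the function name here is my own.

```python
# Condition-action equivalent: does this file need transcoding to MP4?
from pathlib import Path

def needs_transcoding(filename: str) -> bool:
    """True for anything that is not already an .mp4 (case-insensitive)."""
    return Path(filename).suffix.lower() != ".mp4"

# needs_transcoding("holiday.MOV") -> True:  kick off the Azure Function
# needs_transcoding("holiday.mp4") -> False: skip straight to sharing
```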

Using a 100% code piece embedded in the low-code Logic App flow is easy and straightforward; in fact, all you do is select the Function you want and that’s that. My Function returns JSON, so that needs to be parsed, and that is about it: for the rest of the flow I can work with the returned asset and container names. (Asset is the term for videos in Media Services; container refers to Blob Storage containers.)
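Parsing the Function’s response is the same “Parse JSON” move as before; the flow only needs the asset and container names out of it. The field names below are illustrative, not necessarily what your Function returns.

```python
# Extracting the asset and container names from the Function's JSON response.
import json

function_response = '{"assetName": "converted-holiday", "containerName": "asset-converted-holiday"}'

result = json.loads(function_response)
asset_name = result["assetName"]
container_name = result["containerName"]
```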

The Function code, how it works, and what it does: read it all up here. (It uses almost the same code as the terminal tool depicted, with small adjustments here and there.)

From there, we compose the URI to the actual Blob Storage container and the desired asset (in MP4 format). (FYI: Media Services generates a container for every asset, and the files in that container follow a certain naming convention that depends on things like output quality and format.)
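As a sketch of that composition step: build the blob URL from the storage account, the asset’s container, and the expected output file name. The file-name pattern below is only an example of the preset-dependent convention mentioned above, not a guarantee; check what your encoding preset actually produces.

```python
# Composing the URI to the converted MP4 inside the asset's container.
def mp4_uri(account: str, container: str, asset: str) -> str:
    # Example output name for a 720p preset; the real pattern depends on
    # the Media Services encoding preset you use.
    blob = f"{asset}_1280x720_AACAudio_3500.mp4"
    return f"https://{account}.blob.core.windows.net/{container}/{blob}"

uri = mp4_uri("mystorageacct", "asset-converted-holiday", "converted-holiday")
```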

Afterwards, a SAS URI pointing exactly to the MP4 conversion is generated and posted to Slack. Since the expressions for “Compose 3” and “Compose 4” in the screenshot above are not trivial, let me outline what they contain.

“Compose 3":


“Compose 4”:


And that concludes the walk-through! The highly anticipated Slack message would look something like this:

I know this whole flow and use case is implemented in a slightly over-engineered way*, but I assume you want to get your hands on some exciting services and tinker for learning’s sake, and that’s what you are getting here. I am sure you can think of much cooler, more worthwhile use cases that could be implemented using the services mentioned: Data Factory, Azure Media Services, Azure Functions, Storage… Well, if you can imagine it, you can build it as well, combining pro-code and low-code in perfect harmony.

Have fun playing with Logic Apps or the Power Platform. What was the most complex thing you built with Logic Apps or Power Automate? Would you do it again? 🤣 Let me know.

*) In fact, you can convert the video straight out of the Dropbox web view. Not the mobile app, though. (As of November 2022 that is.)


