Encoding Video with .NET Core and Azure Media Services — Part 3

Jean-Marc Skopek
Jean-Marc’s Thoughts
2 min read · May 8, 2018

Part 3: Running an Azure Media Services Job

In Part 2, we uploaded a video file to Azure Media Services. Now that an Asset has been created and the file has been uploaded into it, let's run an encoding job against that Asset.

Before we can create an encoding job, we must request an Azure Media Services media processor. AMS offers a number of media processors, each of which performs a different kind of task. Some of these include:

  • Video thumbnail generator
  • Optical character recognition processor
  • Face redactor/blurring
  • Motion detection processor
  • Face detector
  • Hyperlapse time-lapse encoder
  • Transcript/subtitles generator

The Azure Media Services media processors are described in more detail in an article from Microsoft.

Request a Media Processor

In the MediaServices class, create a method that retrieves the ID of a specified media processor:
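Below is a minimal sketch of what such a method might look like, calling the AMS v2 REST API directly over HTTP. It assumes the MediaServices class from the earlier parts already holds an authenticated HttpClient (called _httpClient here) whose BaseAddress points at your account's REST endpoint and which already carries the standard AMS request headers; the method name GetMediaProcessorIdAsync and the use of Json.NET for parsing are illustrative choices, not the only way to do this.

```csharp
// A minimal sketch (not from the original gist): query the MediaProcessors
// collection of the AMS v2 REST API and return the Id of the processor whose
// name matches, e.g. "Media Encoder Standard".
//
// Assumed usings: System, System.Linq, System.Net.Http, System.Threading.Tasks,
// Newtonsoft.Json.Linq. Assumes _httpClient is the authenticated HttpClient
// built in Parts 1–2, with the AMS base address and x-ms-version /
// DataServiceVersion / Accept headers already configured.
public async Task<string> GetMediaProcessorIdAsync(string mediaProcessorName)
{
    var response = await _httpClient.GetAsync(
        $"MediaProcessors?$filter=Name eq '{mediaProcessorName}'");
    response.EnsureSuccessStatusCode();

    var json = JObject.Parse(await response.Content.ReadAsStringAsync());

    // With a plain application/json Accept header the collection comes back
    // under "value"; with odata=verbose it is nested under "d.results".
    var processors = (JArray)(json["value"] ?? json["d"]?["results"]);
    var processor = processors?.FirstOrDefault();

    if (processor == null)
        throw new InvalidOperationException(
            $"Media processor '{mediaProcessorName}' was not found.");

    return processor["Id"].ToString();
}
```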

Create a Job

In the MediaServices class, create a method that generates an encoding job:
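Here is a sketch of such a method, again against the AMS v2 REST API and the same assumed _httpClient. The job payload references the input Asset by its metadata URI and defines a single encoding task; the method name, parameters, and the "Adaptive Streaming" preset mentioned in the comments are illustrative.

```csharp
// A minimal sketch (not from the original gist): create an encoding Job via the
// AMS v2 REST API. Same assumptions as above, plus System.Text and
// System.Net.Http.Headers for the content type.
public async Task<string> CreateJobAsync(
    string jobName, string assetId, string mediaProcessorId, string configuration)
{
    // The task body tells AMS to read the job's first input asset and write the
    // encoded output into a new output asset.
    const string taskBody =
        "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
        "<taskBody><inputAsset>JobInputAsset(0)</inputAsset>" +
        "<outputAsset>JobOutputAsset(0)</outputAsset></taskBody>";

    var job = new JObject
    {
        ["Name"] = jobName,
        ["InputMediaAssets"] = new JArray
        {
            new JObject
            {
                // Reference the uploaded Asset by its REST URI.
                ["__metadata"] = new JObject
                {
                    ["uri"] = $"{_httpClient.BaseAddress}Assets('{assetId}')"
                }
            }
        },
        ["Tasks"] = new JArray
        {
            new JObject
            {
                ["Configuration"] = configuration,      // e.g. "Adaptive Streaming"
                ["MediaProcessorId"] = mediaProcessorId,
                ["TaskBody"] = taskBody
            }
        }
    };

    // The __metadata asset reference requires the verbose OData content type.
    var content = new StringContent(job.ToString(), Encoding.UTF8);
    content.Headers.ContentType =
        MediaTypeHeaderValue.Parse("application/json;odata=verbose");

    var response = await _httpClient.PostAsync("Jobs", content);
    response.EnsureSuccessStatusCode();

    var created = JObject.Parse(await response.Content.ReadAsStringAsync());

    // The new job's Id sits at the top level or under "d" depending on the
    // OData format returned.
    return (created["d"]?["Id"] ?? created["Id"]).ToString();
}
```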

Run an Encoding Job on an Asset

Now that we have the ability to request a media processor and generate a job, let’s run an encoding job on our uploaded video file from the main Program.cs file:
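Here is a sketch of how the pieces could be wired together in Program.cs. CreateAssetAsync and UploadFileAsync stand in for whatever upload helpers your MediaServices class exposes from Part 2; the other names match the illustrative methods sketched above.

```csharp
// A minimal sketch of Program.cs (names are illustrative): create an Asset,
// upload the video as in Part 2, then look up a processor and start the job.
using System;
using System.Threading.Tasks;

class Program
{
    static async Task Main(string[] args)
    {
        var mediaServices = new MediaServices(/* credentials from Part 1 */);

        // Steps from Part 2: create an Asset and upload the source video into it.
        var assetId = await mediaServices.CreateAssetAsync("my-video-asset");
        await mediaServices.UploadFileAsync(assetId, "video.mp4");

        // Request the general-purpose encoder and kick off an encoding job
        // using one of its built-in presets.
        var processorId =
            await mediaServices.GetMediaProcessorIdAsync("Media Encoder Standard");
        var jobId = await mediaServices.CreateJobAsync(
            "Encode video.mp4", assetId, processorId, "Adaptive Streaming");

        Console.WriteLine($"Encoding job started: {jobId}");
    }
}
```

With Media Encoder Standard and the "Adaptive Streaming" preset, the job should produce a multi-bitrate MP4 set suitable for streaming; any of the other processors listed above can be requested and run the same way.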

Congratulations! You should now have a video encoding job running on your newly uploaded asset.

Your code should now look like this.

You now have everything you need to upload video files to Azure Media Services and run a number of different encoding jobs.
