Published in The ASOS Tech Blog

Sending messages from Azure Data Factory to Service Bus

There is no built-in Azure Data Factory (ADF) activity for Service Bus messaging. A simple option for sending messages to Service Bus queues and topics is to use the Service Bus REST API with ADF's Web Activity.


In the Data Lake team at ASOS, we want to notify data consumers that new data is available as soon as our ADF data processing pipeline completes. Azure Service Bus topics are used to publish messages with details of new data available, so that consumers can subscribe to updates. When looking for ways to send messages to Service Bus from ADF we found suggested solutions using an Azure Functions Activity or a Logic Apps Activity, but these would require provisioning additional resources for the Azure Function or Logic App. We found that a simpler solution was to use the Web Activity with its built-in authentication handling.


We need to grant permission for the data factory to send messages to the Service Bus queue or topic. Assign the Azure Service Bus Data Sender role to the data factory’s managed identity. The simplest way to get started is to do this in the Azure portal, following the Microsoft documentation. We have automated this role assignment as part of our deployment ARM template. Alternatively, you can make the role assignment using the Azure Management API, the Azure CLI or the Az PowerShell module.
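As a sketch, the role assignment can be made with the Azure CLI along these lines. The resource names (my-adf, my-rg, my-sb-namespace) are placeholders, and the subscription ID must be filled in for your environment:

```shell
# Look up the data factory's managed identity (principal ID).
# Names here are placeholders for your own resources.
principalId=$(az datafactory show \
  --name my-adf \
  --resource-group my-rg \
  --query identity.principalId -o tsv)

# Grant the Azure Service Bus Data Sender role, scoped to the namespace.
az role assignment create \
  --assignee "$principalId" \
  --role "Azure Service Bus Data Sender" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.ServiceBus/namespaces/my-sb-namespace"
```

Scoping the assignment to a single queue or topic instead of the whole namespace also works, if you want to grant the minimum required access.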

Creating the Web Activity

Web Activity in the Azure Data Factory Studio


In the activity settings (in the web UI) or the type properties (ARM template), set the method to POST. The URL is formed from the namespace and the queue or topic name as follows: https://{serviceNamespace}.servicebus.windows.net/{queuePath|topicPath}/messages.


For authentication, select Managed Identity authentication from the advanced options in the web activity, and set the resource value to https://servicebus.azure.net.

Broker Properties

Any values for broker properties, for example the message correlation ID, should be encoded as a JSON object and sent as the value of the BrokerProperties request header.
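Outside of ADF, the same request headers can be sketched in Python. The property values below (the correlation ID, label and custom property) are hypothetical, but the structure follows the REST API: broker properties are JSON-encoded into a single BrokerProperties header, while any other non-reserved header becomes a custom property on the message:

```python
import json
import uuid

# Hypothetical broker properties for a Service Bus message sent via the REST API.
# Key names follow the BrokerProperties schema (e.g. CorrelationId, Label).
broker_properties = {
    "CorrelationId": str(uuid.uuid4()),  # hypothetical correlation ID
    "Label": "new-data-available",       # hypothetical message label
}

# The whole object is JSON-encoded and passed as one request header.
# The extra DatasetName header (a hypothetical example) is not a reserved key,
# so it would arrive as a custom property on the message.
headers = {
    "BrokerProperties": json.dumps(broker_properties),
    "Content-Type": "application/json",
    "DatasetName": "daily-orders",
}
```

In ADF these are simply entered as header name/value pairs on the Web Activity; the sketch above just makes the encoding explicit.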

Custom Message Headers

Any request headers other than the reserved keys will be included as custom properties on your Service Bus message. The reserved keys are Authorization, BrokerProperties, Content-Type and x-ms-retrypolicy.


Set the appropriate Content-Type request header value to match your message payload. In the example below, the message is a JSON object, so the content type is application/json. It is likely that you will want to dynamically populate the message body with values from your pipeline, so I would highly recommend reviewing the ADF expression syntax reference to make sure you get the syntax correct.
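As a sketch of a dynamically populated body, the following uses the ADF expression functions pipeline() and utcnow() inside string interpolation; the field names themselves are hypothetical:

```
{
    "pipelineRunId": "@{pipeline().RunId}",
    "completedAtUtc": "@{utcnow()}"
}
```

The @{...} interpolation is evaluated by ADF at run time, so the message consumer receives the resolved values rather than the expressions.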

Example Activity Definition

The activity settings viewed in Azure Data Factory Studio
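For reference, a Web Activity definition along these lines can be expressed in ARM/pipeline JSON. This is a sketch, not the exact definition from our pipeline; the activity name, namespace, topic name and body fields are placeholders:

```json
{
    "name": "SendServiceBusMessage",
    "type": "WebActivity",
    "typeProperties": {
        "method": "POST",
        "url": "https://my-sb-namespace.servicebus.windows.net/my-topic/messages",
        "headers": {
            "Content-Type": "application/json"
        },
        "authentication": {
            "type": "MSI",
            "resource": "https://servicebus.azure.net"
        },
        "body": {
            "dataset": "daily-orders",
            "status": "complete"
        }
    }
}
```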

Example Output

Service Bus message viewed in the Azure Data Factory Studio


About The Author

Hugh is a senior integration engineer in the Data Lake team at ASOS. In his free time he enjoys white water kayaking and drinking coffee.




