Streamlining Azure Integration Services via Microsoft Entra ID using Terraform

Empowering your Azure ecosystem with advanced methods using Microsoft Entra ID/Azure AD, discussed with real-time use cases

DevOpsWithYoge
8 min read · May 1, 2024

This blog gives a wider perspective on how to use Microsoft Entra ID's app registrations, service principals, Azure AD groups, and managed identities in Azure, with Terraform code samples.

With this approach you can eliminate the usual connection-string connectivity between resources and achieve smooth, secure connectivity between Azure Integration Services resources such as Function Apps, Logic Apps, APIM, Key Vault, storage accounts, etc.

Context

  • Basics of Microsoft Entra ID Components
  • Effective Infrastructure Deployment Pattern
  • Terraform Code Explanation
  • Use Case Scenarios

Basics of Microsoft Entra ID Components:

If you are interested in how Microsoft Entra authentication works, you can refer to this blog: https://medium.com/@devopswithyoge/microsoft-entra-id-modern-authentication-how-it-works-aa20c8e9649f

Infrastructure Pattern for Microsoft Entra ID Set up with AIS

Fig 1.0 Overall Architecture Diagram for Microsoft Entra Pattern

In the above architecture diagram you can see that the Logic Apps and Function Apps are internally connected to APIM and have access to Key Vault and the storage account by leveraging Microsoft Entra ID components such as app registrations (with custom roles for APIM) and Azure AD groups. I will explain it step by step.

For clarity of explanation I have used repetitive azuread provider blocks; a standardized (modularized) version will be hosted on my GitHub soon.

Check out easy Terraform module creation: https://medium.com/@devopswithyoge/terraform-modules-how-to-use-best-out-of-it-with-real-time-usage-scenarios-bc66a8fde9b7

Terraform Code Explanation

Microsoft Entra ID — App registrations and Service Principals with APIM

App Registrations 📝

  • In app registrations we can create separate backends with custom AAD roles for APIM; these roles are granted as role assignments to the respective service principals (backends).
  • Since the Logic and Function Apps are added as members through AAD role assignments, APIM has access to trigger these apps, and vice versa.

It is necessary to expose our APIM APIs in a secured manner; using token authentication in the inbound policy is usually recommended.

So let's consider an example where I want to expose two API groups: cart and currency-convert.

cart => GET and POST APIs

currency-convert => GET and POST APIs

Now I will add the inbound policy for the cart APIs with the targeted backend ID (the app registration's client ID).

  1. Create a backend app registration for the cart APIs and add custom roles
data "azuread_client_config" "current" {}

resource "azuread_application" "example" {
  display_name     = "cart-backend"
  identifier_uris  = ["api://cart-backend"]
  owners           = [data.azuread_client_config.current.object_id] # ideally the ID of the service principal you created for the Terraform deployment
  sign_in_audience = "AzureADMyOrg"

  api {
    requested_access_token_version = 2 # ensure version 2 access tokens
  }

  # AAD app custom roles
  app_role {
    allowed_member_types = ["User", "Application"]
    description          = "Allows get requests to the cart api operation"
    display_name         = "cart.get"
    enabled              = true
    id                   = "11111111-1111-1111-1111-111111111111" # any stable UUID; required by recent azuread provider versions
    value                = "cart.get"
  }

  app_role {
    allowed_member_types = ["User", "Application"]
    description          = "Allows post requests to the cart api operation"
    display_name         = "cart.post"
    enabled              = true
    id                   = "22222222-2222-2222-2222-222222222222" # any stable UUID; required by recent azuread provider versions
    value                = "cart.post"
  }
}

# Password rotation
resource "time_rotating" "example" {
  rotation_days = 90
}

# Password creation
resource "azuread_application_password" "example" {
  application_id = azuread_application.example.id
  rotate_when_changed = {
    rotation = time_rotating.example.id
  }
}

resource "azuread_service_principal" "example" {
  client_id                    = azuread_application.example.client_id
  app_role_assignment_required = false
  owners                       = [data.azuread_client_config.current.object_id]
}

2. Add the backend client ID to the API inbound policy, which sets up JWT authentication for your API.

<!-- Sample inbound policy -->
<policies>
  <inbound>
    <base />
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized. Access token is missing or invalid.">
      <openid-config url="https://login.microsoftonline.com/<!--Use Tenant ID here-->/.well-known/openid-configuration" />
      <audiences>
        <audience><!-- use cart-backend client ID --></audience>
      </audiences>
      <required-claims>
        <claim name="roles" match="all">
          <value>cart.get</value> <!-- sample use of the custom role for the cart GET API -->
        </claim>
      </required-claims>
    </validate-jwt>
  </inbound>
</policies>
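Conceptually, the `validate-jwt` policy checks the token's audience and its `roles` claim (it also validates the signature against the keys from the OpenID configuration, which this sketch omits). A minimal Python illustration of the claim checks, with hypothetical helper names:

```python
import base64
import json


def b64url_decode(segment: str) -> bytes:
    """Decode a base64url JWT segment, restoring any stripped padding."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))


def check_claims(jwt: str, expected_audience: str, required_roles: set) -> bool:
    """Mimic the policy's claim checks: the audience must match and every
    required role must appear in the token's 'roles' claim. Signature
    verification (which APIM performs via the OpenID config) is omitted."""
    payload = json.loads(b64url_decode(jwt.split(".")[1]))
    if payload.get("aud") != expected_audience:
        return False
    return required_roles.issubset(set(payload.get("roles", [])))
```

A token whose `aud` is the cart-backend client ID and whose `roles` list contains `cart.get` passes; anything else is rejected with a 401 by the policy.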

3. Internal access: Logic Apps to APIM

Add an app role assignment for the logic app (ensure managed identity is enabled for the logic app) on the backend audience, i.e., the APIM API the logic app is going to call internally.

# Add an app role assignment for the logic app's service principal on the backend audience
resource "azuread_app_role_assignment" "example" {
  app_role_id         = azuread_service_principal.example.app_role_ids["cart.get"]
  principal_object_id = azurerm_logic_app_standard.logic_app.identity[0].principal_id
  resource_object_id  = azuread_service_principal.example.object_id
}
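At runtime, the logic app's system-assigned identity obtains its token from Azure's instance metadata identity endpoint (not something you write Terraform for; in a Logic App Standard workflow you would normally just pick "Managed identity" authentication on the HTTP action and set the audience). As an illustration only, the helper below builds the documented VM-style IMDS request; App Service-hosted apps use the `IDENTITY_ENDPOINT`/`IDENTITY_HEADER` environment variables instead, and `api://cart-backend` is the backend's application ID URI from the Terraform above.

```python
def managed_identity_token_request(resource: str) -> dict:
    """Build the IMDS request a workload would send to get a token for its
    system-assigned managed identity (VM-style endpoint; App Service-hosted
    apps such as Function/Logic apps use IDENTITY_ENDPOINT instead)."""
    return {
        "method": "GET",
        "url": "http://169.254.169.254/metadata/identity/oauth2/token",
        "params": {
            "api-version": "2019-08-01",
            "resource": resource,  # e.g. "api://cart-backend"
        },
        "headers": {"Metadata": "true"},  # required; IMDS rejects requests without it
    }
```

The returned `access_token` is then sent as `Authorization: Bearer <token>` to the APIM API, where the inbound policy validates it.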

Similar to the logic app, you can also add function apps or Service Bus: to send messages directly to Service Bus through an APIM API, you have to set up the necessary backend policy and add RBAC for Service Bus to APIM (the same applies to Event Hubs).

4. External access: access APIM from Postman or from an external application

Create a separate app registration and add an app role assignment for the API service you want to expose.

Below is an example where I have added the cart API role assignment for the external app registration.

Now, from the external application or from Postman, you fetch a bearer token for the external app registration; that token is enough to access the internal API securely.
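The token fetch is a standard OAuth2 client-credentials call against the Microsoft identity platform v2.0 endpoint. The helper below builds that request (the tenant ID, client ID, and secret are placeholders; the secret is the `azuread_application_password` value generated below):

```python
def client_credentials_token_request(tenant_id: str, client_id: str,
                                     client_secret: str, scope: str) -> dict:
    """Build the OAuth2 client-credentials request an external caller
    (e.g. Postman) POSTs to obtain a bearer token for the APIM API."""
    return {
        "url": f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        "data": {
            "grant_type": "client_credentials",
            "client_id": client_id,          # external-audience app registration
            "client_secret": client_secret,  # the azuread_application_password value
            "scope": scope,                  # e.g. "api://cart-backend/.default"
        },
    }
```

POST this form body (e.g. `requests.post(req["url"], data=req["data"])`) and pass the returned `access_token` as `Authorization: Bearer <token>` when calling the APIM API.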

data "azuread_client_config" "current" {}

resource "azuread_application" "external" {
  display_name     = "external-audience"
  identifier_uris  = ["api://external-audience"]
  owners           = [data.azuread_client_config.current.object_id] # ideally the ID of the service principal you created for the Terraform deployment
  sign_in_audience = "AzureADMyOrg"

  api {
    requested_access_token_version = 2 # ensure version 2 access tokens
  }
}

# Password rotation
resource "time_rotating" "external" {
  rotation_days = 90
}

# Password creation
resource "azuread_application_password" "external" {
  application_id = azuread_application.external.id
  rotate_when_changed = {
    rotation = time_rotating.external.id
  }
}

resource "azuread_service_principal" "external" {
  client_id                    = azuread_application.external.client_id
  app_role_assignment_required = false
  owners                       = [data.azuread_client_config.current.object_id]
}

# App role assignment: grant the external audience the cart.get role on the cart backend
resource "azuread_app_role_assignment" "role_assignment" {
  app_role_id         = azuread_service_principal.example.app_role_ids["cart.get"] # cart backend SP from step 1
  principal_object_id = azuread_service_principal.external.object_id               # external backend audience
  resource_object_id  = azuread_service_principal.example.object_id                # cart api backend audience
}

Microsoft Entra ID — Azure Ad Groups

Azure AD Groups 👮👮

  • These AD groups have RBAC (role-based access control) assignments on resources such as Key Vault and storage accounts.
  • They also have all the Logic and Function Apps as members via AAD role assignments using their system-assigned managed identities, which in turn gives all the Logic and Function Apps access to the respective resources.
  • This avoids granting repetitive RBAC access one by one to each Logic and Function App.

Let's discuss a simple example by creating an Azure AD group for a function app:

  • Create a basic Azure AD group
  • Assign RBAC roles to the AD group for the storage account and Key Vault
  • Add the Azure Function App as a member of the AD group, which will then have permissions on the storage account and Key Vault
data "azuread_client_config" "current" {}

# Azure AD group creation
resource "azuread_group" "example" {
  display_name     = "function-app-group"
  owners           = [data.azuread_client_config.current.object_id]
  security_enabled = true
}

# Add RBAC for the storage account
resource "azurerm_role_assignment" "rbac_storage" {
  scope                = azurerm_storage_account.storage-account.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azuread_group.example.object_id
}

# Add RBAC for the key vault
resource "azurerm_role_assignment" "rbac_keyvault" {
  scope                = azurerm_key_vault.key-vault.id
  role_definition_name = "Key Vault Secrets User"
  principal_id         = azuread_group.example.object_id
}

# Add the function app as a member of the Azure AD group,
# which grants it access to the storage account and key vault
resource "azuread_group_member" "example" {
  group_object_id  = azuread_group.example.object_id
  member_object_id = azurerm_windows_function_app.function-app.identity[0].principal_id # ensure managed identity is enabled during function app deployment
}

You can even add the external-backend app registration as a member of these Azure AD groups, so that an external application can securely access the Azure resources with a bearer token.
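For illustration, once an external principal in the group holds a bearer token (acquired with scope `https://vault.azure.net/.default`), reading a secret is a single REST call. The helper below builds that request; the vault and secret names are placeholders, and `7.4` is a current Key Vault REST API version:

```python
def key_vault_secret_request(vault_name: str, secret_name: str, bearer_token: str) -> dict:
    """Build the Key Vault REST request for reading a secret with a bearer
    token. The calling identity must hold a role such as 'Key Vault
    Secrets User', here granted via membership of the AD group above."""
    return {
        "method": "GET",
        "url": f"https://{vault_name}.vault.azure.net/secrets/{secret_name}",
        "params": {"api-version": "7.4"},
        "headers": {"Authorization": f"Bearer {bearer_token}"},
    }
```

Sending this request (e.g. with `requests`) returns the secret's value in the JSON response if the RBAC assignment is in place.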

Now let's discuss a few real-time scenarios 😊🕒

Use Case Scenarios

Below are a few sample implementations in which the Microsoft Entra ID setup given above helps with interconnectivity between the Azure Integration Services.

Fig 1.1 Real Time use case Scenarios interdependency diagram

Consider a scenario where an external system or user calls the APIM APIs with SSL auth; the prerequisites are that their IPs are whitelisted in our firewall and that your network security group has DDoS protection enabled.

For example, let's consider two implementations:

🛠️Implementation 1 :

  • A user makes a call to APIM, and in turn the API triggers the logic or function app via an HTTP trigger; the app processes the input data and stores it in the storage account.

🛠️Implementation 2 :

A file is received in XML format; first, a logic app canonically maps it to JSON using a Liquid map, and then the JSON is sent to the function app, which converts it to the targeted format and saves it to the storage account.

  • In this scenario, the input is received from the external source system into the storage account on a daily basis, and this data must be transformed and stored back in the storage account.
  • For this use case the logic app is timer-triggered: once the file is received in the storage account, the logic app runs at 14:00 every day and in turn calls the APIs.
  • These APIs trigger the function app, which maps the item values against various conditions and stores the result in the storage account.
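The function-app mapping step in implementation 2 could be sketched as below. The field names (`id`, `price`) and the conditional rule are hypothetical, just to show the shape of a canonical-JSON-to-target-format transform before the result is written to the storage account:

```python
import json


def transform_items(canonical_json: str) -> str:
    """Hypothetical function-app step: map each item from the canonical
    JSON (produced by the logic app's Liquid map) into the target format,
    applying simple conditional rules along the way."""
    items = json.loads(canonical_json)["items"]
    transformed = []
    for item in items:
        price = float(item["price"])
        transformed.append({
            "sku": item["id"].upper(),          # normalize the identifier
            "price": round(price, 2),
            "tier": "premium" if price >= 100 else "standard",  # example condition
        })
    return json.dumps({"items": transformed})
```

The resulting JSON string is what the function app would upload as a blob to the storage account.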

These are two sample scenarios; you can expand them with various other Azure resources such as Service Bus, Azure SQL, Azure Cosmos DB, and many more.

Conclusion:

I hope the above examples, with the given deployment pattern, can be used effectively and will enable infra developers and DevOps engineers to keep the code base clean and secure.

Give it a 👏Clap if you enjoyed this content! 🤝 Don’t forget to hit that follow button for more exciting updates! Your support fuels my creativity! 🚀

Reference:

Terraform Azure Ad provider
