Abhishek
8 min read · Dec 12, 2018

Layered Architecture with Azure API Management, Azure Functions, Azure Key Vault and Cosmos Graph Database

Introduction

At the recent Microsoft Connect event, the API Management product team rolled out a new consumption-based pricing model for Azure API Management instances. This announcement is a big step towards moving Azure API Management into the same league as serverless platforms like Azure Functions, Logic Apps, Data Factory and Cosmos DB.

As an application architect, developer or owner, what does this announcement mean to you? With the new consumption-based pricing model, your enterprise can start small and still take advantage of much of the feature set that comes with the fixed pricing tiers. With an API Management facade layer in front of your APIs (new or existing), you can leverage complex capabilities out of the box, such as security, caching, rate limiting, versioning, content negotiation and so on.

In a cloud-first strategy, choosing the right set of tooling and platforms is essential to the overall success of any enterprise application development. In this article we will describe how you can leverage API Management along with Azure Functions, Azure Key Vault and Cosmos DB to build highly scalable API endpoints which can be used across your mobile, web or desktop applications. To get a feel for how the application should look, it is always good practice to draw and visualise it. Note that this architecture is preferred for synchronous communication; if you have an async messaging pattern, then include a messaging layer like Service Bus, Event Hubs or Kafka. The high-level architecture for this article is shown below.

Create a Cosmos graph database and an Azure Functions API using Gremlin.Net

The first step here is to build an instance of the Cosmos graph database. As the concept of graphs and the provisioning process have been nicely covered in the Microsoft documentation at https://docs.microsoft.com/en-us/azure/cosmos-db/create-graph-dotnet and in our earlier big data article https://medium.com/@abhishekcskumar/cosmos-graph-database-big-data-processing-with-azure-data-factory-functions-and-event-grid-31b056a285d7, we will not discuss this topic in detail and assume that you are already familiar with graph database concepts.

For this article we have loaded our Cosmos graph with a set of vertices and connected those vertices through edges with the appropriate relationships (properties). We have used an async POST operation on the Gremlin.Net driver to load our Cosmos graph database. If you are new to this, you can easily replicate the process within your web API, function or console application by importing the Gremlin.Net driver NuGet package. Sample code to perform the POST operation on the graph database is shown below.
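What follows is a minimal sketch of that POST operation, assuming a hypothetical GraphClientHelper class; the hostname, key, database (sampledb) and collection (socialgraph) names are placeholders for your own values.

using System.Threading.Tasks;
using Gremlin.Net.Driver;
using Gremlin.Net.Structure.IO.GraphSON;

public static class GraphClientHelper
{
    // Placeholder connection values; replace with your own Cosmos DB account details
    private const string Hostname = "your-cosmos-account.gremlin.cosmosdb.azure.com";
    private const string AuthKey = "<cosmos-primary-key>";

    public static async Task AddUserVertexAsync(string id, string name)
    {
        // Cosmos DB expects the username in the form /dbs/{database}/colls/{graph}
        var server = new GremlinServer(Hostname, 443, enableSsl: true,
            username: "/dbs/sampledb/colls/socialgraph", password: AuthKey);

        using (var client = new GremlinClient(server, new GraphSON2Reader(),
            new GraphSON2Writer(), GremlinClient.GraphSON2MimeType))
        {
            // POST (create) a vertex with a unique id and a Name property
            var query = $"g.addV('user').property('id','{id}').property('Name','{name}')";
            await client.SubmitAsync<dynamic>(query);
        }
    }
}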

Note here: in a real-world use case, before you perform a POST operation to create a vertex or edge, do a GET operation first to see whether you need to POST, PATCH or leave the vertex as it is. This practice saves unnecessary updates to the graph database and gives you optimum utilisation of your existing code.
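As an illustration, Gremlin also offers a single-query alternative to the explicit GET-then-POST check: the fold()/coalesce() upsert pattern sketched below creates the vertex only if it does not already exist (the 'user' label and U001 id come from our sample graph).

g.V().has('user','id','U001').fold()
 .coalesce(unfold(),
           addV('user').property('id','U001').property('Name','SampleUser1'))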

As this async function is called through an Azure Functions HTTP binding, we have used an HTTP client to load the sample data into our graph database. You can also use Postman as a simple tool to load the data on the fly.

When you build your graph structure, always remember to set your vertex/node id to the record's unique identifier; if you do not have a unique identifier for your node, then set a new GUID as the identifier. This will help you control duplicate record pushes into the graph database.

Here you can test the bulk insert feature of the graph by running a single queryString to import all vertices at once. You need to initialise the traversal once through g.addV() and, once done, you can append more vertex details or edge properties to the queryString.
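For example, a bulk-insert queryString built this way might look like the hypothetical snippet below, where each additional vertex is appended to the same traversal (client is the GremlinClient from the earlier sketch).

var queryString = "g.addV('user').property('id','U001').property('Name','SampleUser1')" +
                  ".addV('user').property('id','U002').property('Name','SampleUser2')" +
                  ".addV('user').property('id','U003').property('Name','SampleUser3')";
await client.SubmitAsync<dynamic>(queryString);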

As part of this exercise we have created a social graph with vertices ranging from U001 to U015. Looking into the graph structure through the API, we can easily find how vertex U001 is related to U015, or which vertex is the densest one with the maximum number of edges in the overall graph.
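For instance, standard TinkerPop traversals along the lines of the two sketches below would answer those questions; edge directions are assumptions about the sample graph, and step support should be checked against the Cosmos DB Gremlin supported-step list.

// How is U001 related to U015? Walk outgoing edges until U015 is reached
g.V('U001').repeat(out().simplePath()).until(hasId('U015')).path().limit(1)

// Which vertex is the densest, i.e. has the most edges?
g.V().order().by(bothE().count(), decr).limit(1)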

Azure Functions to GET contact vertex information from the graph

Moving along with this article, we have created a simple GET API endpoint through Azure Functions which takes a username as a parameter and returns the relevant contact data from the graph database. The query executed on the Gremlin.Net driver will be like g.V().hasLabel('user').has('Name','SampleUser1') or simply

g.V().has('Name','SampleUser1')

And the Azure function which calls this async method is listed below. A best practice here is to set the HTTP verb to GET instead of the default (GET, POST). The reason is that this gives a cleaner import operation when you later import the Azure Functions definition into API Management.
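Here is a sketch of such a function, assuming the Functions v2 C# programming model; GraphClientHelper.QueryUserAsync is an assumed helper that runs the Gremlin query shown above.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class GetVerticesByNameFunction
{
    // HTTP verb restricted to "get" so the API Management import stays clean
    [FunctionName("GetVerticesByName")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req)
    {
        string name = req.Query["name"];
        if (string.IsNullOrEmpty(name))
            return new BadRequestObjectResult("Please pass a name on the query string");

        // Run the Gremlin lookup and return the matching vertices
        var vertices = await GraphClientHelper.QueryUserAsync(name);
        return new OkObjectResult(vertices);
    }
}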

Another thing to note in the above design is that we have implemented Azure Key Vault and Managed Service Identity as the authentication mechanism to retrieve the Cosmos DB connection properties. This way we have secured the backend environment from any unwanted access. Previously, to do this, we had to write an extension method to fetch secrets from Azure Key Vault, like the one below.
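A sketch of that helper, using the Microsoft.Azure.Services.AppAuthentication library to obtain a Key Vault token from the function app's managed identity, would look roughly like this:

using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.Azure.Services.AppAuthentication;

public static class KeyVaultHelper
{
    // Uses the function app's managed identity; no credentials live in code or config
    public static async Task<string> GetSecretAsync(string secretUri)
    {
        var tokenProvider = new AzureServiceTokenProvider();
        var keyVaultClient = new KeyVaultClient(
            new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));

        var secret = await keyVaultClient.GetSecretAsync(secretUri);
        return secret.Value;
    }
}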

Note here: with a recent update from the Key Vault and Functions teams, you no longer need this extension method to retrieve secrets from Key Vault. You can simply access the function's application setting as it is and use the @Microsoft.KeyVault reference to get the secret out of Key Vault at runtime. The process is to store the secrets in Key Vault and use the Key Vault secret HTTP endpoint along with the @Microsoft.KeyVault reference to retrieve the value without changing a single line of existing code.

So the new application setting should look like below

@Microsoft.KeyVault(SecretUri=https://wikisamplekeyvault01.vault.azure.net/secrets/hostname/6df428bc6390472293891f19813df2a3)

And the secret retrieval from the function would be as simple as reading a host or application setting, like below.
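In other words, something as plain as the line below, where hostname is the application setting defined above:

string hostname = System.Environment.GetEnvironmentVariable("hostname");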

Once Azure Key Vault is configured and the access policy has been set on the Key Vault instance, test the GET API through Postman and verify the response.
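For example, a test request would look something like this (the function app name and key are placeholders):

GET https://<your-function-app>.azurewebsites.net/api/GetVerticesByName?name=SampleUser1&code=<function-key>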

The next step to complete this article is to import the Functions definition into the consumption-based API Management instance and enable a cache policy using Azure Redis Cache.

Import the Cosmos graph database API functions into Azure API Management

This process is straightforward. To perform this task, we first need to create a Redis Cache instance and a consumption-based API Management instance.

To do this, go to your resource group, click on the Add blade and search for the Redis Cache resource. Once found, enter a proper name for the Redis Cache instance and select the right pricing tier as per your caching requirements. For this article, and as a general practice, I personally start with the lowest unit and then move to a higher pricing tier as business requirements demand.

Once you have clicked on the Add blade, it might take a couple of minutes for your Redis instance to appear within your resource group.

Once we are done with the Redis Cache resource creation, the next step is to add an API Management instance on the consumption plan and associate the Redis Cache as the external cache framework. To do this, search this time for API Management; once found, enter the required details and select the consumption-based pricing plan.

Click on Create, and within a few minutes you will see a new consumption-based API Management instance created within your resource group. To add the external cache resource, click on the newly created API Management instance and then on External cache. This will prompt you with the cache import screen; click on Add and associate your Redis Cache instance with the newly created API Management instance.

In the next step we will import the Functions API for the Cosmos graph. To do this, click on APIs and then select Function App as the source of our API. Once imported, you can see through the console window the list of all API implementations within the Function App, with the appropriate HTTP verbs.

To enable caching on one of the GET APIs, click on the function API (in this case GetVerticesByName), select the cache response policy in the inbound policy section, set the duration to 3600 and click on Save so that your cached data is valid for an hour.
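Behind that portal experience the configuration is expressed in API Management's policy XML; a rough sketch, assuming we vary the cache by our name query parameter, is shown below.

<policies>
    <inbound>
        <base />
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false">
            <vary-by-query-parameter>name</vary-by-query-parameter>
        </cache-lookup>
    </inbound>
    <outbound>
        <base />
        <!-- keep responses in Azure Redis Cache for one hour -->
        <cache-store duration="3600" />
    </outbound>
</policies>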

Now the last step is to test the imported API with the caching policy on. To do this, pass the name as a queryString parameter to the API endpoint. The first request will be a bit slow because the API retrieves the result from the Cosmos graph database, but subsequent requests with the same query parameter will be blazing fast due to the cached response. You can verify the cache policy by looking into the trace section of the API call.

Conclusion

There are multiple benefits to using API Management in front of your Cosmos database API; some of them are listed below:

· Throttling limits can be controlled at the API level instead of the database level

· Cache policies and cached responses can increase overall application performance and decrease database hits

· Security can be controlled at multiple levels: the external endpoint is secured by API Management while the internal Cosmos keys are secured within Key Vault

· Highly scalable with the addition of an external cache framework

· All the services are PaaS and serverless offerings, so you pay only for your usage
