Azure Serverless Architecture Considerations

Mohan Nagaraj
Turbo360 (Formerly Serverless360)
10 min read · Aug 13, 2018

Within this article, we are going to dive deeper into different serverless architecture patterns and the considerations you should evaluate when modernizing existing architectures to take advantage of serverless computing.

Evolving Architectures

Architectures are bound to change. It wasn’t long ago that monolithic applications, including enterprise resource planning (ERP) systems, were in vogue. At the time, there was a perception that keeping all of your business modules within a single application made it easier to maintain and reduced component sprawl. However, the dependencies within the monolith made it difficult to change one part of the system without impacting other parts. Maintenance activities also commonly required long outages. Naturally, the long lead times frustrated business units trying to implement new features, and the long outages made it difficult for them to run their business.

N-tier architecture

N-tier architecture emerged, in part, with the rise of the internet, where thin clients (web browsers) served as the user interface (UI) and made requests to a web server. By introducing a business logic layer between your UI and your back-end database, you had more options for scaling out your application. However, with these types of applications able to scale out, you now had to manage a larger infrastructure footprint, especially when you consider maintaining multiple environments (testing/QA/production). While these applications provided some agility by allowing you to scale up or out different tiers in the system, they required large infrastructure deployments that made maintenance challenging and expensive.

Microservices

Microservices focus on breaking a solution down into more discrete services that are isolated from other components within the system. Microservices encourage developers to write code around specific business domains and to have a service do only one thing, but do it really well! Using this approach, you can independently manage and upgrade these discrete services. However, this level of granularity comes with additional challenges, such as how to orchestrate all of these different calls to provide a cohesive experience for the end user. To address these orchestration needs, API gateways provide some level of abstraction across the microservices. They also offer more agility, since you can redirect service calls without modifying the underlying services.

While Infrastructure as a Service (IaaS) architectures provide some benefits for organizations, including consumption-based billing, IaaS still involves a lot of management overhead to keep systems up to date from a patching and back-up perspective. Platform as a Service (PaaS) architectures address some of these management challenges by abstracting away much of the underlying infrastructure, but they do not deliver truly consumption-based, dynamic scale. For example, an Azure App Service web app is a PaaS component, yet with this model the website is always deployed because it needs to be ready to serve a request. Yes, the web app can dynamically scale out (and back in), but there will always be at least one instance running, and you will pay for it.

Serverless Architectures

Serverless architectures differ from these previous architectures, in part, by being event-driven. As a result, residual charges do not apply; you only pay for what you use, based on executions. Dealing with scale largely becomes the cloud provider’s problem, and it is the provider that needs to honor the service level agreement (SLA) it offers. As an architect, the expectation is that your application performs well regardless of load. Naturally, this requires trust from the customer organization, but it also introduces the need to monitor the health and performance of your application. Serverless360 is a tool that can help with these monitoring needs.

Serverless Architecture Patterns

Here are some patterns that are well suited to serverless architectures.

Web applications

Web applications, especially single page applications (SPAs), are well suited to serverless architectures. Within these types of applications, the UI typically makes back-end API calls to retrieve data, which the web browser then displays. In many circumstances, these API calls function much like microservices, in that each one accomplishes a single function. With this in mind, Azure Functions may be a great place to host these API calls, as shown in the sketch below. Each of these functions can scale on demand and is metered on a consumption basis in the same manner.
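
Here is a minimal sketch of an HTTP-triggered Azure Function that could back a SPA API call. It assumes the Azure Functions Python "v2" programming model; the route name, query parameter, and static response data are illustrative only, not part of the original article.

```python
import json
import azure.functions as func

app = func.FunctionApp()

@app.route(route="orders", auth_level=func.AuthLevel.FUNCTION)
def get_orders(req: func.HttpRequest) -> func.HttpResponse:
    # In a real app this would query a data store; here we return static data.
    customer_id = req.params.get("customerId")
    orders = [{"id": 1, "customer": customer_id, "status": "shipped"}]
    return func.HttpResponse(
        json.dumps(orders),
        mimetype="application/json",
        status_code=200,
    )
```

Because the function only runs when the SPA calls it, you are billed per execution rather than for an always-on web tier.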

The tooling for Azure Functions has continued to improve since their launch. Support for debugging Azure Functions exists inside both Visual Studio and VS Code. In addition, Azure Functions Core Tools allows functions to be developed and tested on a local computer.

Mobile back-ends

Deploying mobile applications to app stores can be a very time-consuming and tedious process. By including business logic within Azure Functions, we can shift where this logic executes and minimize the deployment impact of changes to that business logic. In addition, using event-driven triggers creates new opportunities to interact with your end users. Instead of a mobile application polling for changes in the back end, event-based triggers can notify the mobile application of event or state changes.
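
As a hedged sketch of that idea (again using the Python v2 model): instead of the mobile app polling, a queue message about an order status change fires a function, which forwards a push notification. The queue name, connection setting, payload shape, and the notify_device helper are assumptions for illustration.

```python
import json
import azure.functions as func

app = func.FunctionApp()

@app.queue_trigger(arg_name="msg", queue_name="order-status-changes",
                   connection="StorageConnection")
def push_status_change(msg: func.QueueMessage) -> None:
    change = json.loads(msg.get_body().decode("utf-8"))
    # Hand the state change to a (hypothetical) push notification helper.
    notify_device(change["deviceId"], f"Order {change['orderId']} is now {change['status']}")

def notify_device(device_id: str, message: str) -> None:
    # Placeholder: in practice this would call a push notification service such as Azure Notification Hubs.
    print(f"push -> {device_id}: {message}")
```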

Internet of Things

Internet of Things (IoT) deployments come in many shapes and sizes. In some situations, the “things” may be connected via a local area network, like a vending machine. In other circumstances, the “things” may be mobile devices that are directly connected to the internet. In either scenario, having a consumption-based trigger waiting to service a request is advantageous from a billing perspective. Generally, IoT solutions produce a lot of traffic based on the events they generate, so there may be a need to do some level of data aggregation or calculation on these data streams. An Azure Function may be a good option for performing these calculations, as Microsoft will take care of the scale demands imposed by the IoT devices.
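
The following is a minimal sketch of the kind of rolling aggregation a function handler might apply to a batch of device readings, for example inside an Event Hub-triggered Azure Function. The reading format is an assumption for illustration.

```python
from statistics import mean

def aggregate_telemetry(readings: list[dict]) -> dict:
    """Reduce a batch of {'deviceId': ..., 'temperature': ...} readings to per-batch stats."""
    temps = [r["temperature"] for r in readings]
    return {
        "count": len(temps),
        "min": min(temps),
        "max": max(temps),
        "avg": mean(temps),
    }

# Example usage with a small batch of vending-machine temperature readings.
batch = [{"deviceId": "vend-01", "temperature": t} for t in (4.1, 4.4, 3.9)]
print(aggregate_telemetry(batch))
```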

More recently, a concept called edge computing has been introduced. The need for computing on the edge of a network comes from the latency that naturally exists between a local network and a cloud instance reached over the internet (or even a WAN connection). For some IoT scenarios, milliseconds count. To address these low-latency requirements, Microsoft has introduced Azure IoT Edge modules, which are capable of providing an Azure Functions host within a Docker container. While technically this code doesn’t run in the cloud, it is a powerful pattern that is worth calling out.

Integration

There are many opportunities for serverless integration solutions. For a moment, consider the traditional model of purchasing middleware. There is typically a long sales cycle in selecting the preferred vendor. Once you choose a vendor, the next question is how many cores/CPUs/compute units you need to license. This has traditionally been a challenge for everyone. If you are lucky, you may have some existing integrations you can use to forecast demand, but there is usually an element of uncertainty around what the peak load will be. No organization wants its integration platform to fall over when peak demand occurs, so people “build for peak” to be sure. This is a horribly inefficient way to deal with the problem. Serverless for integration completely changes the conversation: it puts the pressure on the cloud vendor to support your workloads on demand and maintain their SLA.

Azure Logic Apps provides both the scale and the micro-billing to enable use cases for organizations with mature and nascent integration practices alike. For organizations embarking on their integration journey, paying for exactly what you use is a great use of company capital. Imagine paying for an expensive integration platform and only using 20% of its capacity. With serverless, you pay for only what you use. If you end up using more than expected, you will pay for it, but in theory you should only be integrating where it provides value, so this becomes a good problem to have. For example, an organization that typically processes 10,000 orders at quarter-end may end up processing 20,000 orders this quarter. That sounds like a good problem to have.

Scheduling

Just because you have a serverless component doesn’t mean that you need to depend on an external event to instantiate it. For example, you may want to call an HTTP endpoint every 15 minutes and insert the result of that call into a database. This requirement can be addressed using either an Azure Function or an Azure Logic App without having any dedicated compute.
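
Here is a minimal sketch of that scheduling example as a timer-triggered Azure Function (Python v2 model). The endpoint URL and the save_to_database helper are assumptions for illustration.

```python
import urllib.request
import azure.functions as func

app = func.FunctionApp()

@app.timer_trigger(schedule="0 */15 * * * *", arg_name="timer")  # NCRONTAB: every 15 minutes
def poll_endpoint(timer: func.TimerRequest) -> None:
    # Call the HTTP endpoint and capture its response body.
    with urllib.request.urlopen("https://example.com/api/prices") as resp:
        payload = resp.read().decode("utf-8")
    save_to_database(payload)

def save_to_database(payload: str) -> None:
    # Placeholder: in practice this would write to Azure SQL, Cosmos DB, etc.
    print(f"persisting {len(payload)} bytes")
```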

Events

Events represent a core use case for serverless computing. In what is commonly referred to as reactive programming, your solution responds to events in other systems and instantiates a serverless component when they occur. Azure Event Grid, Azure Functions, and Azure Logic Apps all support events as part of their core use cases. For example, having Azure Event Grid subscribe to blob storage events, and then notify an Azure Function or Azure Logic App to process the new data that has been written to that blob storage, is a highly efficient way of responding to those types of events.
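
The following is a hedged sketch (Python v2 model) of the receiving side of such an Event Grid subscription: the function fires when a blob is created and can then process the new data. The process_blob helper is hypothetical.

```python
import azure.functions as func

app = func.FunctionApp()

@app.event_grid_trigger(arg_name="event")
def on_blob_created(event: func.EventGridEvent) -> None:
    data = event.get_json()  # payload delivered by Event Grid
    if event.event_type == "Microsoft.Storage.BlobCreated":
        process_blob(data.get("url"))

def process_blob(blob_url: str) -> None:
    # Placeholder: download and process the newly written blob.
    print(f"processing {blob_url}")
```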

Async background jobs

Just because serverless operations are supposed to be ephemeral doesn’t mean that serverless components cannot participate in long-running processes. It just means that the architecture needs to be designed to take advantage of async capabilities such as webhooks. A webhook is essentially a custom callback that gets registered with the system that is fulfilling a request. Since the request may be long-running, the caller may not want to block resources while waiting for it to be fulfilled.

A webhook essentially takes what looks like a synchronous process and turns it into an async one. For example, we may have a batch of invoices to be processed by an enterprise system. A serverless app may pick up these invoices and deliver them to the downstream system for processing, but while that system is processing the invoices, we don’t want our serverless app to block while waiting for a response. Instead, our serverless app can register a webhook with the enterprise system and request an update once it has finished processing the invoices. These status updates can then be delivered to the publisher of the invoices.
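
Here is a minimal sketch of that webhook registration step: the invoices are submitted together with a callback URL, so the enterprise system can notify us later rather than making us wait. The endpoint URLs and payload shape are assumptions for illustration.

```python
import json
import urllib.request

# Hypothetical HTTP-triggered function that will receive the status callback later.
CALLBACK_URL = "https://my-function-app.azurewebsites.net/api/invoice-status"

def submit_invoices_async(invoices: list[dict]) -> None:
    body = json.dumps({
        "invoices": invoices,
        "callbackUrl": CALLBACK_URL,  # the webhook the enterprise system will call back
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://enterprise.example.com/api/invoice-batches",  # hypothetical endpoint
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # The enterprise system accepts the batch and returns immediately;
        # the final status arrives later on CALLBACK_URL.
        print("batch accepted:", resp.status)
```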

Considerations

Serverless architectures provide a lot of benefits, but naturally there are some use cases that require additional planning. When deciding whether or not to use a serverless architecture, evaluate the following considerations:

Stateless

In a previous article on this blog, we borrowed a serverless definition from Martin Fowler which describes serverless components as running in a “stateless compute container”. This makes good sense when you consider the micro-billing aspect of serverless, so serverless solutions should be stateless. Since this may not always be possible, components like Redis Cache, Azure SQL, or Cosmos DB may be good options for maintaining state externally. In addition, Durable Functions, an extension of Azure Functions/Azure WebJobs, allow you to write stateful functions within a serverless environment.
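
As a hedged sketch of pushing state out of the function, here is what keeping a running total in a Redis cache might look like with the redis-py client. The host name, access key, and key naming are assumptions for illustration.

```python
import redis

# Azure Redis Cache exposes an SSL endpoint on port 6380; values below are placeholders.
cache = redis.Redis(host="my-cache.redis.cache.windows.net", port=6380,
                    password="<access-key>", ssl=True)

def get_order_count(customer_id: str) -> int:
    # The function instance holds no state of its own; the running total lives in Redis.
    value = cache.get(f"orders:{customer_id}:count")
    return int(value) if value else 0

def increment_order_count(customer_id: str) -> int:
    return cache.incr(f"orders:{customer_id}:count")
```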

Long-running processes

Long-running processes may also not be suitable for serverless architectures. For example, the default timeout for Azure Functions on a consumption plan is 5 minutes, while the outgoing request timeout for Azure Logic Apps is 2 minutes. This doesn’t immediately eliminate either technology from participating in architectures that require long-running processes, but it does mean that you need to work around these limits, which may include using async messaging through other mediums such as webhooks, Azure Service Bus queues/topics, or Durable Functions.
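
One way to stay inside those limits is to fan a large job out as individual Service Bus queue messages, so that each queue-triggered invocation handles one short-lived unit of work. A hedged sketch of the sending side, using the azure-servicebus SDK, with the queue name and connection string as placeholders:

```python
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONNECTION_STR = "<service-bus-connection-string>"
QUEUE_NAME = "invoice-items"

def fan_out(items: list[dict]) -> None:
    with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
        with client.get_queue_sender(QUEUE_NAME) as sender:
            for item in items:
                # Each message becomes a small unit of work for a queue-triggered function.
                sender.send_messages(ServiceBusMessage(json.dumps(item)))
```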

Startup time

The upside of consumption-based billing is that you do not pay for resources when they are idle. The downside is that you may experience short delays while these services “wake up” when new events are generated. To address this, some organizations keep their service warm by sending it periodic synthetic transactions. The alternative is to accept a delay of a couple of seconds when your service needs to wake up.
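
A minimal sketch of that keep-warm technique: periodically send a cheap synthetic request so the service stays warm. The health-check URL and interval are assumptions, and note that keeping a function warm this way does incur executions you will be billed for.

```python
import time
import urllib.request

def keep_warm(url: str = "https://my-function-app.azurewebsites.net/api/health",
              interval_seconds: int = 300) -> None:
    while True:
        try:
            urllib.request.urlopen(url, timeout=10)  # synthetic transaction
        except OSError:
            pass  # a failed ping is not fatal; the next cycle will try again
        time.sleep(interval_seconds)
```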

Scaling

As workload demands increase, the service will automatically scale to meet your needs. The triggers that cause scale events may include a schedule, request rates, CPU utilization, or memory usage. Scaling out consumes additional resources and incurs additional cost, so it is important to understand which triggers cause scale events so that you can architect your application appropriately.

Monitoring

As you move from monolithic applications to architectures made up of many discrete services, monitoring becomes a greater challenge. As events flow through various serverless components, including a correlation id in the message payload is a good idea. Ensuring that a message isn’t left sitting in a queue because a de-queue agent is not working properly is another scenario to watch.
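
Here is a minimal sketch of stamping a correlation id onto a message payload so the same request can be traced as it flows through multiple serverless components. The envelope field names are assumptions for illustration.

```python
import json
import uuid

def wrap_with_correlation(payload: dict, correlation_id: str = "") -> str:
    envelope = {
        "correlationId": correlation_id or str(uuid.uuid4()),  # reuse an upstream id if one was passed in
        "body": payload,
    }
    return json.dumps(envelope)

# Example usage: the first component generates the id; downstream components pass it along.
print(wrap_with_correlation({"orderId": 42}))
```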

Dependencies

Similar to monitoring in some regards, introducing many discrete services within an architecture can make inter-dependencies more difficult to manage. As soon as one service has a hard dependency on another, managing change becomes more difficult. To help, you can introduce an API gateway, such as Azure API Management, which provides an abstraction between services that allows for versioning and redirection.

Conclusion

In this article, we discussed the evolution of software architectures, from monolithic applications through to modern serverless architectures. We also discussed some patterns and opportunities to consider when choosing a serverless architecture. Lastly, we reviewed some of the considerations that an architect should be aware of when evaluating serverless architectures. Serverless architectures offer many opportunities, but they also introduce some new things to think about as you modernize your architectures.

Serverless360 is a single platform to operate, manage, and monitor Azure serverless components. It provides efficient tooling that is not available in the Azure portal, and likely never will be. Try Serverless360 free for 30 days!

Originally published at www.serverless360.com on August 13, 2018.
