Serverless architecture: Driving toward autonomous operations

Ivan Campos
Published in Slalom Technology
Jan 11, 2017 · 5 min read

Here’s why serverless architecture warrants your attention.

Ivan Campos | September 23, 2016 (originally published on Slalom.com)

Driverless cars promise to free up our time, reduce accident rates, and automate away traffic congestion. The same can be said for serverless architecture.

The term “serverless” doesn’t mean removing servers from your architecture. It’s about “abstracting users away from servers, infrastructure, and having to deal with low-level configuration or the core operating system,” says A Cloud Guru’s Peter Sbarski.

Modern cloud computing has become key to meeting today’s pressures of quality, speed, and cost. And with the shift away from on-premises and co-located solutions well underway, the ability to architect your cloud-based solution for agility will become a key competitive differentiator.

The benefits

Here are the benefits of adopting a serverless architecture:

1. It removes the need to manage servers

Cloud vendors’ serverless technology offerings provide high availability, fault tolerance, and auto-scaling by default. Horizontal auto-scaling grows or shrinks your fleet of abstracted servers relative to demand, in a just-in-time manner and without any IT intervention. Capacity planning, in turn, is handled automatically by the platform.

“No server is easier to manage than no server.” — Werner Vogels, Amazon CTO

With an infrastructure approaching infinite scalability and availability, there will be a significant drop in on-call support incidents. Delegating your server management also simplifies your physical architecture, because you can treat all of your servers as ephemeral black boxes.

2. It increases focus on what matters to your business

With server management out of the picture, developers have more time to focus on business logic. This focus intensifies as functions become the unit of deployed work. Deploying independent functions (Function as a Service, or FaaS) and fronting them with a serverless API gateway naturally evolves toward a service-oriented architecture (SOA) and microservices. A concerted focus on individual functions also encourages de facto best practices, like separating concerns and adhering to the single responsibility principle.

“With server management out of the picture, developers now have more time to focus on business logic.”
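To make this concrete, here is a minimal sketch of what a single-purpose function might look like on AWS Lambda behind an API Gateway proxy integration, assuming a Python runtime. The handler name and event shape follow Lambda’s conventions; the business logic (a hypothetical quote lookup) is invented purely for illustration.

```python
import json

# A single-purpose function: it does one thing (look up a quote by id)
# and nothing else, in line with the single responsibility principle.
# This in-memory table is a hypothetical stand-in for real business data.
QUOTES = {
    "1": "No server is easier to manage than no server.",
}

def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda for each API Gateway request."""
    # With an API Gateway proxy integration, path parameters arrive
    # inside the event dictionary rather than as a raw HTTP request.
    quote_id = (event.get("pathParameters") or {}).get("id", "1")
    quote = QUOTES.get(quote_id)

    if quote is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    # API Gateway expects a statusCode and a string body in the response.
    return {"statusCode": 200, "body": json.dumps({"id": quote_id, "quote": quote})}
```

Deploying many small functions like this one behind a shared API gateway is exactly the path that nudges a codebase toward SOA and microservices.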

3. It reduces costs, since you only pay for what you use

In a serverless architecture, you treat your Infrastructure as a Service (IaaS) costs like you would any public utility. Just as you only pay for water when you run your faucets, you only pay for your functions when they run in a serverless manner. The primary benefit of this approach is that you don’t pay for idle time on cold servers. And this generates an incentive to write code that executes as fast as possible.
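To make the utility analogy concrete, here is a back-of-the-envelope estimate in Python. The rates are illustrative only (they mirror the style of AWS Lambda’s published per-request and per-GB-second pricing, ignoring any free tier) and will differ by vendor and over time.

```python
# Back-of-the-envelope, pay-per-use cost estimate.
# Illustrative rates only; check your vendor's current pricing.
PRICE_PER_MILLION_REQUESTS = 0.20    # USD per 1M invocations
PRICE_PER_GB_SECOND = 0.00001667     # USD per GB-second of compute

def monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate the monthly bill for a single function."""
    request_cost = invocations / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute_cost = gb_seconds * PRICE_PER_GB_SECOND
    return request_cost + compute_cost

# 3 million requests a month, 200 ms average runtime, 512 MB of memory:
print(f"${monthly_cost(3_000_000, 200, 512):.2f} per month")
```

Notice how the bill scales with execution time and memory: that is the incentive to write code that executes as fast as possible.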

Vendor alternatives

Several large cloud providers have already introduced serverless architecture enablers. The event sources that can trigger serverless compute vary by offering; examples include application log events, database changes, object uploads, and calls to the APIs that front your functions.

Being event-driven ensures that your functions only fire when needed. Currently, the most prominent vendor solutions are:

  • AWS Lambda
  • Microsoft Azure Functions
  • Google Cloud Functions
  • IBM OpenWhisk
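As an illustration of one of these event sources, the sketch below shows a function that fires only when an object lands in an S3 bucket, assuming the bucket’s object-created notifications have been wired to the function. The event structure is the one Lambda passes for S3 notifications; the processing step is a placeholder.

```python
import urllib.parse

def lambda_handler(event, context):
    """Invoked by AWS Lambda only when an S3 object-created event arrives."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys are URL-encoded in the notification payload.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Placeholder for real work: resize an image, parse a log file,
        # index a document, and so on.
        print(f"New object uploaded: s3://{bucket}/{key}")
```

No poller or always-on worker is involved; the platform invokes the function once per notification, and you pay only for those invocations.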

“Just as you only pay for water when you run your faucets, you only pay for your functions when they run in a serverless manner.”

Considerations

We’ve covered several benefits of serverless architecture, but it’s important to understand that, on the Gartner Hype Cycle, serverless architecture sits at the technology trigger stage and is moving toward the peak of inflated expectations.


An example of those inflated expectations is the movement toward NoOps. DevOps is a movement meant to foster communication and collaboration between software developers and operations staff; NoOps envisions developers writing code and letting a service deploy, manage, and scale it. NoOps is a divisive term signifying complete automation of operations — and it’s much too early to call for the dissolution of internal operations teams.

“NoOps is a divisive term signifying complete automation of operations — and it’s much too early to call for the dissolution of internal operations teams.”

It’s also important to understand that serverless architecture doesn’t fit all use cases. For example, long-running transactions may become an economic liability when you pay for what you use. If you’re looking for appropriate applications of serverless architecture, AWS has provided the following reference architectures:

  • Mobile backend: Mobile Backend as a Service (MBaaS) supports solutions running on mobile devices. Using this blueprint, the cost model, agility, and scalability of a serverless architecture can be harnessed to power mobile clients.
  • Real-time file/stream processing: If you’re handed files or a stream of data, you can process them in real time using only AWS managed components (e.g., Lambda, Simple Storage Service (S3), Simple Notification Service (SNS), DynamoDB, Kinesis, and CloudWatch).
  • Web applications: For your browser-based application needs, a serverless architecture bypasses the headaches of site availability, scalability, and machine administration. You can create a static website using only S3, or a more dynamic application that stores data and derives actionable information (see the sketch after this list).
  • Internet of Things (IoT) backend: As sensors pervade everyday objects, there needs to be a way to capture and analyze the flood of data they produce. If you’re looking to automate or gain insight into behavior from sensor data, a serverless architecture can react efficiently to whatever your connected devices are sending.
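As a small illustration of the web application blueprint above, the sketch below provisions an S3 bucket as a static website using boto3. The bucket and file names are hypothetical, and a real deployment would also need to make the objects publicly readable (via a bucket policy or ACLs) and would typically sit behind a CDN and DNS.

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-serverless-site-example"   # hypothetical bucket name

# Create the bucket and turn on static website hosting.
s3.create_bucket(Bucket=bucket)  # outside us-east-1, add a CreateBucketConfiguration
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload the site's entry page; there is no web server or machine to administer.
s3.upload_file(
    "index.html", bucket, "index.html",
    ExtraArgs={"ContentType": "text/html"},
)

print(f"http://{bucket}.s3-website-us-east-1.amazonaws.com")
```

The point of the blueprint is what’s missing: no instances to patch, no load balancer to size, no capacity to plan.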

Lastly, a crucial consideration is vendor lock-in. While you’re free to bring your own code (as long as your programming language is supported), there’s a natural tendency to also adopt the ancillary serverless services of your event-driven compute vendor. Because this happens deliberately, in the name of simplicity and time-to-market, your switching costs grow, making it harder to move your serverless solution to another cloud provider.

Conclusion

When paired with rich front-ends, mobile clients, IoT devices, or even next-generation chatbots, a serverless architecture can serve as a simple yet cost-effective solution for your future projects — or as a convenient approach to breaking up monolithic legacy applications. It could bring a future where all of your infrastructure needs are met in a completely autonomous manner.
