How enterprises like Verizon are solving the challenges of serverless adoption

Is your enterprise adopting serverless at scale? Not so FaaS!

Rajdeep Saha
A Cloud Guru
4 min read · Jul 31, 2018


As a cloud architect at a Fortune 100 company, I’m observing a lot of startups going “FaaS and Furious” on serverless adoption — along with major enterprises like Capital One, Expedia, Nordstrom, and Verizon.

Based on the many advantages compared to traditional architectures, it wasn’t a big surprise to learn that our own development teams at Verizon were eager to leverage Functions as a Service (FaaS) within the broader context of serverless design patterns.

Developers understand that serverless architectures will save them time, control operational costs, reduce unnecessary maintenance, and promote innovation within our teams — but not so FaaS!

Most large enterprises have a significant number of stakeholders that must be considered when driving the adoption of new tools and processes. In today’s environment, the chief concerns of the enterprise include: adhering to security standards that protect data and customer privacy, improving the velocity of application development to respond quickly to customer needs, and promoting the reusability of code to drive operational efficiencies.

Let’s take a quick look at how these common concerns relate to serverless adoption — and see what organizations can do to mitigate the risks by asking the right questions early in the design process.

Data protection remains a top priority

Data security is of paramount importance to any enterprise — and this means keeping data protected at rest and in transit. Moving to serverless doesn’t make your system any less secure; you just need to integrate security into your architecture and design patterns.

When considering serverless adoption, enterprises need to keep some key elements in mind as they continue to enforce governance and surveillance to protect sensitive data and customer privacy.

  1. Customer-managed Key Management Service (KMS) keys are preferred over vendor-managed keys. Be sure to rotate your keys periodically.
  2. Encryption is a required feature for any serverless tool or service.
  3. Everything should be encrypted using customer-managed KMS keys — and each app should have its own KMS key (a sketch follows below).

Every service used within your serverless architecture should support these core features — if not, consider looking for an alternative.
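
To make the third point concrete, here is a minimal boto3 sketch (Python) of provisioning a customer-managed KMS key for a single app, enabling automatic rotation, and using that key to encrypt data at rest. The alias, bucket, and object names are illustrative, not taken from our actual setup.

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

# Create a customer-managed key dedicated to one app (alias is illustrative).
key = kms.create_key(Description="App A data key")
key_id = key["KeyMetadata"]["KeyId"]
kms.create_alias(AliasName="alias/app-a", TargetKeyId=key_id)

# Enable automatic key rotation instead of relying on manual rotation.
kms.enable_key_rotation(KeyId=key_id)

# Encrypt objects at rest with the app's own key rather than a vendor-managed one.
s3.put_object(
    Bucket="app-a-data",              # illustrative bucket name
    Key="orders/sample.json",
    Body=b"{}",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId=key_id,
)
```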

Isolate your apps

When migrating to a serverless environment, it’s important to design the architecture so that each app operates in isolation — especially if your apps reside in the same AWS account.

How are you isolating one app from another in a serverless architecture? If App A and App B are using Proxy A and Proxy B, how do you ensure they don’t accidentally call or change each other’s proxies?
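
One way to answer that, assuming both apps live in the same AWS account, is to scope each app’s Lambda execution role to only its own resources, so App A physically cannot touch App B’s tables or APIs. Below is an illustrative sketch in Python; the account ID, table prefix, API ID, and role name are all made up for the example.

```python
import json
import boto3

# Illustrative policy: App A's functions may touch App A's DynamoDB tables
# and invoke App A's API stage, and nothing belonging to App B.
APP_A_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/app-a-*",
        },
        {
            "Effect": "Allow",
            "Action": "execute-api:Invoke",
            "Resource": "arn:aws:execute-api:us-east-1:123456789012:a1b2c3/prod/*",
        },
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="app-a-lambda-role",       # illustrative role name
    PolicyName="app-a-isolation",
    PolicyDocument=json.dumps(APP_A_POLICY),
)
```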

Be sure to ask these questions early in the design process and long before migrating any workloads — and then test and test again to prove these capabilities are functioning properly after developing your architecture.

API Gateways in enterprises

When you leverage API Gateway, Lambda, and DynamoDB — chances are the stack isn’t going to work out of the box in your enterprise. Some of the constraints that create issues include:

  • DynamoDB doesn’t support customer-managed keys — nor can it easily isolate specific tables
  • All API Gateway endpoints are visible to anyone on the account
  • As a result, you’ll likely end up building a custom authorizer (a sketch follows below)
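
For context, a custom authorizer is simply a Lambda function that API Gateway calls before your backend; it inspects the incoming token and returns an IAM policy allowing or denying the request. A minimal Python sketch of a token-based authorizer is below; is_token_valid is a hypothetical placeholder for whatever validation you use (for example, an OAuth2 introspection call or JWT verification).

```python
def is_token_valid(token):
    # Hypothetical placeholder: in practice this would verify a JWT signature
    # or call your OAuth2 provider's introspection endpoint.
    return token == "allow-me"

def handler(event, context):
    """API Gateway TOKEN authorizer: returns an IAM policy for this request."""
    token = event.get("authorizationToken", "")
    effect = "Allow" if is_token_valid(token) else "Deny"
    return {
        "principalId": "user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Action": "execute-api:Invoke",
                    "Effect": effect,
                    "Resource": event["methodArn"],
                }
            ],
        },
    }
```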

At Verizon, here are just a few of the questions we started to ask while evaluating API Gateway services:

  • Is OAuth2 integrated into the platform natively?
  • Is the product platform agnostic?
  • Can it support existing on-premises APIs as well as cloud APIs?
  • Does it have better app segregation in a multi-tenant environment?
  • What types of in-depth analytics are available?

Logging & Monitoring

A native logging service — such as Amazon CloudWatch — can certainly get the job done. But when your company generates millions of lines of logs every hour, searching through them while troubleshooting an issue becomes cumbersome.

When solving for serverless logging and monitoring, our team looked for a few key features:

  • Can the service be integrated with Lambdas in near real time? (see the sketch after this list)
  • Can the product accommodate hybrid applications, i.e. those running both on premises and in the cloud?
  • Can it be integrated with our existing centralized logging/monitoring platform, minimizing the learning curve for engineers?
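
One pattern that addresses the first question is a CloudWatch Logs subscription filter that streams a function’s log group to a forwarder Lambda in near real time. Here is a minimal sketch of such a forwarder, assuming the standard subscription payload format; ship_to_central_platform is a hypothetical stand-in for whatever centralized platform you integrate with.

```python
import base64
import gzip
import json

def ship_to_central_platform(message):
    # Hypothetical stand-in for forwarding to your centralized
    # logging/monitoring platform (Splunk, ELK, etc.).
    print(message)

def handler(event, context):
    """Triggered by a CloudWatch Logs subscription filter."""
    # The subscription payload arrives base64-encoded and gzip-compressed.
    payload = gzip.decompress(base64.b64decode(event["awslogs"]["data"]))
    data = json.loads(payload)
    for log_event in data["logEvents"]:
        ship_to_central_platform(log_event["message"])
```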

Building awesome with serverless

When dealing with enterprises of more than 10,000 engineers, the big question is: how do you drive a culture of serverless across the entire organization? It’s not an easy feat.

Moving a “handful” of applications to serverless is a great start — but our goal is to move the majority of workloads to a serverless architecture.

Driving a cultural shift towards serverless is vital for success. Rather than relying on a top-down mandate from leadership, we’ve seen more effective and lasting results once developers authentically embrace the architecture.

What’s worked for us? Consider running serverless workshops or sponsoring an internal hackathon to engage your engineers. It’s important to encourage innovation and hands-on experience as an integral part of adoption.

Hands down, the most important takeaway from our experience with serverless was getting all parties engaged from the start — and then iterating, iterating, and iterating.

We worked to get teams involved from governance, security, operations, etc., and made sure everyone understood that our first designs were never the final ones. Our teams embraced the philosophy of failing fast, moving forward, and iteratively applying our learnings into better designs.

This is just the start — we fully expect our designs to change along with our processes, people, and the rapidly evolving serverless technology. The Verizon engineering team is inspired by the serverless community, and thrilled to be on the serverless journey with you. Keep building awesome!


Rajdeep Saha
A Cloud Guru

Sr. Container/Serverless SA @AWS | Author | Public Speaker | YouTube Channel “Cloud With Raj” on Container, DevOps, and interview prep | Opinions are my own