Serverless applications remove a lot of the operational burden from your team. No more managing operating systems or running low-level infrastructure.
This lets you and your team focus on building…and that’s a wonderful thing.
But don’t let the lack of day-to-day operational tasks fool you into thinking that there’s nothing to do but write code. With a serverless design, you still have operational tasks. They’re different, and they tend to be more directly tied to delivering business value, but they still exist.
What about security?
Any time you’re discussing deploying an application in the cloud, security is top of mind. Serverless applications are no different in this respect, but they are radically different in how you have to implement security.
The good news is that security for serverless applications boils down to four key areas:
- Flow of data
- Choice of services & APIs
- Code quality
- Monitoring production
Each of these areas is critical to the overall security of your serverless application. But before we dive into the specifics of these areas, it’s important to understand the basic security model for all cloud services.
When you don’t have to deal with servers and other infrastructure components, it’s easy to forget that they still exist.
Understanding the basic security model of how day-to-day responsibilities are shared between you — the user — and the cloud service provider is crucial to securing your serverless application.
Shared Responsibility In The Cloud
So how does this all work?
While the ultimate responsibility for security is yours (you are the one delivering the application, aren’t you?), the day-to-day operational responsibilities for various areas are divided between you (the user) and the cloud service provider.
This division changes depending on the type of service you’re using.
For infrastructure services (IaaS), you take over the day-to-day responsibilities at the operating system level. With PaaS offerings, you handle the application and your data.
With SaaS and — by extension — serverless designs you’re responsible for the day-to-day security of your data.
And since there isn’t much direct action that you take to ensure your application’s security, most of your responsibilities fall to your overall design and how you use the services that make up your application.
This is a radical change from the traditional approach to security.
Mapping Data Flow
The first step to securing your serverless application is mapping how data will flow between components — the services and APIs you’re using. Figuring out where data is processed and where it’s being stored will guide you in the choices of what services and APIs are right for you.
For example, if you’re storing personal information for your users, that’s sensitive and you want to make sure that it’s encrypted at rest and in transit. That requirement now defines what type of data storage service you can use.
You’ll see the value of this map in a number of ways, including making it easier to troubleshoot issues and performance problems as well as in step #4 (monitoring production).
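A data-flow map doesn’t need special tooling; even a simple structure that records each piece of data, where it moves, and what protection it requires is enough to start driving service choices. A minimal sketch in Python (the components, stores, and data names are hypothetical examples, not a prescribed schema):

```python
# A minimal data-flow map: each entry records a piece of data, where it
# comes from, where it's stored, and the protection it requires.
# All component and data names here are hypothetical examples.
DATA_FLOW = [
    {
        "data": "user profile (PII)",
        "source": "signup API",
        "store": "object storage",
        "requires": {"encrypt_at_rest": True, "encrypt_in_transit": True},
    },
    {
        "data": "product catalogue (public)",
        "source": "admin console",
        "store": "object storage",
        "requires": {"encrypt_at_rest": False, "encrypt_in_transit": True},
    },
]

def sensitive_entries(flow):
    """Return the entries that need encryption at rest -- these are the
    ones that constrain which storage services you can actually use."""
    return [e for e in flow if e["requires"]["encrypt_at_rest"]]

print([e["data"] for e in sensitive_entries(DATA_FLOW)])
```

Even this toy version answers the key question from the example above: which data forces an encrypted-at-rest storage choice.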
Choosing The Right Services & APIs
Requirements for data security in hand, you can start the work of selecting the right services and APIs to build your application.
These choices are the building blocks of your application, and if you don’t choose ones that meet your security needs, you will never meet your overall security posture (a.k.a. your security plan & practice) goals.
The “right” service — from a security perspective — is going to be dictated by the security controls that the provider makes available to you.
For example: you want to store data in a simple storage mechanism. You could use Amazon S3, Google Cloud Storage, or Azure Storage, to name a few. But on their own, none of these encrypt with a key you control (Google Cloud Storage has this feature in beta).
To encrypt the data at rest with your key, you’re going to have to use an additional service like AWS KMS or Azure Key Vault. That’s not a deal breaker, but it is something you have to account for in your code, which has an impact on your design and the resources you need to implement it.
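As a sketch of what “accounting for it in your code” can look like: with Amazon S3 and AWS KMS, encrypting an object under a key you control means adding server-side-encryption parameters to each upload. The helper below only builds the request parameters (the bucket name and key ARN are placeholders); you would pass the result to an S3 client call such as boto3’s `put_object`.

```python
def encrypted_put_params(bucket, key, body, kms_key_id):
    """Build S3 put_object parameters that request server-side
    encryption under a customer-managed KMS key (SSE-KMS)."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ServerSideEncryption": "aws:kms",  # use KMS, not S3-managed keys
        "SSEKMSKeyId": kms_key_id,          # the key *you* control
    }

# Placeholder names -- substitute your own bucket and key ARN.
params = encrypted_put_params(
    "example-bucket",
    "users/123/profile.json",
    b"{}",
    "arn:aws:kms:us-east-1:111122223333:key/example",
)
# s3_client.put_object(**params)  # with a real boto3 S3 client
```

The point isn’t the few extra lines; it’s that the encryption requirement from your data-flow map now shows up explicitly in every write path.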
The map of your data flow will let you quickly determine which services are the right fit for your application.
Code Quality
By far the most effective tool for security (in any type of design) is high quality code. The difference in a serverless design is that you’re limited in the number of security tools you can use to compensate for poor quality code.
When you’re running code on a server you manage, you can mitigate problems like XSS or CSRF with a control like intrusion prevention. In a serverless design that might not be possible because you don’t manage the layers where these controls would be implemented.
The best strategy is to ensure you have strong code review, testing, and continuous integration strategies that help ensure you and your team are delivering consistently high quality code.
You’re never going to eliminate bugs but these processes will help reduce them and reduce the time to discover the bugs that do exist.
Monitoring Production
Once you’ve deployed your application and users start interacting with it, you should be monitoring the security of the application as well as its performance.
Performance monitoring is a fairly well-understood problem space…security is a bit more arcane.
Even in traditional deployments, security monitoring tends to quickly end up with a team drowning in data and no real actionable information.
Fortunately, in a serverless design you can filter out most of that noise because you aren’t responsible for the day-to-day security management of the underlying components.
That lets you focus your monitoring on three main categories:
- access control
- integrity monitoring
- odd business behaviours
When you’re looking at access control, you want to make sure that only the entities (systems and users) you’ve explicitly granted access to are the ones accessing your application and its components. You want to set up alerts and processes to respond to anything out of the ordinary.
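One way to make “anything out of the ordinary” concrete is to compare access events against an explicit allowlist of expected principals. A toy sketch, assuming a simplified event shape and made-up principal names (not any provider’s real log format):

```python
# Principals we've explicitly granted access -- everything else is suspect.
# These names are hypothetical examples.
EXPECTED_PRINCIPALS = {"orders-function", "reports-function", "admin-alice"}

def unexpected_access(events, allowed=EXPECTED_PRINCIPALS):
    """Return the events whose principal isn't on the allowlist.
    These are the ones worth alerting on."""
    return [e for e in events if e["principal"] not in allowed]

events = [
    {"principal": "orders-function", "action": "read", "resource": "orders-table"},
    {"principal": "unknown-user", "action": "read", "resource": "orders-table"},
]
print(unexpected_access(events))
```

In a real deployment the events would come from your provider’s audit trail, but the logic stays this simple: anything not explicitly expected gets flagged.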
This ties nicely with integrity monitoring. In a traditional application, integrity monitoring typically scans the file system and examines any changes against a set of rules.
In the serverless world, integrity monitoring involves ensuring that your functions aren’t changing unexpectedly and that your data is actually what you expect it to be. Given the level of access you have to these components, you can see why it’s tied tightly to access control.
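In practice this can be as simple as recording a hash of each deployment package at release time and alerting when the deployed version no longer matches. (AWS Lambda, for example, exposes a `CodeSha256` value per function; the sketch below just compares hashes and doesn’t call any provider API. Function names and package contents are placeholders.)

```python
import hashlib

def code_hash(package: bytes) -> str:
    """SHA-256 of a deployment package, recorded at release time."""
    return hashlib.sha256(package).hexdigest()

def drifted(expected: dict, deployed: dict) -> list:
    """Return function names whose deployed hash no longer matches what
    we recorded at release -- a sign of an unexpected change."""
    return [name for name, h in deployed.items() if expected.get(name) != h]

expected = {"orders": code_hash(b"release-v1"), "reports": code_hash(b"release-v1")}
deployed = {"orders": code_hash(b"release-v1"), "reports": code_hash(b"modified")}
print(drifted(expected, deployed))
```

If `drifted()` ever returns anything, the next question is who made the change, which is exactly where access control monitoring picks up.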
The final area of monitoring is perhaps the trickiest but also the most interesting.
Due to the nature of the application’s design, you can build in a set of thresholds for security attached directly to your business processes.
In some applications, you could look for multiple orders placed in quick succession by the same user. You could look for transactions well outside of “normal” for a given user. If you’ve built a game, you could look for players having success beyond the capabilities of their characters.
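The first of those examples, flagging a burst of orders from one user, can be sketched in a few lines. The window and threshold below are arbitrary assumptions you would tune per application:

```python
from collections import defaultdict

def rapid_order_users(orders, window=60, threshold=3):
    """Flag users who placed `threshold` or more orders within `window`
    seconds. `orders` is a list of (user_id, unix_timestamp) tuples;
    the default numbers are placeholders, not recommended values."""
    by_user = defaultdict(list)
    for user, ts in orders:
        by_user[user].append(ts)
    flagged = set()
    for user, times in by_user.items():
        times.sort()
        # Slide over each run of `threshold` consecutive orders and
        # check whether it fits inside the time window.
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                flagged.add(user)
                break
    return flagged

orders = [("u1", 0), ("u1", 10), ("u1", 20), ("u2", 0), ("u2", 500)]
print(rapid_order_users(orders))
```

A real version would read from your order stream and feed an alerting pipeline, but the business rule itself stays this small.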
The goal here is to flag things that are out of the ordinary for your application and provide enough information for the team to dig in and determine if it’s a security issue, a bug, or simply new behaviour from your users.
Successful Security for Serverless
Serverless applications significantly reduce the operational burden on you and your team. That’s one of the key advantages of this approach.
The burden of security is also greatly reduced but you still have to understand the challenges and risks facing your application.
The four keys are:
- 🔑 mapping how data flows through your application and what protection it needs from each service & API it interacts with
- 🔑 based on the security requirements of your data, choosing the right set of services & APIs
- 🔑 ensuring that you have the processes in place and the right team to deliver high quality code consistently
- 🔑 monitoring key aspects of production like access control, the integrity of your code & data, and key business metrics for security anomalies
By addressing these four key areas, you can make sure that your serverless applications are secured appropriately with minimal effort.
When done well, cloud deployments can be more secure than traditional on-premises ones. That’s because you’re sharing day-to-day security responsibilities with the cloud service provider, and their dedicated operational teams should be (if you’ve picked the right provider) among the best in the world.
Serverless applications shift even more of the day-to-day responsibilities to the provider, making it easier to ensure your application has world class security.
How are you tackling security in your serverless applications? Tag your response below or message me on Twitter where I’m @marknca.