Less is More: Securing a Payment Application with Serverless Architecture
by Jacky Xu, Software Engineer at Macquarie Group.
Serverless isn’t unprecedented: the ability to run stateless code on an event-driven system has been available for years. These days, it has gradually become the go-to technology for many tech enterprises such as Netflix and Airbnb.
Why? Because you can hand off server management and scale to millions of requests in seconds, all fully managed by your cloud provider. It is a natural next step in the evolution of cloud technology: we can focus on innovating applications that really matter, rather than administering infrastructure.
Having said that, serverless isn’t a silver bullet for all use cases. There is no single recipe for implementation; it is always specific to the context.
In this article, we’ll look at how we use serverless architecture to host and secure the single page applications of a payment system, and its benefits over a traditional architecture built from load balancers and virtual machines.
Thirty years ago, Macquarie’s Business Banking division created a payment solution called DEFT to solve our clients’ pain points around cash handling. Since then, it has evolved into a modern digital payment system that operates 24x7 to give clients an easier and safer way to collect and reconcile payments from their customers.
DEFT is a leading payment system in a number of professional services industries in Australia such as strata management, real estate, legal and insurance. In 2018, DEFT processed millions of transactions from more than 150 countries, worth billions of AUD.
Traditional architecture: lots of servers to manage
Historically, we designed the frontend architecture in a traditional way: we operated more than 60 Amazon Elastic Compute Cloud (EC2) instances running web servers across multiple availability zones within a single AWS region. These web servers served the single page applications to users and ran code to generate dynamic content for the web response.
These EC2 instances hosted multiple single page applications across development, staging and production environments. Day-to-day maintenance to keep them running was unavoidable: operating system updates, patching, server logging, deployment, scaling and more.
Moving to serverless architecture
While achieving the same goal, we recently simplified the design by removing the 60-plus EC2 instances and replacing them with just a few AWS services: AWS CloudFront, AWS WAF, AWS Lambda@Edge and Amazon S3.
Here’s how it works. We host our single page applications in Amazon S3 (Simple Storage Service) buckets, and the applications are served through AWS CloudFront distributions. CloudFront, AWS’s CDN (Content Delivery Network) service, caches the application assets at the Edge location geographically closest to the users. Before the web response reaches the users, Lambda@Edge performs a few smart transformations on the application assets and adds dynamic content. Additionally, AWS Shield provides DDoS protection and AWS WAF protects the infrastructure against common web exploits.
Serverless is also known as Function as a Service (FaaS).
[Table: comparison of the characteristics of common cloud services]
Lambda@Edge: brings serverless closer to users
You might already be familiar with AWS Lambda, which runs your code in the AWS cloud without provisioning or managing servers. In 2017, AWS announced a Lambda extension called Lambda@Edge.
Lambda@Edge allows you to run Node.js code at more than 150 Amazon Edge locations globally, again without provisioning or managing servers.
We deploy code to Lambda@Edge and have it interact with requests from our payment users at the Edge location geographically closest to them. Lambda@Edge lets us hook into web requests as they pass through Edge servers and modify the web response in flight.
This enables us to meet our dynamic content requirements without incurring the full round-trip latency to our web origin in the Sydney region. For example, we created a few Lambda@Edge functions to add security headers to the web response and to handle custom authentication and authorization. These functions are replicated to Amazon Edge servers all over the world and executed at the location closest to the user.
Security is always the top priority
Security was our key consideration when designing the architecture. As the system captures users’ payment data, we adopted best practices from PCI DSS compliance, and our system is compliant with PCI DSS Level 1, the highest security standard in the card payment industry. It’s important to note that the AWS services used in the new architecture are both PCI and SOC 2 compliant, which enables a shared responsibility model and simplifies compliance for us.
Below are some examples of security controls implemented in the system.
We use client-side encryption with an RSA public/private key pair to encrypt credit card data in the client’s browser, so the card number never leaves the client in clear text and is only decrypted at the final stage of processing, minimizing its exposure.
To ensure data is always encrypted in transit, we deploy an extended-validation, multi-domain SSL certificate in AWS CloudFront so that it can be trusted. We also run a Lambda function at the Edge to inject the HSTS (HTTP Strict Transport Security) header into the web response, so that a connection can never be established without encryption in place.
Additionally, we inject the X-Frame-Options header into the web response to prevent clickjacking, so the payment page can never be loaded within an iframe on another site.
To sum this all up
We’re leveraging serverless architecture to reduce costs and improve efficiency. Our infrastructure running costs dropped significantly once we removed dozens of always-on EC2 servers, and we saved a large amount of time and effort on patching and maintaining them. Scalability and availability are now handled for us by managed services like Lambda@Edge, CloudFront and S3, which takes pressure off the team. At the same time, PCI compliance is simplified. These are all big wins for us.
It’s also worth noting these aren’t the only things you can do with Lambda@Edge: you can also implement dynamic web applications at the Edge, A/B testing, user tracking and analytics, and much more.
Moving towards serverless means the team can now focus more on building new capabilities that really matter, and we can release new features to our clients even quicker.
Less is more.