
Deep dive into Confidential Computing

Rahul Grover
6 min read · Apr 28, 2023

--

This is the first post of a multi-part deep-dive series on Confidential Computing.

Introduction

As software engineers, we spend a lot of time putting appropriate security practices in place to ensure that our code is secure from the outset. This includes, but is not limited to, shifting security left (earlier) in the development cycle using tools such as SonarQube, Fortify, Orca Security, Prisma Cloud or Snyk, amongst others. When organisations move their applications into a cloud environment like AWS, it becomes their responsibility to adhere to the Shared Responsibility Model for security ‘in’ the cloud. One of the key customer responsibilities is to ensure that all interactions with the data layer are secure and effective. AWS provides a number of services for data privacy, integrity and security at rest and in transit, so that organisations can implement a comprehensive encryption strategy across their cloud landscape. Some of these services include AWS Key Management Service (KMS), AWS Certificate Manager (ACM) and AWS CloudHSM.

Third Dimension to Encryption

Though it’s great to ensure that our data is secure at rest and in transit, there is one dimension we often forget about: securing data while it is in use. Why is that important? Your application needs to get data from a source, process it and send the output to an endpoint — a database, an API response, events and so on. For simplicity, consider an application deployed onto an EC2 instance that processes customer information such as names, addresses, dates of birth, healthcare details or any other Personally Identifiable Information (PII). What happens when a malicious actor breaks into the EC2 instance and takes a memory dump of the process running the application? All the sensitive information is at the threat actor’s disposal in plain text — jackpot! Protecting such workloads while sensitive data is being processed is what encryption in use is about, and it covers the gap that exists with current mechanisms.
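To make the risk concrete, here is a small, CPython-specific sketch (my illustration, not from the article) showing that a secret held by a running process sits in memory as plain text. A real attacker would dump the entire process with a tool like gcore and search the output; this snippet just reads a small fragment of the process's own memory.

```python
import ctypes

# A "secret" the application is processing in plain text.
secret = b"ssn=123-45-6789"

# Simulate a fragment of a memory dump: read the raw bytes of this
# CPython process at the secret object's own address (64-bit CPython
# object layout assumed; a real attacker would dump the whole process).
dump_fragment = ctypes.string_at(id(secret), 64)

# The PII is recoverable verbatim from process memory.
print(b"123-45-6789" in dump_fragment)
```

Encryption at rest and in transit does nothing here: by the time the application can use the value, it is decrypted in RAM.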

3 dimensions of Security — in-transit, at-rest and in-use

What is the Confidential Computing Consortium?

“You are as strong as your weakest link”, goes the old adage, and it is quite relevant for software security products that rely on the hardware they are installed on. What happens to such products when the underlying hardware gets compromised? It opens the door for threat actors to access sensitive information in an environment.

To mitigate this challenge of securing the underlying hardware and the sensitive software running on it, a number of hardware vendors and cloud providers came together to accelerate the idea of secure computing at the hardware level through the adoption of Trusted Execution Environment (TEE) technologies and standards. The Confidential Computing Consortium (CCC) was formed in 2019 as a project community under the Linux Foundation, with Intel, Arm, Google, Red Hat and Microsoft among its premier members.

To get a deeper insight into the CCC, I highly recommend going through its overview and the projects currently being worked on.

Confidential compute options from hardware & cloud providers

Confidential Computing at AWS

Confidential Computing at AWS is defined as the use of specialised hardware and associated firmware to protect data from unauthorised access while it is being processed.

AWS provides Confidential Computing through:

  1. Nitro instances — customers can run workloads without worrying about even the cloud operator (AWS) getting access to the underlying instance
  2. Nitro Enclaves — provide a highly isolated and secure compute environment to protect and run highly sensitive workloads
  3. NitroTPM (Trusted Platform Module) — provides the cryptographic capabilities to manage the encryption and attestation required to secure code and other relevant operations

We will focus on Nitro Enclaves for this blog and understand how they can help solve a real-life scenario.

A real-life use-case

CyberArk is a well-recognised Privileged Access Management solution which helps organisations manage their privileged credentials effectively across a wide variety of infrastructure. These privileged credentials could be service accounts required to access a highly sensitive dataset or server. If your organisation loves automation, one of the mechanisms through which application teams interact with CyberArk at runtime is a service called the CyberArk Central Credential Provider (CCP), which is part of the Password Vault Web Access component. Service accounts pertaining to a service are bundled into an application construct and are accessible via a client certificate whose private key is available only to the application itself. The structure is represented in the diagram below.

CCP account & certificate access setup

In the example below, one of the applications deployed on EC2 needs to retrieve the password for a service account from CyberArk. It needs the client certificate in its keystore/path and uses it during the API call, specifying the AppId (Application1) and UserName (rds_app1_service1_user1). Once authenticated and authorised, a response containing the sensitive secret is returned to the application.

GET <cyberark_server>/AIMWebService/api/Accounts?AppId=Application1&UserName=rds_app1_service1_user1

A general CCP API workflow
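The call above can be sketched in Python using only the standard library. The host name and certificate paths below are placeholders of my own, and the TLS client-certificate loading is commented out since it needs real key material:

```python
import ssl
from urllib.parse import urlencode
from urllib.request import Request

def build_ccp_request(host: str, app_id: str, user_name: str) -> Request:
    # Build the CCP retrieval URL exactly as in the GET call above.
    query = urlencode({"AppId": app_id, "UserName": user_name})
    return Request(f"https://{host}/AIMWebService/api/Accounts?{query}")

# The client certificate's private key is available only to the application.
context = ssl.create_default_context()
# context.load_cert_chain(certfile="client.pem", keyfile="client.key")  # placeholder paths

req = build_ccp_request("cyberark.example.com", "Application1", "rds_app1_service1_user1")
print(req.full_url)
```

Sending the request would then be `urllib.request.urlopen(req, context=context)`; the key point for the rest of this post is where that private key lives.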

Now let’s say an organisation uses a break-glass process for access to production workloads, where access is granted just in time, as required. What happens when a malicious user breaks glass into the EC2 instance and makes the following API call?

GET <cyberark_server>/AIMWebService/api/Accounts?AppId=Application1&UserName=rds_app1_service2_user2

The user is able to retrieve a sensitive credential with little to no effort, and these credentials can then be used to adversely impact other systems.

Threat actor able to retrieve credentials for a different account

How do Nitro Enclaves help?

The aforementioned issue is exactly the type of use-case Confidential Computing is so effective at: encrypting data in use and removing users’ access to sensitive information. Nitro Enclaves allow users to carve out an isolated compute environment from an EC2 instance for processing highly sensitive data. There is no ingress from the EC2 instance into the app deployed in the enclave, eliminating any SSH or admin-level access concerns. For the app to access AWS or other services, egress is allowed via vsock and vsock-proxy, which we will delve into in Part 2 of this blog series.
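As a taste of Part 2, here is a minimal sketch of what enclave-side egress looks like, assuming a vsock-proxy is listening on the parent instance. CID 3 is the parent instance's address as seen from the enclave; port 8000 is an assumed proxy port, not a fixed value:

```python
import socket

VMADDR_CID_PARENT = 3   # the parent EC2 instance, as addressed from the enclave
PROXY_PORT = 8000       # assumed vsock-proxy port on the parent

def open_egress_channel(cid: int = VMADDR_CID_PARENT, port: int = PROXY_PORT):
    # AF_VSOCK exists only on Linux kernels with vsock support, so this
    # sketch guards the call rather than assuming an enclave environment.
    if not hasattr(socket, "AF_VSOCK"):
        raise OSError("vsock is not supported on this host")
    sock = socket.socket(socket.AF_VSOCK, socket.SOCK_STREAM)
    sock.connect((cid, port))
    return sock
```

Note the asymmetry: the enclave can open this outbound channel, but nothing on the instance can open a connection into the enclave.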

To secure the sensitive code in the above example, the app is bundled into an image and deployed onto a Nitro Enclave. This provides the additional isolation and security required to run sensitive processes without worrying about external actors accessing the information. To be crystal clear, even if a user assumes root access on the EC2 instance, they are still unable to access the process/code running in the enclave, which provides the desired level of isolation given the sensitivity of the API transaction.
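One nuance worth noting (my addition, not covered in the article): the parent instance can still ask the enclave to fetch secrets over vsock, so the enclave app would typically also pin exactly which AppId/UserName pairs it is ever willing to request. A minimal sketch of that check, using the article's example account names:

```python
# Hypothetical allow-list enforced inside the enclave: only pre-approved
# identities can be requested, so a root user on the parent EC2 instance
# cannot broaden access even by driving the enclave app itself.
ALLOWED_ACCOUNTS = {
    ("Application1", "rds_app1_service1_user1"),
}

def authorize(app_id: str, user_name: str) -> bool:
    return (app_id, user_name) in ALLOWED_ACCOUNTS

print(authorize("Application1", "rds_app1_service1_user1"))  # the app's legitimate request
print(authorize("Application1", "rds_app1_service2_user2"))  # the threat actor's request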

Application to retrieve privilege secrets from CyberArk deployed onto Nitro Enclave
Threat-actor can’t get access to the Nitro Enclave and secure certificates on it

Conclusion

The use of Trusted Execution Environments (TEEs) provides the isolation and security required at the hardware level to run workloads securely. There are a number of use-cases that can benefit from Confidential Computing. Some of these include:

  • Securing sensitive secrets such as certificates (as in the example above)
  • Batch processing of financial data in the financial sector (banking and insurance) or health data in the healthcare industry
  • Training ML models by ingesting and processing data from different sources (even competitors) without worrying about exposing the underlying data
  • Processing data in government agencies — sharing data between agencies without worrying about exposure

I’m fascinated by this amazing technology, which is now available across a number of hardware and cloud providers, and I’m pleased to see the collaboration thriving between cloud and technology providers as they drive its evolution and adoption.

References

Confidential Computing Consortium

AWS Nitro Enclaves

Part 2 of this blog series will focus on a sample app deployed onto an AWS Nitro Enclave


Rahul Grover

AWS Ambassador | AWS Community Builder | Director, Platform Engineering