Dropout Labs wins the Confidential Computing Challenge!

Ian Livingstone
Cape Privacy (Formerly Dropout Labs)
May 2, 2019 · 3 min read


We’re very excited to announce that our project, TF Trusted, is the winning entry in the Google Cloud + Intel Confidential Computing Challenge. The competition was a rallying call for ideas that enable our most sensitive workloads to take advantage of the scalability, availability, and cost-effectiveness of the public cloud without leaking data.

We can protect data at rest and in transit through encryption, but we lack the tools and technology to keep data confidential during computation: a host computer must have plaintext access to the data to calculate a prediction or train a model. As a result, machine learning in the cloud is beyond the reach of industries such as finance, healthcare, and transportation, which hold some of society’s most sensitive data and most impactful applications.

We developed TF Trusted, an open-source framework built on top of Asylo and TensorFlow Lite, for computing a prediction without revealing the model or input vector to the host computer. This is achieved by performing the computation inside an Intel SGX enclave, a hardware trusted execution environment (also known as a secure enclave). The computation can be performed in whole, as a TensorFlow Lite model, or in part, by exposing the enclave’s computation as a custom TensorFlow operation for use in broader TensorFlow graphs.

Diagram of TF Trusted’s Architecture

This project is an exploration of how enclaves could be used to retain the confidentiality of data throughout the data science lifecycle. For example, they could be used alongside, or as an alternative to, secure multiparty computation (MPC), homomorphic encryption (HE), and tokenization techniques. Enclaves could generate multiplication triples for use in secure multiparty computation, or enable support for operations that are incompatible with existing cryptographic protocols (e.g., LSTM support, a faster native ReLU, or softmax).
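To make the triple-generation idea concrete, here is a minimal, illustrative sketch of how a Beaver multiplication triple lets two parties multiply secret-shared values. The “trusted dealer” role below is what an enclave could play; the function names and the choice of modulus are our own for illustration, not part of TF Trusted’s API.

```python
import random

P = 2**61 - 1  # prime modulus for the shared arithmetic (illustrative choice)

def share(v):
    """Split v into two additive shares mod P."""
    r = random.randrange(P)
    return r, (v - r) % P

def reconstruct(s0, s1):
    """Recombine two additive shares."""
    return (s0 + s1) % P

# Trusted dealer (the role an enclave could play): generate a triple with a*b = c,
# then hand each party one share of a, b, and c.
a, b = random.randrange(P), random.randrange(P)
c = (a * b) % P
a_sh, b_sh, c_sh = share(a), share(b), share(c)

# The parties hold shares of private inputs x and y.
x, y = 12345, 67890
x_sh, y_sh = share(x), share(y)

# Each party locally masks its input shares; the masked values d and e are
# then opened (they reveal nothing about x and y, since a and b are random).
d = reconstruct(x_sh[0] - a_sh[0], x_sh[1] - a_sh[1])  # d = x - a
e = reconstruct(y_sh[0] - b_sh[0], y_sh[1] - b_sh[1])  # e = y - b

# Each party computes its share of x*y = d*e + d*b + e*a + c;
# only party 0 adds the public d*e term so it is counted once.
z_sh = [
    (d * e + d * b_sh[0] + e * a_sh[0] + c_sh[0]) % P,
    (d * b_sh[1] + e * a_sh[1] + c_sh[1]) % P,
]

assert reconstruct(z_sh[0], z_sh[1]) == (x * y) % P
```

Generating `(a, b, c)` is the expensive part of protocols like SPDZ; producing the triples inside an enclave moves that cost out of the cryptographic protocol entirely.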

In the future, we plan to integrate enclaves as a backend for TF Encrypted, a community-driven open-source framework we’ve been developing for confidential machine learning in TensorFlow. The project’s goal is to democratize access to confidential machine learning techniques by taking what has existed only in research papers and making it available to data scientists and machine learning engineers in a familiar form (i.e., TensorFlow).

Diagram of trusted execution environments as a TF Encrypted backend.

This competition has been invaluable in raising awareness of secure enclaves and exploring how they can help solve real challenges facing companies trying to adopt the cloud. We’re very thankful to Google Cloud and Intel for organizing it and bringing this important topic into the public eye.

At Dropout Labs, we’re laser-focused on enabling companies and individuals alike to retain control of their data while accessing and creating machine intelligence. We believe this is the key to creating better algorithms, enabling new business models, and opening the door for society’s most impactful use cases of artificial intelligence.

You can get started today with confidential machine learning by following our Getting Started guide for TF Encrypted and by exploring TF Trusted.

About Dropout Labs

We are a team of machine learning engineers, software engineers, and cryptographers spread across the United States, France, and Canada. We’re working on secure computation to enable training, validation, and prediction over encrypted data. We see a near future where individuals and organizations will maintain control over their data, while still benefiting from cloud-based machine intelligence.

Follow Dropout Labs on Twitter and TF Encrypted on GitHub.

If you’re passionate about data privacy and artificial intelligence, we’d love to hear from you.
