Oasis Labs Partners with Meta to Assess Fairness for its AI Models using Cutting-edge Privacy Technologies

A first-of-its-kind initiative enabling inclusiveness and fairness

The Oasis Labs Team
Published in Oasis Labs
Jul 28, 2022


We are excited to announce our partnership with Meta, and the launch of a platform to assess fairness in Meta’s products, while protecting people’s privacy. As Meta’s technology partner, Oasis Labs built the platform that uses Secure Multi-Party Computation (SMPC) to safeguard information as Meta asks users on Instagram to take a survey in which they can voluntarily share their race or ethnicity.

The project will advance fairness measurement in AI models, which will positively impact the lives of individuals across the globe and benefit society as a whole. This first-of-its-kind platform will play a major role in an initiative that is an important step towards identifying whether an AI model is fair and allowing for appropriate mitigation.

How the platform will assess fairness in AI models

Meta’s Responsible AI, Instagram Equity, and Civil Rights teams are introducing an off-platform survey to people who use Instagram. Users will be asked to share their race and/or ethnicity on a voluntary basis.

The data, collected by a third-party survey provider, will be secret-shared with third-party facilitators so that neither the facilitators nor Meta can learn any individual's survey responses. The facilitators then compute the measurement using encrypted prediction data from AI models, cryptographically shared by Meta, and the combined, de-identified results from each facilitator are reconstituted by Meta into aggregate fairness measurement results. These cryptographic techniques enable Meta to measure for bias and fairness while giving the individuals who contribute sensitive demographic data a high level of privacy protection.
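To make the secret-sharing step more concrete, below is a minimal sketch of additive secret sharing over a prime field, the basic building block behind this style of SMPC aggregation. The number of facilitators, the field modulus, and the category layout are illustrative assumptions for the sketch, not details of the production platform.

```python
# Minimal sketch: additive secret sharing of one-hot survey responses so that
# facilitators only ever see random-looking shares, yet aggregate counts can
# still be reconstructed. Parameters below are illustrative assumptions.
import secrets

PRIME = 2**61 - 1          # field modulus (illustrative choice)
NUM_FACILITATORS = 2       # non-colluding share holders (assumption)
NUM_CATEGORIES = 3         # demographic categories in the survey (assumption)

def share(value: int, n: int = NUM_FACILITATORS) -> list[int]:
    """Split `value` into n additive shares that sum to value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares; only ever applied to aggregated values."""
    return sum(shares) % PRIME

# Each user's response is a one-hot vector over the categories.
responses = [
    [1, 0, 0],   # user A selected category 0
    [0, 0, 1],   # user B selected category 2
    [1, 0, 0],   # user C selected category 0
]

# Per-facilitator accumulators, one slot per category.
accumulators = [[0] * NUM_CATEGORIES for _ in range(NUM_FACILITATORS)]

for response in responses:
    for category, bit in enumerate(response):
        for facilitator, s in enumerate(share(bit)):
            # Each facilitator receives a uniformly random share, which by
            # itself reveals nothing about the individual response.
            accumulators[facilitator][category] = (
                accumulators[facilitator][category] + s) % PRIME

# Facilitators publish only their aggregated shares; combining them yields
# aggregate counts without exposing any single user's answer.
aggregate = [reconstruct([acc[c] for acc in accumulators])
             for c in range(NUM_CATEGORIES)]
print(aggregate)  # -> [2, 0, 1]
```

In a real deployment each facilitator operates independently and only the aggregated shares are ever combined, which is what allows fairness metrics to be computed without any single party seeing a raw response.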

To learn more about the platform, its objectives, and the launch, please read more here.

A vision of global inclusiveness and fairness in AI models for the benefit of millions

Meta and Oasis share a common vision around responsible AI and responsible use of data.

The use of these cryptographic techniques on the platform, at the scale at which they will be deployed, is unprecedented. This is the beginning of a new journey.

“We seek to ensure AI at Meta benefits people and society which requires deep collaboration, both internally and externally, across a diverse set of teams,” said Esteban Arcaute, Director of Responsible AI at Meta. “The Secure Multi-Party Compute methodology is a privacy-focused approach developed in partnership with Oasis Labs that enables crucial measurement work on fairness while keeping people’s privacy at the forefront by adapting well-established privacy-preserving methods.”

Together with Meta, we will explore further privacy-preserving approaches for more complex bias studies. Given the goal of reaching billions of people around the world, we hope to explore novel uses of emerging Web3 technologies, underpinned by blockchain networks. The aim is to provide greater global accessibility, auditability, and transparency around conducting surveys, gathering the data, and using it in measurement.

Professor Dawn Song, Founder of Oasis Labs, said: “We are excited to be the technology partner with Meta on this groundbreaking initiative to assess fairness in AI models, while protecting users’ privacy, using cutting-edge cryptographic techniques. This is an unprecedented use of these techniques for a large-scale measurement of AI model fairness in the real world. We look forward to working with Meta to build towards responsible AI and responsible data use for a fairer and more inclusive society.”

Oasis Labs Work & Mission

Responsible data usage and ownership have always been at the core of our vision. We understand that in a Web3 world no entity can take user data for granted, and we are building technologies that put data ownership and control in the hands of individuals.

Using blockchain, confidential computing, and privacy-preserving technologies, we aim to build platforms and products that advance individual privacy protection, data governance, and responsible data use. Oasis’ technologies focus on making it easier for developers to incorporate privacy-preserving data storage, governance, and computation.

Decentralization and Web3 can reach individuals across the world. Combined with data privacy, this lets companies reach a global audience, make privacy-preserving use of highly sensitive data, and build better products that treat everyone equally.

Stay Connected

If you would like to discuss potential partnerships and collaboration opportunities with Oasis Labs, please contact us here and stay up to date by following our Twitter account and subscribing to our newsletter.

If you would like to be kept up to date on developments with the Oasis Network, please join the Discord, follow us on Twitter, and subscribe to the newsletter.
