Beyond Disruption: Privacy Enhancing Technologies Can Deliver Financial Inclusion

Published in The Center for Effective Global Action (CEGA)
4 min read · Feb 16, 2023

In the age of big data, artificial intelligence, and machine learning, is it really possible to keep data private? Does it matter? Dan Cassara, Project Manager for the Digital Credit Observatory (DCO), explores these questions and more in part one of this two-part series on privacy enhancing technologies and their role in financial inclusion.

Photo: Tanzanian woman calling a client by smartphone (credit: yurakrasil)

While more than half of individuals in developing economies reported borrowing money in 2021, only 23 percent did so formally, either from a financial institution or through the use of a credit card or mobile money account. Applying for a formal loan has historically been difficult for many people, especially those who live far from banks or lack financial documentation. Today, applications for digital credit products are reviewed instantly, automatically, and remotely, enabling more borrowers to quickly access the financial system, but often with high fees and at the risk of infringements on their privacy. In the future, borrowers from São Paulo to New Delhi could quickly and remotely access credit without exposing private information like a contact list or location data, all while paying lower interest rates. Transparency and data sharing are needed to make this hypothetical future a reality, and trust and privacy protections, constructed through regulation or new technologies, need to be part of the solution.

The number of mobile subscribers in sub-Saharan Africa alone is projected to grow nearly 25 percent by 2025, representing an additional 120 million users. Each of these new subscribers generates personal data that is collected by mobile network operators and service providers. A few years ago, researchers discovered that certain non-financial data, often referred to as alternative data, could be used in place of traditional financial records to predict how likely people are to repay loans. For example, machine learning (ML) algorithms found patterns showing that making frequent international calls or having more Facebook friends than others in the same area was associated with a higher likelihood of repaying debts. This has enabled banks and fintechs to produce credit scores for hundreds of millions of people who own a mobile phone but lack the collateral or bank account that formal lenders traditionally require.
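To make the idea concrete, here is a minimal sketch of how a lender might score borrowers from alternative data. Everything in it is invented for illustration: the features (international calls, contact-list size, airtime top-ups), the coefficients, and the data are synthetic, and real scoring models are far more sophisticated than this plain logistic regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic alternative-data features for 1,000 hypothetical borrowers.
# Feature names and coefficients are invented for illustration only.
n = 1000
intl_calls = rng.poisson(2, n)   # international calls per month
contacts = rng.poisson(50, n)    # size of contact list
topups = rng.poisson(4, n)       # airtime top-ups per month

# Simulate repayment: odds rise with each feature (echoing the patterns
# described above), plus randomness.
logit = -2.0 + 0.3 * intl_calls + 0.02 * contacts + 0.2 * topups
repaid = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit a plain logistic regression by gradient descent (no external ML
# library needed); the fitted probability serves as the "credit score".
feats = np.column_stack([intl_calls, contacts, topups]).astype(float)
feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)  # standardize
X = np.column_stack([np.ones(n), feats])
w = np.zeros(X.shape[1])
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - repaid) / n

score = 1 / (1 + np.exp(-X @ w))
print("mean predicted repayment probability:", round(score.mean(), 3))
```

Note that every input here, calls, contacts, top-ups, is exactly the kind of personal data whose collection raises the privacy concerns discussed below.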

This combination of research insights and new data has helped catalyze the rise of innovative business models that have expanded access to financial services to millions of people, with still untapped potential to lower costs and increase accessibility. However, the widespread use of personal data, especially from economically disadvantaged people, raises concerns about the data’s collection and use, and the implications for the personal privacy of consumers. How do we protect private information and ensure new technologies lead to beneficial and equitable outcomes, particularly for those the financial system has historically harmed or excluded?

Central to protecting data privacy is developing a better understanding of how we can apply existing approaches for privatizing and safeguarding data. One group of technologies — Privacy Enhancing Technologies or PETs — could be paired with new datasets to enable data portability, information sharing, customized tools, improved market monitoring for regulators, and reduced cybersecurity and hacking risk. Though cryptographers began developing these technologies in the 1980s, it took recent advances in computing power to see them applied. For example, the US Census Bureau now uses differential privacy — strategically adding noise to statistics to obscure the underlying data that produced them — to protect the public. And, in the private sector, many big tech companies like Google, Microsoft, and Apple apply federated learning — running machine learning algorithms on multiple datasets stored locally on user devices rather than on a centralized dataset of many users’ data — to safeguard data.
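The noise-adding idea behind differential privacy can be sketched in a few lines. This is a toy Laplace mechanism, not the Census Bureau's actual system (which uses a more elaborate discrete mechanism and a carefully managed privacy budget); the statistic, counts, and epsilon values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def private_count(true_count, epsilon, rng):
    """Release a count with Laplace noise calibrated to sensitivity 1.

    Adding or removing one person changes a count by at most 1, so
    noise drawn from Laplace(scale = 1/epsilon) yields
    epsilon-differential privacy for the released statistic.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Illustrative statistic: how many borrowers in one village defaulted.
true_count = 17
for eps in (0.1, 1.0, 10.0):
    noisy = private_count(true_count, eps, rng)
    print(f"epsilon={eps:>4}: released count = {noisy:.1f}")
# Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
```

The tradeoff is visible in the epsilon parameter: any single noisy release may be far from the truth, protecting the individuals behind it, while aggregate accuracy can be recovered only by spending more of the privacy budget.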

Although PETs are gaining ground elsewhere, several barriers still block the path to greater adoption by fintechs. Financial service providers can deliver greater value to customers, and lower costs for themselves, by using PETs, but more testing is needed in real-world settings to understand the tradeoffs and legal implications. For example, credit scoring methods that use privatized data could provide much-needed consumer protection, but executives will need to understand how they affect profits. Moreover, firms could lower costs for the whole system by building shared infrastructure for detecting fraud or improving credit bureaus, but this would require data sharing that is not currently possible for both competitive and regulatory reasons.

Sharing data across platforms is becoming increasingly central to our digital lives, but unlocking the full potential of the ongoing data revolution will require greater transparency and usage of open data sharing agreements. In the digital financial services industry, new partnership models — for example, between banks and telecommunications companies — necessitate data sharing. And data sharing could play a key role in private sector innovation and government initiatives, such as utilizing telecommunications company data to monitor adherence to COVID-19 lockdown measures.

Despite the exciting possibilities of big data, many stories today reflect its recent challenges, like data misuse, data breaches, artificial intelligence (AI) exacerbating bias, and fragmented or outdated regulatory frameworks. Each of these cases harms people and erodes public trust in institutions and their ability to safeguard data. Digital citizens share data with an unspoken assumption of trust, and as the old adage goes, “trust takes years to build, seconds to break, and forever to repair.”

To rebuild trust and accelerate progress, financial service providers and regulators must work together to construct an infrastructure for transparency, data sharing, and privacy protection. And this process must include the voices of people whose data is being used to ensure their values are reflected in policies and that they benefit from the use of their data. Regulators around the world are moving forward with data privacy laws. To maximize the impact of these new protections, it will be important to leverage existing best practices and explore how PETs can enable a more secure future. We discuss this and more in part two.

CEGA is a hub for research on global development, innovating for positive social change.