Why addressing bias in machine learning is key to unlocking greater financial inclusion

Andrew Watkins-Ball · Published in The Launchpad · 6 min read · Apr 15, 2019

“Financial inclusion means that individuals and businesses have access to useful and affordable financial products and services that meet their needs — transactions, payments, savings, credit and insurance — delivered in a responsible and sustainable way.” — World Bank

At JUMO, our mission is to radically advance financial inclusion by bridging the gap between critical financial services and the people who need them most. Our customers are entrepreneurs, small businesses and tradespeople in emerging markets who have no traditional financial history and are therefore excluded from the mainstream banking ecosystem.

But with billions of people in emerging markets coming online for the first time, most of them via mobile phones, we now have an opportunity to build new forms of digital and financial identities that create new financial choices.

Doing this is hard. It requires making sense of unstructured data in a way that is simply not possible for humans to do manually. Instead, JUMO uses machine learning (ML) and artificial intelligence (AI) methods that can more accurately model a customer’s risk and connect them with appropriate financial choices. With this approach, we can reach millions of previously excluded people, but it will require much more work.

Financial exclusion is not straightforward. Cultural, gender and social nuances apply across markets and countries, adding a layer of complexity to any global effort to reduce poverty. That means any singular, algorithmic approach to the problem is likely to be prone to unintentional bias. To mitigate this, it’s important to include non-intuitive socio-cultural factors unique to the group of people whose data we are accessing.

Consider a farmer in the agricultural sector just before an anticipated drought. Under a traditional credit model, their risk profile, and therefore the price of lending, goes up just as funding and crop yields decrease. If the farmer were to access a loan through the traditional credit risk model, they would carry a heavier repayment burden and be more likely to default, and the model becomes a self-fulfilling prophecy. This is a stark reminder of the real-world impact of algorithmic decision-making.

Put simply: there is a better way to manage this situation.

What if we could accurately predict the drought, drop the price of the loan, and help the farmer get through the difficult period? This would increase the likelihood of the farmer qualifying for a loan at a rate that can reasonably be repaid, which in turn decreases the risk of default.
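To make the idea concrete, here is a minimal, purely illustrative sketch in Python: the pricing rule, threshold and numbers are hypothetical, not a description of JUMO’s actual models.

```python
# Illustrative sketch only: a forward-looking pricing rule that lowers the
# interest rate when a drought is forecast, instead of raising it.
# All names, thresholds and numbers here are hypothetical.

def price_loan(base_rate: float, drought_probability: float) -> float:
    """Return an annual interest rate adjusted for a forecast drought.

    A traditional model would treat a likely drought as higher risk and
    raise the rate; here we do the opposite, discounting the rate so the
    repayment burden stays manageable through the difficult season.
    """
    if drought_probability > 0.6:   # high chance of drought (assumed cutoff)
        return base_rate * 0.8      # 20% discount to ease repayment
    return base_rate

# Example: a farmer facing a likely drought gets a cheaper, not costlier, loan.
print(price_loan(base_rate=0.25, drought_probability=0.75))  # -> 0.2
```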

It’s essential, then, that while we work to expand financial access, we also continue to reflect on the data we use, the machine learning processes we apply and the impact these have on the markets we serve. We recently focused on improving how we mitigate bias in our modeling and decisioning processes. Through this work, we developed a framework that we will continuously apply to what we do, and that may be useful for others hoping to mitigate unintentional bias in their own models.

Measuring impact and marginalization

The challenge of financial exclusion is vast, which is why global organizations like the United Nations are working alongside companies to tackle the issue. Naturally, to solve a problem you first need to understand its magnitude.

Perfect inclusion would mean that we extend financial choices to 100% of ‘good’ customers, that is, customers who intend, and have the ability, to repay their loans. At JUMO, we measure ourselves against the extent to which these theoretically ‘good’ customers can access financial products from local banks via our platform. To test our impact, we approve a representative sample of otherwise-declined customers, a technique often called rejection sampling, to quantify an upper threshold of what perfect financial inclusion would look like.
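A minimal sketch of how such a measurement could work, with hypothetical data and function names rather than our production pipeline: approve a small random hold-out of otherwise-declined applicants, observe repayment, and use the result to estimate how many ‘good’ customers the current policy turns away.

```python
import random

# Illustrative sketch: estimate the share of 'good' customers among applicants
# the current model declines, by approving a small random hold-out and
# observing repayment. The data and sampling rate below are hypothetical.

def sample_declined(declined_applicants, rate=0.05, seed=42):
    """Randomly approve a fraction of declined applicants as a control group."""
    rng = random.Random(seed)
    return [a for a in declined_applicants if rng.random() < rate]

def estimate_missed_good_rate(control_group_outcomes):
    """Fraction of the control group that repaid: an estimate of how many
    'good' customers the current decisioning policy excludes."""
    if not control_group_outcomes:
        return 0.0
    return sum(control_group_outcomes) / len(control_group_outcomes)

declined = [f"applicant_{i}" for i in range(200)]   # hypothetical IDs
control = sample_declined(declined)                 # ~5% get approved anyway
# After observing repayment (1 = repaid, 0 = defaulted) for the control group:
outcomes = [1, 1, 0, 1, 1, 1, 0, 1]
print(f"{len(control)} control approvals; estimated missed 'good' rate: "
      f"{estimate_missed_good_rate(outcomes):.0%}")
```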

Continuously interrogating user data gives us a flow of unbiased information, which we can use to minimize historical bias (the bias that arises from past decisioning) and avoid the unintended consequence of inaccurately excluding people. It’s important, however, that this process of measurement is regular and ongoing. It ensures our business remains aligned with our overall mission and avoids evaluating customers on the basis of unintended bias that has entered the system.

Better data means better prediction

It’s important to remember that the result of any machine learning model is only as good as the data it is trained on. For example, we know that a disproportionate percentage of our platform users are male (70%, compared to 30% female customers). But when we combine our data with other research, we find that women are only 10% less likely than men to own a mobile phone, and that gender plays a less significant role in the uptake and use of digital money.

So why were women unintentionally marginalized?

To understand this problem we needed to ask: what other information could we use to supplement current datasets that would help us differentiate between individuals in an apparently homogeneous group?

In this case, it was the understanding that in some markets women use mobile phones and mobile wallets less frequently than men. That is largely down to socio-cultural factors, including a lack of trust in mobile banking and prohibitive transaction fees. By taking the insight from these data and applying it differentially to women, we can develop more appropriate risk criteria that may increase equality of opportunity.
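One illustrative way to apply that kind of insight, sketched below with hypothetical data: normalize a usage-frequency feature within each group, so that a woman who transacts less often than men overall, but more than her peers, is not penalized by criteria calibrated on male behavior.

```python
import pandas as pd

# Illustrative sketch: normalize a usage-frequency feature within each gender
# group, so risk criteria compare customers to their peers rather than to a
# population dominated by one group. The data below is hypothetical.

df = pd.DataFrame({
    "gender":      ["M", "M", "M", "F", "F", "F"],
    "wallet_txns": [40,   60,  50,  15,  25,  20],  # transactions per month
})

# Z-score the feature within each group: a value of 0 means "typical for
# your peer group", regardless of the group's absolute usage level.
df["txns_within_group"] = (
    df.groupby("gender")["wallet_txns"]
      .transform(lambda x: (x - x.mean()) / x.std())
)

print(df)
```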

ML and intuition for real-world applications

In the technology industry we often view the world through data, 1s and 0s, and in doing so we risk ignoring the human impact of our work. To make real progress in advancing predictive methodologies, we need algorithms and intuition; we need to look beyond the short term, weigh the trade-off between false positives and false negatives, and build products fit for people.
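That trade-off becomes concrete when choosing a decision threshold. In the toy sketch below (synthetic scores and labels, not portfolio data), a false positive is a funded loan that defaults and a false negative is a good customer excluded; moving the threshold trades one for the other.

```python
import numpy as np

# Illustrative sketch: sweep a decision threshold and count false positives
# (approved loans that default) against false negatives (good customers we
# exclude). Scores and labels are synthetic toy data.

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1000)                       # 1 = would repay
scores = np.clip(labels * 0.3 + rng.normal(0.4, 0.25, 1000), 0, 1)

for threshold in (0.3, 0.5, 0.7):
    approved = scores >= threshold
    false_pos = int(np.sum(approved & (labels == 0)))        # defaults we funded
    false_neg = int(np.sum(~approved & (labels == 1)))       # good customers excluded
    print(f"threshold={threshold:.1f}  FP={false_pos:4d}  FN={false_neg:4d}")
```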

No one model can address decades of complexity or the richness of human social networks and their subtle underlying differences. To advance, we need to continuously challenge traditional thinking, and our own.

In credit, which has remained mostly unchanged for decades, the way we view and assess risk has huge potential for progress. Machine learning can and will play a transformative role in that effort, so we’re predisposed to think about the upside. At the same time, we recognize the risks. This is why we’re focused on using AI to reduce bias in financial decision-making, knowing that, applied poorly, it could have the exact opposite effect.

A compassionate approach to financial inclusion?

By adopting this approach, we recognize the need for an evolving set of criteria that maximizes individuals’ ability to express their potential. In doing so, we may finally break the direct link between credit scores and wealth by moving beyond a one-size-fits-all evaluation of risk. This means a merchant in Kampala taking $100 of working capital and a merchant borrowing $10,000 in Sacramento are each evaluated against the risk-assessment criteria most appropriate to them, opening up a world of financial choice.

The unintended consequences and bias that arise from decisioning, often involving ML, are not only the result of the algorithm or modeling process; they accumulate across every touch point that leads to the final outcome. To address bias you can’t simply observe and correct the symptoms. You have to understand the end-to-end process and dissect the entire system to get at the root cause. And once you’ve done this, you need to keep doing it: continually testing, learning and optimizing to remove bias from your models.
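As a sketch of what that continual testing can look like in practice (the metric, group labels and tolerance below are assumptions for illustration, not a complete fairness methodology), one simple recurring check compares approval rates across groups and flags drift:

```python
from collections import defaultdict

# Illustrative sketch: a recurring audit that compares approval rates across
# groups and flags drift. The metric, groups and tolerance are assumptions
# for illustration only.

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> {group: rate}."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def audit(decisions, tolerance=0.10):
    """Flag when the gap between the highest and lowest group approval
    rates exceeds the tolerance. Run this on every new decision batch."""
    rates = approval_rates(decisions)
    gap = max(rates.values()) - min(rates.values())
    return {"rates": rates, "gap": gap, "flag": gap > tolerance}

batch = [("M", True), ("M", True), ("M", False),
         ("F", True), ("F", False), ("F", False)]
print(audit(batch))  # a gap of ~0.33 here would trigger the flag
```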

Ultimately, this can be instrumental to empathetically employing technology to expand and deepen financial inclusion.

Andrew Watkins-Ball is the founder and CEO of JUMO, the largest and fastest-growing technology platform for operating inclusive mobile financial services marketplaces in emerging markets. JUMO partners with forward-thinking banks and mobile network operators to connect consumers and small businesses with financial opportunity. JUMO combines data and technology to deliver products designed to reach and fit the 80% of the world’s population that is un(der)served by traditional financial services.

Several members of the JUMO team contributed to this work, including Ricki Davimes, Anthony la Grange, Clarissa Johnston, Ben Gidlow, Natu Lauchande, Niklas von Maltzahn and Paul Whelpton.
