Algorithms and Impact: Early CEGA Evidence on Digital Credit featured at the ASSA

CEGA · Published in The Center for Effective Global Action · Jan 27, 2021

This post summarizes algorithm development and digital credit impact insights from a digital credit-focused panel at the Allied Social Science Associations (ASSA) annual meeting. It was written by CEGA Program Associate Marisa McKasson, with contributions from Senior Program Manager Leah Bridle. CEGA thanks DCO Scientific Director Jonathan Robinson for organizing the panel.

Across the world, 1.7 billion people lack access to formal financial services, yet two-thirds of adults own a mobile phone. Digital credit uses alternative data sources, such as the data produced by an individual’s mobile phone, to create alternative credit scores and distribute loans to consumers in an instant, automated, and remote fashion. Are these products fairly distributing loans to borrowers who would benefit from this financial service?

Since 2016, CEGA’s Digital Credit Observatory (DCO) has funded researchers to both design and test the algorithms underlying digital credit and rigorously evaluate how this new type of loan impacts low-income consumers. This year, we’ll begin sharing learnings and policy insights from our sixteen funded studies spanning three continents.

On January 4, 2021, we kicked things off with a dedicated panel at the Allied Social Science Associations (ASSA) annual meeting featuring four of our DCO-funded evaluations. The panel, composed entirely of DCO-funded research, was the first opportunity for DCO teams to present preliminary results in direct conversation with one another and to share early evidence with the ASSA scientific community. In particular, we discussed:

How do we design algorithms that fairly and effectively assess credit-worthiness?

Two potentially critical problems that could arise when using algorithms to make loan decisions (and other decisions using machine learning) are:

  1. When not transparent, algorithms make “black box” decisions: this limits individuals’ and regulators’ ability to assess potential bias or errors (in this case, are there borrowers who are unfairly refused a loan?)
  2. However, if too transparent, algorithms can be manipulated: in the case of digital credit, users could digitally behave in ways that artificially increase the assessment of their credit-worthiness, which would lead lenders to extend credit to borrowers with less capacity or intention to repay than predicted. This could undermine the viability of providing digital loans.

Björkegren (Brown) presented this framing to discuss results from joint work¹ on “manipulation-proof” algorithms. To determine whether these problems could be mitigated, the research team built an algorithm that assumes people will game their behavior to obtain their desired outcome, then tested it in an experiment in Kenya. They found that new users in the experiment manipulated their digital behavior when incentivized, but that it is possible to design decision rules that are more transparent without sacrificing efficiency. To learn more, read the full working paper.
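To make the manipulation problem concrete, here is a toy sketch — not the authors’ actual model; all feature names, weights, and effort costs below are invented. With a fully transparent linear score, applicants can shift effort into whichever feature buys the most score per unit of effort:

```python
# Toy illustration of gaming a transparent credit score. The invented
# weights and costs are for this sketch only; the paper builds an
# estimator that anticipates this behavior, not the heuristic shown here.

def score(features, weights):
    """Simple linear credit score over behavioral features."""
    return sum(weights[k] * features[k] for k in weights)

def game(features, weights, costs, effort=1.0):
    """If weights are public, an applicant pours effort into the feature
    with the highest score gain per unit of manipulation cost."""
    best = max(weights, key=lambda k: weights[k] / costs[k])
    gamed = dict(features)
    gamed[best] += effort / costs[best]
    return gamed

weights = {"daily_calls": 2.0, "topup_amount": 1.0}
costs = {"daily_calls": 0.5, "topup_amount": 4.0}  # calls are cheap to fake
applicant = {"daily_calls": 3.0, "topup_amount": 2.0}

honest = score(applicant, weights)                       # 2*3 + 1*2 = 8.0
gamed = score(game(applicant, weights, costs), weights)  # 2*5 + 1*2 = 12.0
```

Under these invented numbers, the gamed score rises from 8.0 to 12.0 with no real change in repayment capacity — exactly the distortion a manipulation-proof decision rule is designed to anticipate.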

Vendor in the Dominican Republic. (Credit: Sean Higgins)

Sean Higgins (Northwestern) also presented his joint research on gender-sensitive credit scoring.² Focusing on the “black box” problem, he called attention to the fact that traditional credit scoring models use data that may be biased against women, especially lower-income women with limited credit histories. In their proof-of-concept work, 80 percent of low-income women in the Dominican Republic score higher with an algorithm that takes into account both gender and the ways gender interacts with the other variables in the model to better predict creditworthiness. The team aims to use call detail records and app use data to refine the new algorithm and then test whether increased access to credit positively impacts women.
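As a rough sketch of why interaction terms matter — with coefficients and feature names invented here, not taken from the study’s model — a gender × feature interaction in a logistic scoring model lets a predictor carry a different slope for women, which can raise scores for applicants a pooled model underrates:

```python
import math

# Invented coefficients for illustration only; the study's model and its
# data (call detail records, app use) are not reproduced here.
def predict_repayment(deposits, is_female, use_interaction):
    b0, b_dep, b_fem, b_int = -1.0, 0.3, -0.2, 0.25
    z = b0 + b_dep * deposits + b_fem * is_female
    if use_interaction:
        z += b_int * deposits * is_female  # gender-specific slope
    return 1.0 / (1.0 + math.exp(-z))      # logistic link

# A hypothetical female applicant with 4 units of mobile-money deposits:
pooled = predict_repayment(4.0, 1.0, use_interaction=False)     # z = 0.0
interacted = predict_repayment(4.0, 1.0, use_interaction=True)  # z = 1.0
```

With these made-up numbers, the interaction term lifts the applicant’s predicted repayment probability from 0.50 to about 0.73, illustrating how a model that encodes gender-specific relationships can score some women higher than one that pools everyone.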

Stay tuned for more detailed reports outlining how digital credit algorithms can be designed to be more transparent, less biased, and better for consumers!

But how does digital credit determined by algorithms actually impact consumers?

Even once the potential problems with credit-scoring algorithms are addressed, concerns around consumer protection remain, including but not limited to:

  1. Does credit delivered within minutes or hours lead to high default rates?
  2. Do consumers know the terms that come with these automatic loans?
  3. Do digital loans lead to overindebtedness?

Alfredo Burlando (University of Oregon) shared joint work³ aimed at understanding the first concern. In this evaluation, researchers studied the effect of imposing delays in the delivery of digital credit. They found that when consumers had to wait longer for their credit, repayment increased by eight percent, a twenty-one percent decrease in the default rate. To learn more about this exciting study, including what we can and can’t draw from this analysis for other applications, register for a DCO webinar with the research team at 8:00am PST on February 1st, 2021.

Airtel Malawi Village center in rural Malawi. (Credit: Guillaume Kroll)

Valentina Brailovskaya (IDinsight) presented joint work⁴ exploring the second and third concerns in Malawi. The research team found that consumers liked the local digital credit product, even with its high interest rate, but generally did not know its terms and conditions. An interactive voice response training increased consumers’ knowledge about the product, and this increased knowledge led consumers to borrow more. Overall, the team found substantial demand for digital credit, some evidence of positive effects, and no evidence of negative effects (such as worsened financial wellbeing).

We look forward to summarizing these important results in greater detail as the researchers finalize their analyses. In the next year, many of the studies funded under the DCO portfolio may generate insights that policymakers, practitioners, businesses and funders can use to improve digital credit and related financial services. CEGA will highlight lessons emerging from individual evaluations and across multiple studies, as well as the drivers that help explain new findings — and we will continue to share results as they become available. In the meantime, please subscribe to our newsletter, listen to installments of our DCO Webinar Series, register for upcoming events, and stay tuned on this blog.

Do you have questions, related evidence, or connections related to these studies or the policy implications of digital credit? We invite you to reach out to the Digital Credit Observatory team at digitalcredit@berkeley.edu.

[1] With Joshua Blumenstock (UC Berkeley, DCO Scientific Director) and Samsun Knight (Brown).

[2] With Joshua Blumenstock (UC Berkeley, DCO Scientific Director), Laura Chioda (UC Berkeley), and Paul Gertler (UC Berkeley).

[3] With Michael Kuhn (University of Oregon) and Silvia Prina (Northeastern).

[4] With Pascaline Dupas (Stanford) and Jonathan Robinson (UC Santa Cruz).

CEGA is a hub for research on global development, innovating for positive social change.