
Up for debate: Should Mac users get better rates on loans?

The Brookings Institution
May 23, 2017

By: Aaron Klein

New financial technology (FinTech) promises to provide credit more efficiently, effectively, and fairly to millions, including minorities who continue to be underserved by traditional financial institutions. However, the same FinTech that shows this promise is testing 1960s- and 1970s-era anti-discrimination laws and regulations in ways never before contemplated. This conflict will ultimately require a rethinking of how we define and implement anti-discrimination laws in a FinTech future.

To better understand this dilemma, examine a simple, real-world question: When two people apply for a credit card (for example), should it be legal to offer them different interest rates based solely on whether they are using a Mac or a PC? Would your answer change if you knew that there were racial differences between Mac and PC users? Gender differences? Or if the type of computer you use were correlated with your likelihood of repayment?

As FinTech lenders apply big data and non-traditional criteria to determine creditworthiness, these types of questions will need to be answered:

  • What information should lenders be allowed to use when providing access to credit?
  • How can regulators set and enforce those rules?
  • At what point does regulation to prevent discrimination, a laudable and worthwhile goal, impinge upon risk-based pricing?

BACKGROUND ON PRICE DISCRIMINATION

For the Mac/PC thought experiment, assume the following two things are true about Mac users. First: all other things being equal (income, debt, education, assets, etc.), Mac users are slightly better credit risks; even after controlling for other factors, big-data analysis shows that Mac users are more likely to pay you back. Second: Mac users are disproportionately white. [1]

In the world of online lending, there is a trade-off between three key elements: cost, speed, and veracity of information. Borrowers can provide data, and lenders can either accept it as true or spend money and time verifying it. Lenders can pull data from consumers’ histories, but both traditional credit reporting and non-traditional sources such as social media cost time and money. And for many consumers, the information is either unavailable or insufficient. But there is one piece of data that is available immediately, costs nothing, and is known with certainty: whether the borrower is using a Mac or a PC.
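
For a web-based application flow, the sketch below shows how little it takes to collect this signal: the borrower’s browser volunteers it with every request. This is illustrative only; the `classify_device` helper and the sample User-Agent string are my own assumptions, not any lender’s actual code.

```python
# Minimal sketch: inferring Mac vs. PC from the User-Agent header that a
# browser already sends with every page request. Illustrative assumption,
# not any lender's production code.

def classify_device(user_agent: str) -> str:
    """Return 'mac', 'pc', or 'unknown' from a raw User-Agent string."""
    ua = user_agent.lower()
    if "macintosh" in ua or "mac os x" in ua:
        return "mac"
    if "windows" in ua:
        return "pc"
    return "unknown"

# Example: a typical Safari-on-Mac User-Agent string.
print(classify_device(
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_4) AppleWebKit/603.1.30"
))  # -> "mac"
```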

Outside of lending, companies can charge consumers different prices based on the type of computer they use. Orbitz has steered Mac users to more expensive hotels. Home Depot has charged slightly higher prices to shoppers using Android devices.

But the laws that govern pricing for goods and services differ from those that govern lending, and FinTech lenders will have to comply with existing laws, just as regulators and judges will have to decide how to apply rules that predated the concept of algorithmic lending.

Offering different prices to different customers is called price discrimination. Price discrimination forms the basis for underwriting, which in turn is the basis for lending. The widespread practice of charging higher interest rates to riskier borrowers, known as risk-based pricing, is generally supported by business and government as the proper way to allocate credit.
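
As a stylized illustration of what risk-based pricing means in practice (the formula and every number below are simplifying assumptions for exposition, not an actual pricing model), a lender might set each borrower’s rate to cover its cost of funds, operating costs, and that borrower’s expected loss, plus a margin:

```python
# Stylized risk-based pricing: rate = cost of funds + operating cost
# + expected loss (probability of default x loss given default) + margin.
# All inputs are illustrative assumptions.

def risk_based_rate(prob_default: float,
                    loss_given_default: float = 0.9,
                    cost_of_funds: float = 0.02,
                    operating_cost: float = 0.03,
                    margin: float = 0.02) -> float:
    expected_loss = prob_default * loss_given_default
    return cost_of_funds + operating_cost + expected_loss + margin

print(f"{risk_based_rate(0.04):.1%}")  # lower-risk borrower -> 10.6%
print(f"{risk_based_rate(0.08):.1%}")  # higher-risk borrower -> 14.2%
```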

However, some forms of price discrimination in lending are illegal, such as charging higher interest rates to borrowers on the basis of their race or gender. Regardless of whether any data or models show differing risk, it is illegal to change the terms of credit for these protected classes. There is a line between price discrimination and plain discrimination.

ANTI-DISCRIMINATION LAWS FOR LENDING

The laws governing discrimination in the provision of credit are different from those governing products or even other financial services. Many think of gender in the same way as race: you cannot discriminate against people on either basis. However, we allow price discrimination on the basis of gender all the time when it comes to auto insurance. As anyone with teenage children can attest, insurance companies charge young male drivers higher premiums than their female counterparts.

The criteria in lending differ substantially, and there are good reasons why, including the concept of disparate impact arising from the Fair Housing Act of 1968 and the Equal Credit Opportunity Act of 1974 (ECOA). Society decided to outlaw differing prices and availability of credit based on a variety of factors, regardless of whether those factors interact with risk-based pricing. ECOA:

“Prohibits creditors from discriminating against credit applicants on the basis of race, color, religion, national origin, sex, marital status, age, because an applicant receives income from a public assistance program, or because an applicant has in good faith exercised any right under the Consumer Credit Protection Act.”

Thus, government makes clear the set of criteria that are out of bounds. These criteria are often those subject to claims of discrimination in other contexts, but the law also specifically prohibits using the receipt of public assistance income, or the exercise of consumer rights, in the credit calculation. Note that many of these factors can be used in other forms of pricing: age for health insurance, sex for auto insurance, source of income for whether a store accepts food stamps.

One reason why lending is treated differently is that the provision of credit can be, and has been, used to segregate neighborhoods through mortgages and to impede the formation of businesses owned by people of color through business and other forms of lending. This can occur both by reducing or restricting access to credit (loan denial) and by imposing higher costs (fees, interest rates) on a targeted group. The very term redlining has its roots in race-based discriminatory lending. Unfortunately, the history of racial discrimination in lending is long and ongoing. During the recent financial crisis, certain subprime and predatory loans and products were targeted at minorities, and people in identical economic situations were steered to different products on the basis of their race. Government policies have not always aimed to combat this: federal credit programs actively promoted racial segregation for decades, including through the 1960s. The need for clear and enforceable laws, and for judicial review of both private and government policy, is well founded.

Lenders use various types of data in determining borrower risk, which in turn drives the interest rate. Though some of that data, such as income and assets, is correlated with race, it is still permitted because it predicts repayment. Attempts to use proxy statistics to price discriminate based on race or any other impermissible factor are not allowed.

Lenders cannot use data that are solely correlated with race and are not predictive of repayment. Data used must have a legitimate business necessity. For data that are correlated with both race and repayment, the goal is to find the best data that captures repayment with the minimum correlation to race. As the Federal Deposit Insurance Corporation (FDIC) states: “If a policy or practice that has a disparate impact on a prohibited basis can be justified by a business necessity, it still may be discriminatory if an alternative policy or practice could serve the same purpose with less discriminatory effect.”
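
One rough way to operationalize that FDIC language is sketched below. The data are randomly generated, and the two correlations are my own simplified stand-ins for “predictive power” and “discriminatory effect,” not a regulatory test. The idea is to prefer a candidate variable that preserves most of the relationship with repayment while carrying less of the relationship with the protected class.

```python
# Sketch of a "less discriminatory alternative" comparison. Synthetic data and
# the correlation-based screen are illustrative assumptions, not a legal test.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
protected = rng.integers(0, 2, n)   # 1 = member of a protected class
repaid = rng.integers(0, 2, n)      # 1 = loan repaid (synthetic)

# Two hypothetical candidate variables with different trade-offs.
candidate_a = 0.6 * repaid + 0.5 * protected + rng.normal(0, 1, n)
candidate_b = 0.6 * repaid + 0.1 * protected + rng.normal(0, 1, n)

for name, x in [("candidate_a", candidate_a), ("candidate_b", candidate_b)]:
    pred = np.corrcoef(x, repaid)[0, 1]      # proxy for predictive power
    disp = np.corrcoef(x, protected)[0, 1]   # proxy for disparate correlation
    print(f"{name}: corr with repayment {pred:+.2f}, "
          f"corr with protected class {disp:+.2f}")
```

By construction, the second candidate tracks repayment about as well as the first while carrying far less of the protected-class signal; that is the kind of trade-off the FDIC language points toward.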

Disparate impact is a key term and was the subject of a recent Supreme Court case, Texas Department of Housing and Community Affairs v. Inclusive Communities Project (2015), in which the Court held in a 5–4 decision that price discrimination criteria can have a disparate impact, and hence be impermissible, even if there was no intent on the part of the lender to discriminate on the basis of race. However, the Court maintained that there must be more than just a racial disparity.

BACK TO THE QUESTION OF A BORROWER WITH A MAC VS. ONE WITH A PC

As of today, the question of whether it is legal to charge differential interest rates to Mac and PC users has no clear answer. If the brand of computer someone uses is correlated with race and repayment, is there another data point a lender could use that provides similar information in a less discriminatory way? If not, is there enough of a legitimate business purpose to justify charging consumers different rates? Is there enough of a racial disparity to trigger a disparate impact finding?

Absent a litigated case, it is hard to know. Given the uncertainty surrounding the issue, and the potential that a lender who does this and is challenged will face legal and reputational costs in excess of realized gains, it is likely that many lenders will simply not use this data. Of course newer lenders — such as FinTech firms who face lower reputational costs, who believe they are less likely to face scrutiny, or who simply lack knowledge of the legal ambiguity — may employ this tactic.

On to the key question: Should this be allowed? One way to answer is to consider what happens if it is not. If lenders are not allowed to use information about the type of computer a person owns when setting rates, Mac and PC users will be offered the same rate. Ultimately, this means Mac users would cross-subsidize PC users, even though Mac users are better credit risks and, some would argue, deserve a discount. The blended rate faced by everyone would be slightly higher than what Mac users would otherwise face and slightly lower than what PC users would face. Overall, lending terms would be more equal but less specifically risk-based.
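
A toy calculation makes the cross-subsidy concrete. The group shares, default probabilities, and pricing rule are assumed purely for illustration, reusing the stylized formula from the earlier sketch: pooling two groups with different risk produces one blended rate that sits between the two group-specific rates.

```python
# Toy cross-subsidy calculation. Group shares, default probabilities, and the
# simple pricing rule are illustrative assumptions only.

LOSS_GIVEN_DEFAULT = 0.9
OTHER_COSTS = 0.07  # funding + operating + margin, as in the earlier sketch

def rate(prob_default: float) -> float:
    return OTHER_COSTS + prob_default * LOSS_GIVEN_DEFAULT

share_mac, pd_mac = 0.40, 0.04   # hypothetically better credit risks
share_pc,  pd_pc  = 0.60, 0.06

pooled_pd = share_mac * pd_mac + share_pc * pd_pc
print(f"Mac-only rate: {rate(pd_mac):.1%}")      # 10.6%
print(f"PC-only rate:  {rate(pd_pc):.1%}")       # 12.4%
print(f"Blended rate:  {rate(pooled_pd):.1%}")   # 11.7%
```

Nothing here says what the right answer is; it only shows the mechanical consequence of pooling: the better-risk group pays somewhat more than its own risk would warrant, and the worse-risk group pays somewhat less.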

Another way to answer the question is to ask whether there is a racial underpinning to the choice between a Mac and a PC. My own opinion is that while this path may be enticing, it is fraught with danger and should be rejected. To me, there is nothing ‘racial’ about the decision to have a Mac or not. To the extent there are racial brand issues, Apple seems to have pursued a multi-ethnic appeal with much of its advertising. The reason this approach should be rejected lies in the subjectivity of that last sentence. Financial regulators should not be put in the position of opining on the racial characteristics of users of major product brands. Doing so invites a world of mistakes, litigation, and arbitrary decisions. Do we want a government list of products that have too high a racial correlation (certain beauty products or magazines), while others are considered officially ‘race-neutral’?

QUESTIONS FOR REGULATORS MOVING FORWARD

Innovation and competition are factors to be embraced in financial services regulation, and FinTech’s rapid expansion is built on those pillars. Eventually, FinTech companies will realize that there is potential to profit from arbitraging this difference, or similar ones. They may do so directly through pricing, indirectly through changes to their advertising or customer-acquisition strategies, or even accidentally, by deploying a system that teaches itself these relationships and acts on them on its own.
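
To make the ‘accidental’ channel concrete, here is a sketch under assumed, purely synthetic data; the feature names and the logistic model are mine, not any firm’s system. If device type is simply included among a scoring model’s inputs, the model will assign it weight whenever it carries signal about repayment, whether or not anyone decided to price on it.

```python
# Sketch: a generic scoring model quietly learns to use device type if the
# signal exists in the training data. Data and features are synthetic
# assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000
income = rng.normal(50, 15, n)        # thousands of dollars (synthetic)
uses_mac = rng.integers(0, 2, n)

# Assume (as in the thought experiment) Mac use carries a small independent
# lift in repayment probability after controlling for income.
logit = -2.0 + 0.04 * income + 0.3 * uses_mac
repaid = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([income, uses_mac])
model = LogisticRegression(max_iter=1000).fit(X, repaid)
print(dict(zip(["income", "uses_mac"], model.coef_[0].round(3))))
# A nonzero weight on `uses_mac` appears automatically; no one had to decide
# to "use" the device signal for it to affect scoring downstream.
```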

When this happens, policymakers will be forced to respond. They should begin thinking about these questions today, considering how best to apply anti-discrimination principles, laws, and regulations to the FinTech world. Congress should task the GAO with analyzing the issue, building upon the recent GAO FinTech study, which covered much of the FinTech space but did not delve into this area.

Financial regulators, particularly the OCC, which has taken a leading role with its so-called FinTech charter proposal, should produce working papers building on real-life examples such as the Mac vs. PC question to explore these issues. Public comment should be requested from all parties, with a focus on engaging traditional banks, FinTech companies, and consumer advocates to learn more from people in the trenches producing new lending products and tackling racial discrimination in lending. Having an open and honest conversation about race, lending, and financial technology is not easy, but it is necessary if we are to harness the best that FinTech has to offer.

YOUR OPINION?

With this background (and some of my opinions), how would you decide this question: Should it be legal to offer borrowers different interest rates solely on the basis of whether they are using a Mac or a PC? Click yes, no, or still unsure on the poll below to cast your vote.

[1] These two assumptions have been stated and verbally confirmed by multiple financial institutions and consumer advocates I have spoken with; I have never heard anyone contest them, though I have not been able to find data that control for income. Yet the thought experiment provides value even if these specific assumptions are violated, as one could substitute others (e.g., state of birth, race, and likelihood of loan repayment). For a longer list of how Mac and PC users differ on a variety of traits, see: https://flowingdata.com/2011/04/26/mac-vs-pc-people/.
