1. Credit explained: how it all began
Welcome to chapter one of our mini-series: credit explained. Credit has existed in civilisations from as early as records began; in this chapter we explore the origins of this age-old concept.
Defined as ‘a method of paying for goods or services at a later time’, credit is thought to date back to around 3500 BC and the ancient region of Sumer in Mesopotamia, where it is believed to have been offered as agricultural loans between individuals.
The concept of credit reporting, however, came from Britain, where a group of nineteenth-century London tailors is said to have swapped information about customers who failed to pay their debts.
It was the US that went on to commercialise the concept at the turn of the twentieth century, allowing American families to buy General Motors cars on credit through the General Motors Acceptance Corporation (GMAC). After cars, all manner of household goods quickly followed: washing machines, furniture and electricals all swiftly became available to the consumer on credit.
The advent of automation
Initially, paper records were kept to track consumer credit commitments, with agents using local newspapers, employers and court documents to update files as consumers’ situations changed.
Essentially, if a lender wanted more information on an individual’s creditworthiness, they would simply ask. The ability to take on additional credit was therefore judged against a series of established markers: collateral (an assessment of available assets), capacity (the ability to repay debt based on current financial commitments) and character (honesty and reliability in repaying debt).
But as the 1950s arrived, so too did automation. The existing reputation-based assessment moved to a quantitative approach, with computerised scoring systems and data analysis favoured over human interaction.
Greater reliance was now placed on gathering past credit history data, paired with a few bank-held data points, to arrive at a credit conclusion. This in turn paved the way for the current era of credit bureau modelling and the advent of the major credit reference agencies we know today: Experian, Equifax and TransUnion.
In 1989, the United States introduced the FICO score to standardise credit scoring across a range of factors. This scoring method remains in use today, with 90 of the 100 largest US lenders still using it to assess credit risk.
The credit score today
With payment history, utilisation ratio and the total amount owed all assessed, credit scores are not just relevant for purchasing goods: they impact a whole range of important factors in our lives, from the ability to get a mortgage, to the price of utility bills, to our chances of employment. An excellent score opens up access to the lowest possible interest rates, while a sliding scale of ‘good’ to ‘fair’ to ‘poor’ moves us towards only accessing credit on secured loans, or in some cases, not being eligible at all.
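As a rough illustration, the utilisation ratio mentioned above is simply the share of available credit currently in use across your accounts. The sketch below is a minimal, hypothetical calculation for illustration only; the function name and example figures are our own assumptions, not any bureau’s actual method:

```python
def utilisation_ratio(balances, limits):
    """Share of total available credit currently in use (0.0 to 1.0+)."""
    total_balance = sum(balances)
    total_limit = sum(limits)
    if total_limit == 0:
        raise ValueError("no available credit to measure against")
    return total_balance / total_limit

# Two cards: £400 owed against a £2,000 limit, £100 owed against a £3,000 limit.
ratio = utilisation_ratio([400, 100], [2000, 3000])
print(f"{ratio:.0%}")  # 500 / 5,000 of available credit is in use
```

Keeping this ratio low is commonly cited as favourable for a credit score, since it suggests a borrower is not stretched to the edge of their available credit.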
What’s more, the historical nature of credit score data limits a lender’s ability to make the credit decisions our modern economy needs: the self-employed, the young and those who move around for travel, work or education are just some of those affected. Multiple income streams and the rise of the gig economy mean more consumers face exclusion and risk becoming ‘credit invisible’, as lenders struggle to score them appropriately using existing methods.
‘They [credit scores] can transform an economy — but they are only as robust as the data underlying them…’
The Economist, 6 Jul 2019
Don’t stop here — chapter two of our mini-series delves into the gaps incomplete data leaves behind in credit scoring and how that impacts all of us.
This series from Aire was taken from our complete guide to first-party data. To find out more, download your copy via our website.