In Defense of China’s Social Credit System

Asinus
6 min read · Oct 19, 2019


From the South China Morning Post:

Foreign companies that already encounter difficulties in doing business in China are about to face an even starker reality as Beijing steps up plans for a corporate rating system.

In an ambitious undertaking, the Chinese government is building a so-called social credit system that aims to collect and analyse information on its 1.4 billion citizens and rate millions of corporations both domestic and foreign.

Its goal is to keep local governments, businesses and people in compliance with national directives. For international businesses, the programme will look at a host of data including business contracts, social responsibility, regulatory compliance and how many Communist Party members they employ.

Through a centralised platform using artificial intelligence, the system will rate firms for “credibility” or “sincerity”. Blacklisted companies could face punishments that include being denied access to cheap loans, higher import and export taxes and key personnel being prohibited from leaving China.

Some observations:

I.

Despite the name, the Post is actually headquartered in Hong Kong, which explains the article’s animus towards mainland China’s national security policies.

II.

Though it hasn’t been formally launched, the system has already been used to force international firms to adopt Chinese values on politically sensitive issues.

‘Chinese values’, as opposed to ‘American values’, presumably. Or ‘Hong Kong values’.

Here’s a hypothesis: the phrase ‘Chinese Social Credit System’ is a cause for concern in Western consciousness mostly because of the ‘Chinese’ part, and not mostly because of the ‘Social Credit System’ part. China doesn’t exactly have the world’s greatest track record with regard to human rights, especially lately: cracking down on Hong Kong protesters, allegations of Uyghur internment camps in Xinjiang, etc.

Test that theory: does the phrase ‘Canadian Social Credit System’ evoke similar feelings of revulsion?

III.

We already have a social credit system in the USA: credit scores.

No, credit scores don’t directly gauge interpersonal actions, but financial misconduct, reported to the credit bureaus, is a fairly adequate proxy for social misbehavior. Have you been divorced twice in the past six years? Your credit card’s overdue five-figure balance stems from a transaction history full of lawyer’s bills. Reported. Do you abuse drugs? Your rent has been paid on time once in the past eighteen months. Reported.

And the penalties of a low credit score extend to your social life: a low credit score might bar you from getting a mortgage on a house in a nicer neighborhood, or keep you from a comfy job in the financial sector, or make your car insurance payments go through the roof.

Set aside the fact of China’s abysmal human-rights record; pretend this ‘social credit system’ was thought up by a 31-year-old multi-multi-millionaire software engineer from Silicon Valley. Is this idea starting to sound like less of a bad thing?

IV.

Punishments could include being denied access to cheap loans, higher import and export taxes and key personnel being prohibited from leaving China.

Three “yeah, but…”s:

First - the phrase ‘could include’ is classic journalistic weasel wording.

Second - we currently use those sorts of punishments in the U.S. to fight financial misconduct. Did you recently go bankrupt? Say goodbye to single-digit auto loan rates. Are you behind on your taxes? No passport for you.

Third - here’s the flip-side to the social credit system punishments: if you aren’t engaging in social misbehavior, your social life actually gets better.

Pretend you’re a personal banker living in the 1950s, and you’ve got five potential mortgage loan applicants to screen. You interview all five, they all seem like decent enough people, so you approve them all for mortgages at 10% APR. A few years down the line, one of your customers goes bankrupt; the payments stop coming through, you take the customer to court and bankroll your lawyer with the money from your other four customers.

Fast forward to the present and imagine the same scenario. Five people apply for a mortgage. One is immediately rejected by an algorithm due to a history of late payments and a three-year-old bankruptcy. The other four are approved. You help the four reliable customers fill out the paperwork. All four reliable customers pay their loans on time. The rejected applicant begins attending Dave Ramsey seminars.
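The algorithmic screen in that second scenario can be sketched as a simple rule-based filter. The cutoffs and field names below are invented purely for illustration; no real underwriting model is this simple:

```python
# Toy version of the algorithmic screen in the scenario above.
def approve_mortgage(years_since_bankruptcy, late_payments_24mo):
    """Reject recent bankruptcies and chronic late payers; approve the rest."""
    if years_since_bankruptcy is not None and years_since_bankruptcy < 7:
        return False  # bankruptcy too recent
    if late_payments_24mo > 2:
        return False  # too many late payments
    return True

applicants = [
    ("A", None, 0), ("B", None, 1), ("C", None, 2), ("D", None, 0),
    ("E", 3, 6),  # three-year-old bankruptcy plus a history of late payments
]
approved = [name for name, bk, late in applicants if approve_mortgage(bk, late)]
print(approved)  # the four reliable applicants; "E" is screened out
```

The point of the sketch: the decision is mechanical and happens before any human interview, which is exactly what separates the modern scenario from the 1950s one.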

Look at this graph of historical fixed mortgage rates from 1971 to 2016:

[Graph: historical fixed mortgage rates, 1971–2016]

The FICO score was introduced in 1989 — right before the fourth blue line in the chart. Ignoring the inflation-fueled double-digit rates of the 1980s, the chart shows a pretty clear downward trend from 1990 onwards. As the algorithms get better at weeding out high-risk credit applicants, the overall interest rates for the compliant majority go down.

And it’s not just mortgages. An underrated aspect of Uber’s business model is the rating system: you can rate your driver, your driver has to rate you. Did you throw up in your 2:30AM ride back from the bar? One star. Did you make a joke that made your female passenger feel uncomfortable? One star. Get enough low ratings, and you’re banned from Uber. One reason why Uber is cheaper than taxis: their social credit system.
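Uber’s actual deactivation criteria aren’t public in this form, but the mechanism described above can be sketched as a rolling-average cutoff. The window size and threshold here are assumptions, not Uber’s real numbers:

```python
from collections import deque

# Hypothetical ban rule: suspend a rider once the average of their last
# 20 ratings falls below a cutoff. Window and threshold are invented.
RATING_WINDOW = 20
BAN_THRESHOLD = 3.0

def is_banned(ratings):
    recent = deque(ratings, maxlen=RATING_WINDOW)  # keep only recent rides
    if not recent:
        return False
    return sum(recent) / len(recent) < BAN_THRESHOLD

print(is_banned([5, 5, 4, 5]))     # polite rider: False
print(is_banned([1, 1, 2, 1, 1]))  # serial one-star rider: True
```

A rolling window matters to the design: old misbehavior ages out, so the system punishes patterns rather than single bad nights.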

In any given community, ten percent of the people cause ninety percent of the problems; identify and quarantine the ten percent, and your community gets a whole lot better.

V.

“But Silicon Valley will never be able to take the social data of millions of Americans!” No, Silicon Valley won’t take it — Americans will give it up, freely.

Imagine the future: for the benefit of their new dating service, Facebook announces a ‘Certified Kind’ profile badge. To get the badge, you allow Facebook’s AI to read through your posts, likes, comments and messages and look for problematic content. Did you call a girl names after she rejected your advances? Do you use racial slurs on a daily basis despite having the ancestral background of a Scottish peasant? No badge for you.

The badge, of course, uses data only on a pseudonymous basis: the test is passed or failed, and the badge is displayed or it isn’t. If the test is failed, no one — save the unlucky profile owner — is the wiser. (Side note: Americans are happy to hand over their personal data as long as that data isn’t being read by an actual human being. Celebrities have a few private pictures leaked and controversies ensue, but Paul Nakasone and Mark Zuckerberg have servers with enough data to describe your personality and manipulate your actions better than your mother-in-law can, and no one cares.)
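A minimal sketch of that pass/fail design, with a placeholder term list standing in for the hypothetical classifier:

```python
# Sketch of the pass/fail badge check described above. The verdict is
# reduced to a single boolean: failing profiles simply show no badge,
# and no score or reason ever leaves the system. The term list and
# threshold are placeholders for a real content model.
FLAGGED_TERMS = {"flagged_term_a", "flagged_term_b"}
MAX_VIOLATIONS = 0

def certified_kind(posts):
    violations = sum(
        1 for post in posts
        if any(term in post.lower() for term in FLAGGED_TERMS)
    )
    return violations <= MAX_VIOLATIONS  # True: badge shown; False: nothing shown
```

Collapsing the output to one bit is what makes the scheme ‘pseudonymous’ in the sense described: the underlying posts and the violation count never surface.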

At first, the badge is just a novelty — equal parts virtue-signalling and rudimentary Myers-Briggs-style personality analysis. And then, it starts catching on.

Facebook opens up their API to developers — in addition to Facebook’s badge, developers can deploy their own badges. Interested in running for local office? Political parties create their own badges for quantifying and advertising loyalty to their national agendas. Feeling called to the seminary? Vocations directors use the API to screen applicants for indicators of deviant behavior and heretical ideologies. Other websites begin to deploy similar badges; LinkedIn offers a proficiency badge in professional behavior, and Match Group develops an optionally-publishable rank that displays what sort of intimate partner you’ve been across all their platforms. HR professionals surreptitiously snoop for and record the badge status of job applicants. Tinder profiles admonish their badge-less readers to swipe left.

Soon, a rudimentary score starts taking shape: a pile of social-media, text-message and otherwise-obtained interpersonal data, pseudonymously aggregated by algorithms into a single score, decried by left-libertarian Congressional wannabes and apocalyptic baby-boomer preachers alike as a sign of the end times and the coming of the New World Order. Scores range from 400–900; good people have scores above 700, great people have scores above 800. Below 550? Good luck getting invited to the office’s 6PM happy hour — that is, if you can get hired on in the first place.
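A 400-to-900 scale like the one imagined above could be produced FICO-style, by linearly rescaling a model’s 0-to-1 estimate of good behavior. The formula is invented solely to match the thresholds in the paragraph:

```python
def social_score(p_good):
    """Map a probability of good behavior (0..1) onto a 400-900 scale."""
    p_good = min(max(p_good, 0.0), 1.0)  # clamp out-of-range inputs
    return round(400 + 500 * p_good)

print(social_score(0.6))  # 700, the 'good people' line
print(social_score(0.8))  # 800, the 'great people' line
print(social_score(0.3))  # 550, the happy-hour cutoff
```

The arbitrary-looking range is part of the analogy: FICO scores run 300–850 for no deeper reason than convention, and a social score’s scale would be just as arbitrary.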

Of course, you can always choose not to participate. Don’t allow anyone access to your social data, don’t give away any information that isn’t legally required to be provided, don’t worry about your social credit score. “People got along fine before social credit scores came along — I’ll get along fine nowadays without one!” True. There are people nowadays who don’t have FICO scores, either…
