Social Credit Scores: Detailed Analysis

Jordan Issaakidis
6 min read · Sep 5, 2019

Trust Me If You Can: Black Mirror for Banks.

Go Back to the Story.

We used our Responsible Banking Framework to perform a high-level analysis of this scenario.

A sample of our analysis is below.

Our analysis

Figure 1: Responsible Banking Analysis Framework, Trust Me If You Can (2019)

Opinion

Data from different devices can paint a complete and intimate picture of people’s activities, behaviors, hidden interests and routines. The surveillance mechanism that powers the new era of ‘big data’ for deep learning and AI has, to some extent, taken away a user’s privacy without their explicit consent.

This happens every time we are digitally active: when we search for restaurant locations, book movie tickets online, shop online, or use "free" applications of any type.

By understanding the impact of AI and data governance, we can begin to understand our wider human rights in the digital era, and hopefully reinstate them.

Social media shouldn’t be used to fully automate credit decision-making. It could be used to augment human decision-making, so long as customers are made to feel in control: explicit consent must be gained; customers have the right to see how their credit score is calculated; and customers are given the right to ‘contest’ automated decisions and have them reviewed by a human. Investment in a PR campaign, security and data access controls would be money well spent.

Opportunities

Figure 2: Opportunities at Stake, Responsible Data Framework, Trust Me If You Can (2019)
  • Margin: Polidor charges a risk premium for riskier loan applicants, thereby protecting overall margins.
  • New or Increased Revenue: Polidor will enter into new markets starting in South East Asia, then India, then Latin America and Africa.
  • Conversion Rate: By understanding risk better than other banks, Polidor is able to make more attractive offers to lower risk customers, thereby increasing conversion rates.
  • Social good: Polidor argues that they are making it fairer for everyone: “why should people with sound financial attributes subsidise those that are less frugal?” Further, Polidor believes it is opening up the financial system to those that have never had access to it before.
  • Marketing Opportunities: Polidor is able to market this as a “feel good” exercise to generate publicity.
  • Alignment with Stakeholder Values: Polidor is aligning itself with supranationals like the UN, sophisticated and benevolent investors, and possibly able to leverage those relationships (e.g. to identify new customer bases through existing databases held by charity organisations).

Technology Trends and Data Used

Potential Risks or Harm

  • Price discrimination: Is it OK for banks to offer different rates to different people based on their lifestyle choices?
  • Loss of anonymity: Do people have a right to “be themselves” on social media without worrying about the financial consequences?
  • Social equity: Polidor argues that they are making it fairer for everyone: “why should people with sound financial attributes subsidise those that are less frugal?” Additionally, Polidor says they are aligning their corporate objectives to the UN SDGs, and therefore they are benevolent in nature.
  • Vulnerability to evil: What if the data fell into the wrong hands? For example, could malicious agents use it to threaten customers with ransom? What happens if a location tracker can be hacked, and those wearing it are tracked?
  • Data Theft: Doesn’t social media belong to me? What right do banks have to use this data?
  • Identity Theft: What happens if someone creates a social media profile with my details and accrues debt by fooling Polidor’s AI?
  • Right to change: If my social media history contains things that I have done in the past but no longer do in the present, is it OK for Polidor to continue to judge me on past behavior?
  • Inauthenticity: Are Polidor’s stakeholders convinced that Polidor is truly banking the unbanked of the world for benevolent reasons, or is it possible Polidor will be seen as fake and inauthentic — just another company trying to maximise profits? This could undermine Polidor’s brand equity.
  • Regulatory risk: Polidor is exposing itself to new regulatory risks as a result of entering new markets. How will it identify these rules and monitor its compliance with them?

Strategies, Principles and Controls

  • Consent: Polidor can minimise ill-effects by first seeking consent from its customers. But what if customers don’t give Polidor consent? Should these customers be disqualified from applying for a loan? Is consent true consent if it is given in dire circumstances, where customers do not have any other options?
  • Explainability: Under the GDPR, customers have the right to receive an explanation of automated credit decisions. How would Polidor achieve this without giving away commercially sensitive information?
  • Accuracy: This is perhaps the greatest potential flaw in the product. Social media data is patchy — some users are prolific, whilst others are non-existent — and it can be easily manipulated. Two strategies to consider: (1) use Social Credit Scoring data as an input to augment human decision-making (rather than full automation), and (2) train the AI to recognise and report flaws in the data (e.g. a low confidence rating if the data is considered unreliable).
  • Process and Governance: Polidor should follow a concrete governance and legal process to ensure it stays compliant with all applicable regulations. Entering new markets is not easy for regulated financial institutions, and doing it incorrectly may expose Polidor to significant penalties.
  • Data ownership: What if Polidor were to give customers the right to view their profile / Social Credit Score? Would this give them a greater sense of control?
  • Power & privilege: What if customers were given the right to “contest” decisions and request a human review?
  • Benefits sharing: What if Polidor offset the potential harms by doing some social good, such as helping people with financial literacy or allocating some of the gains to offer microloans to the less advantaged?
  • Social norms: Should Polidor have run focus groups to test the social acceptability of Social Credit Scoring? And should Polidor take the path of gaining support from regulators and other influencers or should Polidor disrupt the market like Uber did?
  • Storage limitation: To overcome security concerns (i.e. data falling into the wrong hands), Polidor’s AI could generate the Social Credit Score on an as-needed basis and not store this data (i.e. it vanishes) after the credit decision has been made. Or, at a minimum, the data could be stored in encrypted form.
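The “low confidence rating” idea from the Accuracy strategy above could be prototyped as a simple data-quality score. The following is an illustrative sketch only — the feature names, weights and thresholds are assumptions, not part of any real scoring model:

```python
def social_data_confidence(posts_per_month: float,
                           account_age_months: int,
                           verified_identity: bool) -> float:
    """Hypothetical confidence rating (0.0-1.0) for social-media signal quality.
    Sparse or young accounts score low and should be flagged for human review."""
    score = 0.0
    score += min(posts_per_month / 10.0, 1.0) * 0.4    # activity depth
    score += min(account_age_months / 24.0, 1.0) * 0.4  # account history
    score += 0.2 if verified_identity else 0.0          # manipulation resistance
    return round(score, 2)

# A prolific, two-year-old, verified account yields full confidence...
print(social_data_confidence(15, 30, True))   # -> 1.0
# ...while a fresh, quiet, unverified account yields a low rating,
# so the application would be routed to conventional human review.
print(social_data_confidence(1, 3, False))    # -> 0.09
```

In practice the rating would come from the model itself (e.g. prediction uncertainty), but even a crude heuristic like this makes the “augment, don’t automate” strategy operational: below some threshold, the social signal is simply ignored.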

End of the sample analysis.

Go Back to the Story.
