Facebook and Cambridge Analytica — “Know Your Customer”, A Higher Standard
Know Your Customer (KYC) is a required practice in finance: the process by which a business identifies and verifies the identity of its clients. The term also refers to the banking and anti-money laundering regulations that govern these activities. Many of you will not be familiar with this area of law. It exists primarily in the financial industry and is a cousin to laws such as Anti-Money Laundering (AML) rules, the Patriot Act of 2001, and the USA Freedom Act of 2015. These laws were designed to require companies to examine who their clients are. Are they involved in illegal activities? Do they finance terrorism? What is the source of their money? The idea was to prevent our financial industry from supporting, or furthering the ability of, wrongdoers to cause harm. So how does this apply to Facebook and the Cambridge Analytica issue?
I am suggesting that the data industry, which includes any company that sells or provides access to individual information, should be held to this same standard. Facebook should have to Know Your Customer. Google should have to Know Your Customer. Doesn’t this seem reasonable? The nice part about this proposal is that it isn’t new. We don’t have to draft brand-new laws to cover it, just modify some existing language. KYC already exists for banks; now let’s expand it to social media, search engines, and the sellers of big data.
Everywhere in the news today, we have questions about who is buying ads on social media. Was it Russians trying to influence an election? Was it neo-Nazis, ANTIFA, or other radical ideologues? Is it a purveyor of “fake news”? If social media outlets were required to KYC their potential clients, they would be able to weed out many of these organizations well before their propaganda reached the eyes of their subscribers. Facebook has already stated that it wants to prevent groups such as these from influencing its users via its platform. So it is highly reasonable to ask them to do it, or be penalized for failure to do so. Accountability is a powerful thing. Accountability means that it actually gets done.
Speaking of “getting it done,” some of you may have seen Facebook’s complete about-face on its compliance with GDPR, moving 1.5 billion users out of Irish jurisdiction and into California, where there are very limited legal restrictions. https://arstechnica.com/tech-policy/2018/04/facebook-removes-1-5-billion-users-from-protection-of-eu-privacy-law/
If you aren’t familiar with GDPR, it is Europe’s powerful new privacy law. For months, Facebook publicly stated how it intended to comply with the law. But when push came to shove, its most recent move was to avoid the law, and compliance, as much as possible. Flowery language is something we often hear from corporate executives on these matters, but in the end, they still serve shareholders and profits first and foremost. So unless these companies are forced to comply, don’t expect them to do it out of moral conviction; that’s rarely how companies operate.
Returning to the practical application of KYC: for a financial firm, it means that a salesperson must have a reasonable relationship with their client in order to ensure compliance with KYC. They need to know the client personally and be familiar with the source and usage of funds. If a financial firm fails to execute KYC and the organization it is doing business with is later charged with a crime, the firm and the individuals involved face swift ramifications, including substantial fines and potential jail time. The same should apply to social media and the data industry.
Let me give you a nasty example. Have you looked at the amazing detail Facebook or Google have compiled about you? It is fairly terrifying, and some out there (Gartner, for example) have even predicted that your devices will know you better than your family knows you by 2022.
https://www.gartner.com/smarterwithgartner/emotion-ai-will-personalize-interactions/
Now, assuming this is even close to true for many of us, imagine that this information is sold to a PhD candidate at MIT or another reputable AI program, except that the PhD student, beyond doing his AI research, is funneling the data on to hackers on the dark web, or worse, to a nation-state engaged in cyberwarfare. How easy would it be for that group to cripple a large portion of the country? Or maybe it has already happened, with examples like Equifax and its breach of 143 million clients. Can you be sure that the world’s largest hacks aren’t getting their start by accessing your data through a data reseller?
To be fair, in finance you are often taking in funds and overseeing the activity after the fact; you know what is going on. With data, you are often selling access, or the data itself, to the customer and, it might seem, no longer have control over their activities. But this argument only strengthens my interest in Know Your Customer, because these firms may have little idea how that data is being used or misused. Better to slow down the gravy train than to ride it into oblivion.
Obviously the details would need to be drafted and hammered out by Congress, but I am seeking support for the broader concept and encouraging supporters to add it to the legislative agenda. ForHumanity has a fairly comprehensive set of legislative proposals at this point, which we hope will be considered in the broad context of AI policy. Questions, thoughts, and comments are always welcome. This field of AI safety remains so new that we really should take a crowd-sourced approach to identifying best practices. We welcome you to join us in the process.