Persuasive Machines: Weapons of Mass Disinformation

Carlos E. Perez
Published in Intuition Machine
Nov 16, 2016


Credit: https://www.flickr.com/photos/jayt74/

On July 28th, 2015 a group of AI scientists published an open letter concerning the creation of weaponized AI.

They write the following:

The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.

This led to further research:

Unfortunately, by November 2016 a different kind of weaponized AI had arrived with devastating effects:

This context is important for understanding our social channels. They are designed to let us insulate ourselves from the people and opinions we would prefer not to see.

source: https://medium.com/@tobiasrose/empathy-to-democracy-b7f04ab57eee#.k82dvd4uq

Facebook’s role in providing Americans with political news has never been stronger — or more controversial. Scholars worry that the social network can create “echo chambers,” where users see posts only from like-minded friends and media sources.
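The echo-chamber mechanism can be illustrated with a minimal sketch of an engagement-driven feed ranker. The data format, the `rank_feed` function, and the affinity scoring rule below are hypothetical illustrations, not Facebook's actual algorithm:

```python
from collections import Counter

def rank_feed(posts, click_history, k=3):
    """Toy engagement ranker: score each post by how often the user
    has previously clicked posts sharing its viewpoint, then surface
    only the top k. Posts are hypothetical (post_id, viewpoint) tuples."""
    affinity = Counter(view for _, view in click_history)
    scored = sorted(posts, key=lambda p: affinity[p[1]], reverse=True)
    return scored[:k]

# A user who has mostly clicked one viewpoint in the past...
history = [("a", "left"), ("b", "left"), ("c", "right")]
posts = [("p1", "left"), ("p2", "right"), ("p3", "left"),
         ("p4", "center"), ("p5", "left")]

feed = rank_feed(posts, history)
print(feed)  # the like-minded posts crowd out everything else
```

Even this three-line scoring rule reproduces the bubble: the user's past clicks determine what they see next, which determines their future clicks, and dissenting viewpoints quietly drop below the fold.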

These “filter bubbles” or “echo chambers” are primed staging areas for further exploitation. Persuasive Machines are AI-driven automation that leverages knowledge of our cognitive biases to hack into our thought processes, leading to persuasion (or, alternatively, mind control). An example of this exploit is the rampant rise of fake news and events on Facebook. Mike Caulfield writes in “Despite Zuckerberg’s Protests, Fake News Does Better on Facebook Than Real News. Here’s Data to Prove It.”:

source: https://hapgood.us/2016/11/13/fake-news-does-better-on-facebook-than-real-news

Fake news is orders of magnitude more popular than real news. Nor is fake news the only problem; security breaches are another. Maria Korolov writes:

“It’s a combination of a lot of things that we’ve seen for a lot of years coming together,” said Ric Messier, head of the cybersecurity program at Burlington, Vt.,-based Champlain College. “The fact that it’s so easy to do this leaking and be able to manipulate people in this way certainly suggests that we’re probably just starting to see the beginning of these sorts of activities or attacks.”

Automation in the form of Twitter bots has amplified messaging to bring about a ‘bandwagon effect’ that influences the masses:

Analyzing Twitter during three televised debates, they discovered that 20% of all political tweets were made by bots. … These bots, they wrote, can make online conversations more polarized. They make it easier to spread factually incorrect news stories. And they are easy to make: Nearly anyone “could obtain the operational capabilities and technical tools to deploy armies of social bots and affect the directions of online political conversation.”
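The bandwagon dynamic the researchers describe can be sketched with a toy simulation. The population sizes, the sampling rule, and the 20% bot share below are illustrative assumptions for the sketch, not the researchers' actual model:

```python
import random

random.seed(42)

def simulate(n_humans=800, n_bots=200, rounds=20, sample=10):
    """Toy bandwagon model: bots all push opinion 1, while each human
    samples a few posts from the shared feed and adopts whichever
    opinion the majority of the sample holds (ties keep the old one)."""
    humans = [random.choice([0, 1]) for _ in range(n_humans)]
    bots = [1] * n_bots
    for _ in range(rounds):
        feed = humans + bots
        updated = []
        for h in humans:
            ones = sum(random.sample(feed, sample))
            if ones * 2 > sample:
                updated.append(1)       # majority of sample says 1
            elif ones * 2 < sample:
                updated.append(0)       # majority of sample says 0
            else:
                updated.append(h)       # tie: keep current opinion
        humans = updated
    return sum(humans) / n_humans       # fraction of humans holding opinion 1

print(f"with 20% bots: {simulate():.2f}")   # herd converges toward the bots' message
print(f"organic only:  {simulate(n_bots=0):.2f}")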

These automatons, in combination with massive hacking and phishing attacks by a nation-state, were part of a coordinated effort to undermine the decision-making of the population. Esquire magazine has an even more detailed account of this in “How Russia Pulled Off the Biggest Election Hack in U.S. History”.

Jonathan Albright has an even more detailed analysis of what he calls “Micro Propaganda Machines”:

source: https://medium.com/@d1gi/the-election2016-micro-propaganda-machine-383449cc1fba#.m5eco2oay

He writes:

There’s a vast network of dubious “news” sites. Most are simple in design, and many appear to be made from the same web templates. These sites have created an ecosystem of real-time propaganda: they include “viral” hoax engines that can instantly shape public opinion through mass “reaction” to serious political topics and news events. This network is triggered on-demand to spread false, hyper-biased, and politically-loaded information.

Scott Adams (of Dilbert fame) understands the power of persuasion quite well:

Adams, like Trump, recognized that the election would play out at the limbic level of primal furies and genital anxieties.

Dylan Love writes about possible solutions to the AI arms race in “The Next Global Arms Race Aims to Perfect AI”:

  1. Make AI development illegal, but history suggests that prohibition doesn’t work.
  2. Win the AI race.
  3. Assemble an international AI consortium, with many nations pooling their resources.
  4. Every nation unites under one flag and one leadership.

Not a very promising list of options.

More Reading:

https://www.wired.com/2016/09/inside-googles-internet-justice-league-ai-powered-war-trolls Inside Google’s Internet Justice League and its AI Powered War on Trolls

http://kioski.yle.fi/omat/at-the-origins-of-russian-propaganda Yle Kioski Traces the Origins of Russian Social Media Propaganda — Never-before-seen Material from the Troll Factory

http://behavioralscientist.org/scaling-nudges-machine-learning/?lipi=urn%3Ali%3Apage%3Ad_flagship3_feed%3BH1Ko8u8ZSO6ystU7DysQzQ%3D%3D

http://www.niemanlab.org/2016/11/the-forces-that-drove-this-elections-media-failure-are-likely-to-get-worse/

https://arxiv.org/abs/1603.07954 Improving Information Extraction by Acquiring External Evidence with Reinforcement Learning

http://hazyresearch.github.io/snorkel/blog/slimfast.html SLiMFast:
Assessing the Reliability of Data

https://www.washingtonpost.com/amphtml/news/theworldpost/wp/2017/10/09/pierre-omidyar-6-ways-social-media-has-become-a-direct-threat-to-democracy/

For more on this, read “The Deep Learning Playbook”.
