Cambridge Analytica and the Psychology of Design
How one company has exposed the dark arts of user research and design.
The recent news about Cambridge Analytica and its mining of Facebook users’ data has shed new light on the dark side of user research and design. Beyond the continued threat of illegal third-party data theft lies the newer threat of data misuse. At one point, tech companies didn’t really know what to do with the user data they held. Now we’re seeing the other side of the story: companies know exactly what to do with their big data, and they are using it both to our pleasant surprise and to our ultimate demise. On the benevolent side, Google uses its data to give you a better search experience, and Amazon uses it to coordinate more efficient deliveries. On the darker side is the use of data and design to lure users into believing, and rallying behind, false statements. Through user research and the basic concepts of design and psychology, these companies create digital experiences that push you to take an action, like buying a new product or attending an event. Because they already know so much about your digital profile, their strategic approach is incredibly hard to resist.
Here lies the dark side of user research and design. These big tech companies now know us so well that they have us in their pockets. What we’ve seen from Cambridge Analytica is as much a wake-up call to every internet user as it is to Facebook itself to stop this from ever happening again. What Analytica has done is not much different from what these other companies are doing; it has simply used the information for malicious purposes. Some might even argue that the targeted ads we receive on Facebook itself are a breach of privacy, preying upon uninformed users. Analytica combines its swaths of data about human behavior on the internet with design and psychology expertise to distribute content that plays into the hopes and fears of a targeted population.
The way Analytica uses data to play into our fears is similar to the way Facebook uses data to serve ads on your news feed for the hot new shoes you’ve been eyeing, after monitoring your internet traffic and conversations. The way Analytica uses psychology to direct our attention is not far from the way Amazon uses psychology to get you to buy a new rice cooker by showing that only four items are left in stock. Sometimes the line is thin between the psychological civil engineering that gets you to buy something and the psychological warfare that convinces you to become a white nationalist. In this particular case, the psychological nature of Analytica’s tactics displays a complete disregard for morally principled research and ethical design.
After collecting demographic, psychological, and behavioral data, ensnaring the targeted users’ emotions is easy, and the likelihood of success greatly increases. Adding to the effectiveness of these coercive advertising campaigns is the ability of an organization like Analytica to measure each attempt through data analytics, finding out how engaged users are with the content and how willing they are to share it. Through strategic messaging that plays to our emotions and subconscious thoughts, Analytica has tried to manipulate our entire world perspective and internal philosophy. The scariest part of all this is that much of what Analytica has been doing is legal. Creating propaganda messages and photographs is not against the law unless the content is personally defamatory or, in rare cases, incites violence.
The First Amendment, which protects Americans’ free speech rights, means we can exchange ideas, even false ideas, freely. The fear is that censoring fake news might lead to similar censoring of other news that people simply do not agree with. Resources like PolitiFact.com can help identify fake news, but the responsibility rests largely with the reader and with our government, which is doing little to help in this regard. The United States legal system is far behind the curve in preventing our data from being used for nefarious purposes. Perhaps it should be illegal for companies to obtain and use certain information beyond our health records to target consumers. Maybe access to our purchasing behavior could be blocked by default, letting us opt in instead of opt out. It’s clear there is a lot of work to be done not only to secure our data, but also to ensure it is not being used maliciously.
One of the malicious acts brought to light through The Guardian’s story on Analytica is the psychographic profiling of users, made possible through the illicit acquisition of Facebook users’ data (the legality of that acquisition is another story). Using large data sets, Analytica could define how a person acts within their digital ecosystem, by what they click on or share with others, to figure out a more strategic approach for targeting that specific user. With data, they can determine how likely an anxious conspiracy theorist is to believe a fake and controversial story, and which stories have the greatest effect on changing their behavior. The company’s main tagline is “Cambridge Analytica uses data to change audience behavior.” I don’t know about you, but that scares the sh** out of me.
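To make the idea of profiling from clicks and shares concrete, here is a deliberately simplified toy sketch, entirely my own illustration and not Cambridge Analytica’s actual method: it tallies a user’s interactions per content topic, weighting shares more heavily than clicks (the weights are assumptions), to surface the topic a user is most likely to engage with.

```python
# Toy illustration of behavior-based profiling (NOT any company's real method):
# aggregate a user's clicks and shares into a crude per-topic engagement score.

from collections import Counter

def build_profile(events):
    """Score topics from (topic, action) events; shares count more than clicks."""
    weights = {"click": 1, "share": 3}  # illustrative weights, purely assumed
    profile = Counter()
    for topic, action in events:
        profile[topic] += weights.get(action, 0)
    return profile

events = [
    ("immigration", "click"),
    ("immigration", "share"),
    ("economy", "click"),
]
profile = build_profile(events)

# The highest-scoring topic suggests what content this user
# would be most receptive to — the essence of targeted messaging.
top_topic = profile.most_common(1)[0][0]
print(top_topic)  # prints "immigration"
```

Even a sketch this crude shows why scale matters: with millions of users and thousands of content categories, the same tallying turns into a detailed behavioral map.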
The psychology of design is nothing new. Human factors engineers study the art of design and human psychology to create more intuitive and useful products. Using physiological response rates, movement dynamics, and other measures, these scientists can determine how to optimize functionality. Designers can use tactics such as motion design, or adjusting an object’s color and size, to divert our attention to the appropriate place on the screen. Blending the concepts of design and psychology in an optimal way is how we create products that bring us delight. Unfortunately for the consumer, these tactics are not always meant to make us more productive and happy. Sometimes they are used to show photos that stoke our fear of immigration or to falsely convince us that a politician has broken a law. The trend toward the dark side of user experience is growing rapidly alongside an opposite trend toward ethical design. We can only hope that good will prevail, but the mounting evidence suggests otherwise.
We need more whistleblowers like ex-Cambridge Analytica contractor Chris Wylie to break the silence on these practices, and more news agencies like The Guardian to cover them. We need more ethical designers, researchers, and engineers to protest against the dark arts and to help create design guidelines with a moral perspective. We need psychologists and behavioral scientists to define the mental and physical human limitations those moral guidelines should respect. At the top of this chain, we need our government and politicians to create laws that protect our data and prevent the misuse of our personal information. Our data should be used to unite our nation, not divide it. Our data should be used to help solve problems, not create new ones.