The Consciously Unconscious

After watching The Social Dilemma on Netflix, I fell right into what Z.M.L. identifies as one of tech companies’ main goals: engagement (Z.M.L. 1). I then watched the next “More Like This” recommendation, The Great Hack.

This documentary explores the Facebook–Cambridge Analytica scandal surrounding the 2016 United States presidential election. Cambridge Analytica obtained the Facebook data profiles of millions of users without their consent and applied that data to target individuals in hopes of influencing their political beliefs.

In the information age we are all living in, Silicon Valley’s reputation has recently shifted from a positive light to a negative one. Most people are now aware that social media platforms collect and harvest user data for profit, both financial and social, and that this is typically visible to the user in the form of targeted advertisements. However, I was surprised to learn that Cambridge Analytica took a different approach with the data they had acquired.

This tech company targeted users’ pathos, appealing to emotion rather than reason. They used the data to score people on the traits of the Big Five personality model, commonly known as the OCEAN model. The model rates each person on five characteristics: openness, conscientiousness, extroversion, agreeableness, and neuroticism.

https://nigelchetty.com/new-blog/2019/8/11/the-ocean-model-of-personality

After giving each individual a rating, Cambridge Analytica created an algorithm to group similar people together and serve them specific videos and articles designed to resonate with that user’s feelings and beliefs. For example, Cambridge Analytica’s former CEO Alexander Nix stated that “a more agreeable person may respond better to an ad that emphasizes family values” (Resnick 1). This practice has been given the name “psychographic micro-targeting.”
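To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python of psychographic micro-targeting at its simplest: score a user on the five OCEAN traits, find the dominant trait, and serve the message theme mapped to it. The trait-to-theme mapping and the example user are hypothetical, loosely inspired by Nix’s family-values example; the real models were far more sophisticated.

```python
# Hypothetical mapping of a user's dominant Big Five (OCEAN) trait
# to an ad theme. Purely illustrative, not Cambridge Analytica's actual model.
AD_THEMES = {
    "openness": "novelty and new experiences",
    "conscientiousness": "order, tradition, and responsibility",
    "extroversion": "social proof and popularity",
    "agreeableness": "family values and community",
    "neuroticism": "fear and security",
}

def pick_ad_theme(ocean_scores: dict) -> str:
    """Return the ad theme mapped to the user's highest-scoring OCEAN trait."""
    dominant_trait = max(ocean_scores, key=ocean_scores.get)
    return AD_THEMES[dominant_trait]

# Example: a hypothetical user who scores highest on agreeableness.
user = {
    "openness": 0.42,
    "conscientiousness": 0.55,
    "extroversion": 0.31,
    "agreeableness": 0.81,
    "neuroticism": 0.48,
}
print(pick_ad_theme(user))  # -> "family values and community"
```

Even this toy version shows the key shift: the content a person sees is chosen not by what they searched for, but by who the model thinks they are.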

In the Moral Machine experiment, whoever the government, the technology company, and society agree should be sacrificed is who the car will ultimately kill (Hao 1). The self-driving car has no consciousness with which to make the decision; it is programmed by humans to choose who will live and who will die. One cannot blame the car when it was simply programmed to make that unconscious decision.

Although placing users into homogeneous groups based on personality traits doesn’t give the computer a consciousness, it does give the computer the ability to manipulate the consciousness of the user.

What happens when social media platforms begin to influence our subconscious choices? Are we, as consumers, to blame for the emotional manipulation caused by these applications?

https://www.vox.com/science-and-health/2018/3/23/17152564/cambridge-analytica-psychographic-microtargeting-what
