Hacking the Attention Economy, Amplifying Discord and Hate
Code Societies 2018, Day 3, with danah boyd
Blog post written by stud1nt
danah boyd is a Principal Researcher at Microsoft Research, founder of the research institute Data & Society, and a visiting professor at New York University’s Interactive Telecommunications Program. Her work broadly covers the intersections between technology and society.
danah begins our class by explaining that, living and working inside the tech industry, she can see from the inside where and how it can go wrong.
Part One: Algorithmic Influence
Part one of her presentation covers algorithmic influence. danah recounts how “big data” had its period as a buzzword but fell out of favor about 4–5 years ago as the term became increasingly associated with Big Brother and surveillance. In turn, the new buzzword became “AI”, but danah points out that AI reasserts and reinforces the specific power structures of the current moment.
danah shares the work of Latanya Sweeney, a researcher and former chief technologist for the FTC, whose research shows that Google searches for names associated with blackness are more likely to have ads related to criminal justice than names associated with whiteness. danah points out that Google actually learned these prejudices from a specifically American context and American citizens’ clicks!
danah shares another example of discrimination in SEO: categorization without a theory of power. She does a Google search for the word baby, and the results return a noticeable homogeneity of stock photos featuring perfectly cute, white babies. She asks us to consider why this is overwhelmingly the case in English and other languages. We learn that stock photo companies use metadata and tagging to track which photos get used in presentation decks.
Then, danah puts us in the driver’s seat and asks how we would combat the “baby” problem. After considering the multiplicity of factors that converge in SEO, we realize we have a networked accountability problem. We ask: Which images sell better? Who defines representation? Where does responsibility lie? How do we untangle the biases that are formalized through technological processes? More questions than answers!
Part Two: Hacking the Attention Economy
In part two, we discuss how attention on the internet can be controlled and manipulated for the worse by groups who strategically trick the media into amplifying their message. We discuss the paranoia about sexual predation in 2008 (see: To Catch a Predator) and the emergence of 4chan, the culture of trolling for the sake of it, and the “Internet Hate Machine”.
danah refers us to Whitney Phillips’s book This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture and her Data & Society report “The Oxygen of Amplification: Better Practices for Reporting on Far Right Extremists, Antagonists, and Manipulators”.
In all of this, the lesson is clear: we have to understand and then hack the attention economy in order to change the broader media landscape.
Part Three: Finding the Red Pill
danah invites us to explore how people are invited to destabilize the world that they know and look for alternative explanations. She refers to Neo’s choice in the 1999 classic The Matrix between the blue pill (back to regular reality) and the red pill (falling down the deep, dark rabbit hole).
In a society where people have become increasingly, if not completely, disconnected from and mistrustful of conventional media (“the news”), it is easy to use the media against itself to radicalize people. danah describes a “boomerang effect”: when the news negates a topic or idea, the public becomes more likely to believe it. In addition, “red pills” are planted across media from Wikipedia to radio, encouraging everyday people and journalists to search for specific terms that have been optimized to lead to destabilizing alternative sources of information. For example, the hashtag #crisisactor was planted in hundreds of posts spread in the wake of the Parkland, FL school shooting, falsely alleging that survivors of the shooting were paid actors.
We discuss how fundamentally destructive these tactics can be in terms of radicalized violence. Dylann Roof, the white supremacist and mass murderer convicted of the Charleston church shooting, did a search for Trayvon Martin and George Zimmerman and found the term “black on white crime” on the case’s Wikipedia page. This “red pill” fostered his radicalization.
In an effort to make sense of how things are amplified, we consider how news gets made. For instance, there were few searches for Sutherland Springs before the shooting occurred; when media companies identified the spike in searches, they turned to Reddit and Twitter to gather information. Terms that have been SEO-optimized to convey a particular message can trick journalists into reporting an agenda rather than the truth. So, who is manipulating search terms? It turns out that foreign governments and the ideologically motivated can share the same tactics without sharing an end goal.
danah brings up the fact that YouTube is the primary search method for people under 20. She asks us to do a YouTube search for the terms “social justice”, “intersectionality”, and “how to get laid”. The top results come from Prager University, an evangelical, non-accredited institution. The videos are extremely high quality and present insidious “red pill” messages meant to activate and radicalize people with power by convincing them that they are powerless.
How do we work against “redpilling” and media manipulation? We cannot just recreate and renarrate. We have to combat SEO, too!
Part Four: The Consequences of Amplification
What is the cost of amplifying destructive messages? Amplification of fear functions like a game, eroding trust in public institutions and disseminating distorted information. We consider two examples in class: suicide and the 2020 census.
We already know the effect of publicizing suicide in graphic detail: it increases rates of copycat suicides. Yet shows like 13 Reasons Why still exist, and the media continues to report celebrity suicides in a way that seems to highlight rather than obscure the destructive details.
With regard to the 2020 census, danah asks us to think about whether or not we should encourage self-reporting in the wake of the Trump administration and ICE. She points out that the government does not need the census to identify undocumented folks: cell phone records could do that much more effectively, and spinning the census into a threat simply breeds false fear.
Part Five: Epistemological Warfare
danah asks us to reflect on how we think about the production of knowledge as a field. She refers to Francesca Tripodi’s “Searching for Alternative Facts: Analyzing Scriptural Inference in Conservative News Practices” to show that how people construct reality is both culturally specific and deeply rooted. In this study, Tripodi looks at the history and practice of Bible study as an exercise in hermeneutics that extends beyond the explicitly religious to the Constitution, the tax reform bill, and the search results of Google.
danah prompts us to think about who designs and architects cultural logic so that we can ask questions of Google, not with Google.
“Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing.” - George Orwell
Part Six: Rebuilding the Social Fabric
danah highlights the fact that many projects across American history have focused on renetworking sociality. Paradoxically, the military became a place where the American social fabric was knit, uniting individuals with diverse backgrounds from all over the country for a common cause. Contrary to what we’re led to believe, women entering the workforce in the 1970s changed labor as much as automation and technology. Women had previously functioned as the glue that held local social networks together, and as they left, the social fabric that held our communities together began to unravel.
In its infancy, the internet was a place that helped people work through identity and build community. danah reflects on her experience of making an online friend as a young adult who helped her understand gender and sexuality in a safe space. In contrast, danah calls attention to the weaknesses of Dan Savage’s “It Gets Better” campaign, in which LGBTQ youth posted videos on YouTube about their experiences only to receive hateful comments instead of support in response to hyper-public vulnerability.
danah asks us to think about how we can get back to holding people through the internet.
Exercise: Networked Holding
People use technology to cry out for help. To seek connection. To find support. They often self-segregate. They engage in tribal acts. How would you build networks that allow people to actively ‘hold’ others in pain?
As a class, we respond to this prompt with suggestions of using the comments section to reach out to people in distress and offer simple conversation (“Hey, do you want to talk?”) or volunteering to help someone else fill out the 2020 census on their phone.
Part Seven: Towards Accountability
Perhaps the most recurrent theme in our discussion is the issue of accountability. How do we unpack and understand the distribution of responsibility in large-scale institutions? Does accountability happen at the level of the data point?
Again, the issue of criminal justice arises as one of the most relevant and polarizing issues around accountability. As more and more technology is used to produce, predict, and identify “criminals”, how can we think about these large-scale systems in terms of accountability? What is the performance metric, what is being optimized for, and under what conditions? These questions make us rethink the purpose of criminal justice: is it purely deterrent and punitive, in line with the use of data analytics and predictive policing, or is there an unaddressed systems-level issue, namely the unequal distribution of resources and opportunity that prompts people to act and also makes them more likely to be caught?
Part Eight: Content Moderation
danah poses the challenging question of who should moderate explicit content, in particular CSAM (child sexual abuse material). Most of this moderation work is outsourced to the Philippines, where strong religious affiliation with Catholicism purportedly facilitates this kind of mentally and morally draining work.
Is an algorithm the best “person” to moderate child pornography? If so, what are the rulesets and how do our expectations come undone? Do we prefer fewer or more false positives and false negatives? How does this relate to specific cultural contexts? Who gets to define what is hate speech? How do we deal with the fact that anytime we try to regulate something, it mutates and morphs into another thing to be detected? To make things more complicated, danah introduces the concept of the “moral crumple zone” where individuals serve as the scapegoat for culpability when algorithms and machines fail. How and where is humanity empowered versus perverted in systems of moderation?
Exercise: Stopping Hate
Again, danah puts us in a position of power: to define what types of hateful images and language are forbidden, create a set of protocols for what’s acceptable and unacceptable, and describe the process by which our organization will do this “right” by the workers and communities affected.
danah shares the pros and cons of an artisanal model (“I know it when I see it”) versus a factory model (algorithmically automated). As a group, we realize that it is particularly challenging to define what is hateful outside of our own contexts and that the onus should not necessarily be placed on those affected by bias and discrimination.
Part Nine: Reflection
To close the class, we reflect on how we as coders and users have tremendous power within the system. What does it mean to be an active agent and have agency over the way we interact with each other and these systems? In relations of power, what does healthy masculinity and healthy whiteness look like? How can we contribute proactively and productively to our network environment?