A Captive and Trivial Culture: Technology Ethics in a Tech-Consumed World
In the introduction to his 1985 book Amusing Ourselves to Death, Neil Postman compared the dystopias described in two novels: George Orwell’s 1984 and Aldous Huxley’s Brave New World.
Orwell, Postman said, warned readers that they would be overcome by an externally imposed oppression. Of Huxley’s vision, Postman wrote, “No Big Brother is required to deprive people of their autonomy, maturity and history … people will come to love their oppression, to adore the technologies that undo their capacities to think.”
While Postman’s book largely concludes that Huxley was right, experts on technology and social media development indicate that the two views may not be mutually exclusive.
“ORWELL FEARED WE WOULD BECOME A CAPTIVE CULTURE. HUXLEY FEARED WE WOULD BECOME A TRIVIAL CULTURE, PREOCCUPIED WITH SOME EQUIVALENT OF THE FEELIES, THE ORGY PORGY, AND THE CENTRIFUGAL BUMBLEPUPPY.”
In many ways, contemporary culture has become captive to the manipulative design and functions of technology, created by people who did not, as Huxley suggested, fail to “take into account man’s almost infinite appetite for distractions,” but rather took advantage of it.
And individuals, in turn, have been taken captive by their addiction to that technology.
Carissa Lintao is the owner of Apptuitive, an app marketing agency based outside of New York City. Working in the field of app design and development opened Lintao’s eyes to the importance of technology ethics in a largely self-serving industry.
According to Lintao, for many app developers, money is the end goal. The short- and long-term psychological and social impact of their app concepts is hardly on their minds.
“The people I’m dealing with right now are people that are like, ‘I have a billion-dollar app idea, here’s $30,000, I don’t care what it does to anyone or the social ramifications,’” Lintao said. “People are building social media platforms to monetize our attention and us. That’s basically it, when it comes down to it.”
And since attention can be monetized through technology, it is often manipulated through technology.
Tristan Harris, a former design ethicist at Google, said in a May 2017 interview on the Rubin Report that every app, media site, news site and podcast is competing for attention, and that many design and marketing decisions are made without ethical consideration of a product or platform’s ultimate impact.
“The best way (to compete) is to actually play … magic tricks on people’s minds and manipulate them so that you can get as much attention as possible,” Harris said in the interview. “And that … is where the responsibility lies.”
Brian Green, the assistant director of campus ethics at Santa Clara University — a school located in the heart of Silicon Valley, widely considered the epicenter of technology and innovation — said that while technology and social media began as positive opportunities with endless possibility, they have in many ways spiraled into darker places than intended or even considered.
“(Tech companies’) incentives are all wrong, because they’re getting money for our eyeballs looking at their advertisements that they’re giving to us,” Green said. “They figured out ways to start hacking the system and make it do what they want to do, whether spread misinformation or addict people and make them watch advertisements more often.”
“WHAT THE ADVERTISER NEEDS TO KNOW IS NOT WHAT IS RIGHT ABOUT THE PRODUCT BUT WHAT IS WRONG ABOUT THE BUYER.”
According to Lintao, the scrolling mechanism on a smartphone is similar in concept to the lever on a slot machine. A person’s hand stays put, the screen in front of them keeps changing and some type of a reward — maybe a dopamine hit from likes, comments or a certain connection — comes just often enough to keep them scrolling.
“It’s called a variable reward, so you keep going and going and looking until you’re satisfied, and you get the dopamine hit or whatever, and you’re on to the next thing,” Lintao said. “So that’s how it’s designed. It’s very open-ended, with no goal.”
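The mechanic Lintao describes is what behavioral psychologists call a variable-ratio reinforcement schedule: each pull of the lever (or flick of the thumb) has some chance of paying off, so rewards arrive at unpredictable intervals. As a rough illustration only — the simulation below is a hypothetical sketch, not code from any real app — the pattern can be modeled in a few lines:

```python
import random

def scroll_session(reward_probability=0.1, max_scrolls=1000, seed=42):
    """Simulate a variable-ratio reward schedule: each scroll has a
    fixed chance of paying off (a like, a comment, a match), so the
    gap between rewards is unpredictable -- the slot-machine pattern."""
    rng = random.Random(seed)
    gaps = []        # number of scrolls between consecutive rewards
    since_last = 0
    for _ in range(max_scrolls):
        since_last += 1
        if rng.random() < reward_probability:
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = scroll_session()
print(f"{len(gaps)} rewards in 1000 scrolls; first gap sizes: {gaps[:8]}")
```

Because the gaps vary every time, the user can never tell whether the next scroll will pay off, which is exactly what makes the behavior hard to stop. The probability and session length here are arbitrary, chosen only to make the pattern visible.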
Harris also used the slot machine analogy in his interview on the Rubin Report. Every time a person checks their phone, Harris said, they have a Facebook invitation or a newsletter or a text waiting for them.
And the fact that something could be there — whether that thing is actually good, bad or neutral — is enough to keep people checking their phones nearly constantly. According to a 2018 study from the global tech company Asurion, the average American checks their phone about 80 times a day, or roughly once every 12 waking minutes.
“It’s because … when you turn your phone over, it could be a match on Tinder with that hot (person) that you want to date, it could be that email that’s really exciting, it could be that text message from that person you really wanted to text you or it could just be nothing,” Harris said in the interview. “Every single time you turn it over you’re playing … sometimes you get good stuff, sometimes you don’t.”
Lintao said that most people don’t consciously know why they spend as much time on social media apps as they do.
“If you look at the apps that you do love — no one says they really love Instagram — you love Spotify, you love Nike Run, or Headspace because you have a goal,” Lintao said. “Social media is just an endless hunt for whatever.”
“AMERICANS NO LONGER TALK TO EACH OTHER, THEY ENTERTAIN EACH OTHER. THEY DO NOT EXCHANGE IDEAS, THEY EXCHANGE IMAGES. THEY DO NOT ARGUE WITH PROPOSITIONS; THEY ARGUE WITH GOOD LOOKS, CELEBRITIES AND COMMERCIALS.”
The ethical questions that technologists face are not simple. Even decisions regarding something as seemingly minor as notifications — which feel to most technology users entirely within their control — are made in the offices of technologists who must weigh the psychological and sociological impact of various design features.
“As we were designing the next version of Gmail … how do you, for example, decide, should you buzz people’s pockets when they get a new email?” Harris said in the interview with Rubin. “It’s a seemingly trivial decision. But … dinner conversations and families will either be disrupted or not. Someone will glance and see it and get sucked in or not, all because of this one decision in California (by) a handful of people in a room.”
Irina Raicu, the internet ethics program director at Santa Clara University, said that the impact of social media on democracy, through the political conversations people are having, is especially significant. Information was once screened and disseminated by gatekeepers, but now it’s a free-for-all.
And the clutter is not promoting critical thinking.
“Everybody is putting information on social media from a variety of sources, and there’s all this noise, and nobody knows who to believe anymore so they believe their friends, because we tend to trust the people we know more,” Raicu said. “But then we end up just reinforcing group beliefs rather than being exposed to others.”
Harris noted in the interview that Silicon Valley is a libertarian culture, where people “make their millions by their own self-will and their own success,” and consequently, the mentality is that “it’s always people’s individual responsibility how they use everything.”
“There’s this whole narrative that it’s up to us to choose how we use technology, and how many emails we send, and what we post on Facebook,” Harris said. “(But) we are persuadable on every single level, from the attention level to the trust level to the belief level. And so how do you ethically steer the phenomenology of someone?”
Raicu said that technology companies need to take responsibility for the ethical decisions they make.
“Some of the responsibilities (of tech companies) would be to just slow down and think ahead and consult with a varied group of people when they build something, so that they don’t get into their own bubble and think they understand the world,” Raicu said.
While tech companies do have significant power to manipulate and monetize the attention of their customers, Green said that companies and individuals often disagree over who is responsible for technology’s impact on society.
“The corporations say individuals should do the managing, and individuals say maybe if the technology wasn’t so addictive this wouldn’t happen, and then people say maybe the government needs to pass a law about it or treat it like an addictive drug,” Green said. “There are lots of ways where you could step in to try to do something.”
But tech ethicists are hopeful that things will change as individuals begin to hold technology companies responsible for the decisions they make. According to Green, more technologists are becoming aware of the negative impact of their technology, and actively responding.
“I think one of the things that’s interesting that’s happening right now is that the tech workers themselves have begun to wake up and realize that what they’re doing is having negative effects, and that they don’t want to be involved in making the world a worse place,” Green said. “They got into technology because they’re trying to make the world a better place.”
Green said that responsibility for ethical technology development and use ultimately lies with everyone involved, both the creators and the consumers.
“I think the answer is it has to be everyone,” Green said. “We have to try to build up our own discipline so that we can put down the smartphone or stop looking at the screen when we need to. And it would be nice if the tech companies could help us do that by making their product less addictive.”
“TECHNOLOGY ALWAYS HAS UNFORESEEN CONSEQUENCES, AND IT IS NOT ALWAYS CLEAR, AT THE BEGINNING, WHO OR WHAT WILL WIN, AND WHO OR WHAT WILL LOSE…”
Originally published November 19, 2019.