On Neo-Luddites and Optimists in the 21st Century Internet Age
Today’s Digital Revolution is a Story of Yin and Yang Opposites
Consider many of the technological innovations that have developed over a relatively short period of time and dramatically changed the way humans live and work today.
It has only been 28 years since Tim Berners-Lee invented the World Wide Web. The early 1990s ushered in the first smartphones, and it wasn’t until 2007 that the iPhone hit the mobile marketplace. GPS did not even begin to gain wide acceptance with the public until the mid-1990s. Google was founded in 1998, Facebook launched in 2004, Twitter came along in 2006, and the iPad was launched in 2010. IBM’s Watson demonstrated its powerful artificial intelligence on Jeopardy a little over seven years ago, in February 2011.
Rapidly accelerating twenty-first-century developments in such areas as big data, information technology, artificial intelligence, biotech, robotics, blockchain, cognitive computing, and machine learning continue to sketch out a fast-changing, not-too-distant future, one wrapped in opposing Yin/Yang energies that will shape whether humans thrive in comfort and peace — or not.
Perceptions Both Dark (Yin) and Bright (Yang)
The Chinese Yin and Yang philosophy, as noted on the Ancient History Encyclopedia website, dates back to the third century B.C.E. Its main tenets place everything within the two opposites of darkness and light.
Neither pole is superior to the other and, as an increase in one brings a corresponding decrease in the other, a correct balance between the two poles must be reached in order to achieve harmony. . . As expressed in the I Ching, the ever-changing relationship between the two poles is responsible for the constant flux of the universe and life in general. When there is too great an imbalance between Yin and Yang, catastrophes can occur.
The Yin (dark side) of technological advances suggests that the rapid rise of innovation has the potential to cause widespread dehumanization. For example, a modern version of Luddite-like thinking sees the digital revolution contributing to a rapid rise in depression, a glut of unhealthy social interaction, massive job destruction, unprecedented inequality gaps, and Armageddon on the near horizon.
The Yang (bright side) of technological advances suggests that the rapid rise of innovation always has, over the long run, brought positive change in which people work less and are generally happier and extraordinarily productive in all walks of life. These new technologies catalyze brilliant social and economic benefits for the majority of humans.
Are the “Yins” uninformed Luddites, and are the “Yangs” overly optimistic futurists? Can a balance be achieved that’s somewhere between these two opposites?
Progress or Destruction
“Technology is widely considered the main source of economic progress, but it has also generated cultural anxiety throughout history.” That’s the opening sentence of “The History of Technological Anxiety and the Future of Economic Growth: Is This Time Different?” — an essay by economists Joel Mokyr, Chris Vickers, and Nicolas L. Ziebarth, published in the Summer 2015 issue of the Journal of Economic Perspectives.
Is such anxiety justified today? From a historical point of view, Mokyr et al. say it is not, citing the nineteenth-century Industrial Revolution as one of many examples of positive technological change they present in their essay. However, they also concede that “it is surely not without precedent that the developed world is now suffering from another bout of such angst.” Still, at the end of their essay they make the following optimistic claim about mankind’s adoption of new technologies in the twenty-first century, which they say will give people more freedom to do what they want: “The long-term trend toward greater leisure will continue, and one can even imagine an economy that reaches the stage in which only those who want to work actually will work.” How great is that?
For an opposite point of view, Yuval Noah Harari, in his newest book, “21 Lessons for the 21st Century,” released in September 2018, asks, “are we on the verge of a terrifying upheaval, or are such forecasts yet another example of ill-founded Luddite hysteria?” His answer: “It is hard to say,” adding that this time things could be much different. For instance, there are sound reasons to believe twenty-first-century developments in machine learning and job automation “will be a real game changer,” with enormous and frightening job-loss potential that requires close scrutiny and vigilance. Harari claims:
The technological revolution might soon push billions of humans out of the job market and create a massive new “useless class,” leading to social and political upheavals that no existing ideology knows how to handle. All the talk about technology and ideology might sound very abstract and remote, but the very real prospect of mass unemployment — or personal unemployment — leaves nobody indifferent.
Overall, Harari paints a bleak picture of a new technology-disrupted world overrun by bad actors: messing with electronic finance systems, making the latest medical discoveries available only to the wealthy, pushing the boundaries of ecological collapse, screwing up politics and voting, manipulating power structures in ways that will make vast swaths of the world’s population irrelevant, and grossly intruding on people’s privacy.
Economics Professor Tom Lehman, in “Countering the Modern Luddite Impulse,” published in The Independent Review’s Fall 2015 issue, paints an opposite picture:
Since the shift to an information economy and the dawn of the “Internet Age,” pundits and scholars of all stripes have popularized the notion that advances in information technology embodied in robotics and automated production machinery will mean “the end of work” (the title of Jeremy Rifkin’s polemic published in 1995) and perhaps the end of civilization as we know it. Even some economists, a group who should be naturally skeptical of Luddite fallacies, are instead lending credibility to them.
New Versions of So-called Luddites
Some of those “pundits” and “scholars” have solid arguments that reflect a new and different kind of twenty-first-century Luddism. In The Guardian article “Will 2018 be the year of the neo-luddite?” Jamie Bartlett points, for example, to technology writer Blake Snow, author of the book “Log-Off: How to Stay Connected After Disconnecting.” Snow came up with the term “Reformed Luddism: a society that views tech with a skeptical eye, noting the benefits while recognizing that it causes problems, too.” Is this the kind of balance between the Yin and Yang viewpoints on technology adoption that needs to be practiced for peace and tranquility?
Snow wrote the following in “Log-Off”:
If we don’t stop to re-calibrate our relationship with these powerful, life-changing, but also life-hindering, technologies, in just a few years we’ll go from alarming habits to widespread burnout, debilitating attention spans, and universal digital anxiety disorder. We’ll become slaves to the very devices that were meant to free us.
Similar to Harari, Bartlett addresses a litany of concerns in The Guardian article: how social-media advertising is disruptive and unhealthy; how more people are turning their phones and computers off and disconnecting; how parents fret over their kids’ screen time; and how “there is a palpable demand for anything that involves less tech,” including a rise in what he calls “Unplugged Festivals,” which offer the chance of “switching off for the weekend … No wifi, no 3G, no traditional electricity.”
Take, for example, the new National Day of Unplugging movement, slated for March 1–2, 2019. The movement promotes “a 24-hour global respite from technology. It highlights the value of disconnecting from digital devices to connect with ourselves, our loved ones and our communities in real time.”
Working the Yin/Yang Opposites
“I am a neo-Luddite,” admits Engineering Professor Andrew Lau in “Last Word: Luddite With a Laptop,” an article published by ASEE Prism in 2007. But Lau might be better identified as a Reformed (Yin-Yang oriented) Luddite. As his title suggests, he depends on his laptop as an essential tool in his work as a teacher at Penn State University, even while confessing that his frequent laptop usage negatively interferes with his personal life.
Or consider Nicholas Carr’s popular essays and books. Carr is a regular blogger, but he is perhaps best known for lamenting that the Internet is fracturing people’s thinking, and for chronicling his own struggles in dealing with it. In the worth-repeating second paragraph of his widely acclaimed 2008 Atlantic article, “Is Google Making Us Stupid?,” he explained what too much Internet surfing was doing to his brain:
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going — so far as I can tell — but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages.
The popularity of his Atlantic article set the stage for several books that continued this critique, including “The Shallows: What the Internet Is Doing to Our Brains,” a finalist for the 2011 Pulitzer Prize in nonfiction. There he wrote that “Google [insert all search engines here] is, quite literally, in the business of distraction.”
Carr is joined by other Reformed Luddites. In “Argument One,” the first chapter of his book “Ten Arguments for Deleting Your Social Media Accounts Right Now,” released in May 2018, award-winning Computer Scientist Jaron Lanier, known widely for his work in virtual reality and Internet2, writes that the monitoring and collection of data through sophisticated algorithms, as practiced by today’s social media giants and other corporations, “is unethical, cruel, dangerous, and inhumane. Dangerous? Oh, yes, because who knows who’s going to use that power, and for what?” Harari voices similar sentiments throughout his book.
The Kids Aren’t All Right
In her long-titled August 2017 book, “iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy–and Completely Unprepared for Adulthood–and What That Means for the Rest of Us,” Psychology Professor Jean M. Twenge claims that post-Millennials (those born between 1995 and 2012) are experiencing unprecedented levels of poor mental health, due mostly to their over-reliance on digitized communications.
In a September 2017 article Twenge wrote for the Atlantic, she explains that “the twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever. There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives — and making them seriously unhappy.”
In the iGen book, she offers the following sobering analysis of iGen’ers:
They socialize in completely new ways, reject once sacred social taboos, and want different things from their lives and careers. They are obsessed with safety and fearful of their economic futures, and they have no patience for inequality based on gender, race, or sexual orientation. They are at the forefront of the worst mental health crisis in decades, with rates of teen depression and suicide skyrocketing since 2011. Contrary to the prevalent idea that children are growing up faster than previous generations did, iGen’ers are growing up more slowly: 18-year-olds now act like 15-year-olds used to, and 13-year-olds like 10-year-olds. Teens are physically safer than ever, yet they are more mentally vulnerable.
The Broadband Conundrum
While this prominent Yin-oriented theme stresses that everyone should disconnect from the digital world more often, completely disconnecting could be unhealthy in its own way. In the essay “What are the Consequences of Being Disconnected in a Broadband-Connected World?” published by the American Academy of Arts & Sciences, John B. Horrigan explains how being unable to access broadband has extraordinarily negative fallout. Horrigan, a Senior Fellow at the Technology Policy Institute in Washington, D.C., provides a long list of broadband-access benefits, yet he also notes that millions of Americans still have too few broadband-access options. While his essay was written in 2011, much of what he wrote still applies today.
The benefits of modern broadband Internet access stretch across numerous day-to-day activities. People enroll in online education courses and programs, read the news that’s important to them, devour less-expensive books through e-book readers, search and apply for jobs, conduct their banking over the Internet, apply for government services, renew their driver’s licenses, research important healthcare options, and interact with their doctors and nurses — all through broadband access.
“A lack of access to information means that people lose out on an opportunity to be more empowered and participatory in health care decisions,” Horrigan writes. “There is also evidence that broadband can improve the efficiency with which health care providers deliver care to patients.” He also states, in general, that there are two ways to look at the notion of digital exclusion in today’s society: For individuals, who “miss out on information” that’s important to their lives; and for society, whereby it becomes important to support cost-effective broadband delivery “in order to meet the needs of a shrinking minority of people who do not have access to broadband, or who have access but lack the skills to use it.”
At the same time, Horrigan says, “it is important to resist the temptation to think about broadband as something everyone has to have. A necessity is not a requirement.”
Ironically, the U.S. home-based broadband-access statistics from 2009–10 that Horrigan presented in his essay remain virtually unchanged. He cites a Pew survey published in April 2010 that found 66 percent of Americans had broadband at home; an “Internet/Broadband Fact Sheet” published by Pew in February 2018 notes that “today, roughly two-thirds of American adults have broadband Internet service at home.”
Our Well-Being at Stake
The Pew Research Center has a large body of work focused on the Internet. In an April 2018 Pew report, “The Future of Well-Being in a Tech-Saturated World,” Pew researchers Janna Anderson and Lee Rainie reveal similarities to the Yin/Yang theories presented here. They write:
A plurality of experts say digital life will continue to expand people’s boundaries and opportunities in the coming decade and that the world to come will produce more help than harm in people’s lives. Still, nearly a third think that digital life will be mostly harmful to people’s health, mental fitness and happiness.
One of the many interesting quotes from the variety of Digital Age experts cited in their report comes from Rob Reich, professor of political science at Stanford: “The massive and undeniable benefits of digital life — access to knowledge and culture — have been mostly realized. The harms have begun to come into view just over the past few years, and the trend line is moving consistently in a negative direction.”
And then, to end on a high note, there is of course the other side of the equation, courtesy of Daniel Weitzner, founding director of MIT’s Internet Policy Research Institute: “Human beings want and need connection, and the Internet is the ultimate connection machine… I have to feel confident that we can continue to gain fulfillment from these human connections.”
Originally published at uxyzblog.com on October 12, 2018.