photo: Globalstock

Filtering The Howl

The way we consume information is turning us into a fatberg of distraction. Is there a way to turn the tide?

Published in King Features Weekly
Sep 1, 2015 · 13 min read

--

By David Cohea

Learning how to return clarity of thought to our distracted, disorganized, hyper-engaged and under-rested brains is perhaps the greatest challenge for media consumers of the 21st century.

It doesn’t help — at all, it turns out — that the digital tsunami that bounds and hounds our waking hours demands that we stay online all the time, constantly switching tasks as our attention is beckoned hither and yon. The chatter of multiple devices and channels and media all going at once is deafening, and it’s hard to escape the roar.

How much so? According to a 2011 study, we ingest on a normal day the equivalent of about 174 newspapers’ worth of information — five times more than we did in 1986. Can’t envision what such a stack of newspapers would look like? Consider this: a 2013 study at the San Diego Supercomputer Center estimates about 9 DVDs worth of data (63 gigabytes) is asked for and dumped into mobile devices per person per day.

Wielding the cognitive scythe through this is our attentional system, deciding what to focus on in every next moment. Multi-tasking is a spiffy way to say we’re swingin’ that scythe hither and yon as we wade through golden fields of data, watching this, texting that, listening to something cool in iTunes while browsing videos on YouTube, playing a game over there while trying to read this piece here.

Multi-tasking may be the dominant dance of the digital age, but as a management style it’s not effective in getting things done. One study by the Institute of Psychiatry at the University of London looked at 1,100 workers at a British company and determined that multi-tasking with electronic media lowered human IQ more than pot-smoking or missing a night of sleep.

What’s happening? As Daniel Levitin writes in The Organized Mind, the peril of multi-tasking begins with a befogged brain:

Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion. Earl Miller, a neuroscientist at MIT and one of the world experts on divided attention, says that our brains are “not wired to multitask well … When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.” So we’re not actually keeping a lot of balls in the air like an expert juggler; we’re more like a bad amateur plate spinner, frantically switching from one task to another, ignoring the one that is not right in front of us but worried it will come crashing down any minute. Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.

Making things much worse, multi-tasking is also addictive. Levitin writes, “the prefrontal cortex has a novelty bias, meaning that its attention can be easily hijacked by something new — the proverbial shiny objects we use to entice infants, puppies and kittens.” Drawn like moths to fire by the prefrontal cortex, we eagerly look to the next link; multi-tasking from there, we are doomed to repeat the distraction again and again and again. It’s like the alcoholic who is obsessed with the very thing that sets up all the falling.

Appetite for destruction, indeed.

* * *

If you work in media, there’s no easy way to fend off this frontal assault. Every day awaits with an email inbox brimming with messages and link-flush newsletters, and a wide-eyed Facebook feed scrolls down to infinity. The course that leads to the day’s deadlines is fraught with digital candy at every step of the way.

It doesn’t help that digital disruption spreads enormous anxiety around it, a gauzy-white mushroom field of bad statistics and doomy prognostications. Trailing vast streams of anxious pixie-dust, we flail away, digging deeper into the future, hoping to claw out some purchase, any purchase, on a secure future, only to get mired ever deeper.

* * *

And what happened to the news while we were getting so distracted? Does our disconnect help account for the scattershot, deer-in-the-headlights imprint of the news upon most consumers of news? When a clown of a candidate looks to clear the fences on a disruptive candidacy, does anyone in cyberspace scream? If the stock market falls a thousand points in a week; or a thousand migrant souls drown off the shores of Greece; or a 2-million-year-old glacier melts and fades completely away: How would we know the world had fallen apart, what with our gaze glued fast to a screen?

That part — screen fixation — seems to be the biggest part of the problem.

Adam Westbrook of The Memo suggests that web browsers are a big part of the problem; they encourage browsing links and tabs in an endless assault of new information. Who digests anything that way, when there’s always something next to view?

Is it addiction to the new — to endless stimulation — that has children running amok on ADHD, students guzzling energy drinks to get course work done, and workers gobbling Adderall and Ritalin to keep up with soaring high-tech deadlines and work schedules? A 2013 report by the federal Substance Abuse and Mental Health Services Administration found that emergency room visits related to nonmedical use of prescription stimulants among adults 18 to 34 tripled from 2005 to 2011.

What happened to the world while we were feeding the frenzy?

* * *

Has our perspective on the news been altered by the way we are devouring it? Levitin cites research showing that knowledge acquired while multi-tasking may be uploaded improperly into the brain:

Russ Poldrack, a neuroscientist at Stanford, found that learning information while multitasking causes the new information to go to the wrong part of the brain. If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialized for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organized and categorized in a variety of ways, making it easier to retrieve. MIT’s Earl Miller adds, “People can’t do (multitasking) very well, and when they say they can, they’re deluding themselves.” And it turns out the brain is very good at this deluding business.

Reminds me of an episode of Nova where scientists were testing fish in CO2-rich waters to see if their brain chemistry was being affected by the ocean’s growing acidity. Sure enough, a fish known to hug coral reefs for protection was instead lured out by the smell of a nearby predator. Its natural circuitry had been turned upside down, so it could no longer instinctively swim in the right direction.

Are we headed exactly where we oughtn’t to be going, trapped in distraction’s tractor beam?

* * *

Digitally disrupted platforms speed in the direction of distraction, finessing the operation, making it ever-easier to commit the same sins.

If Web 2.0 is a product of the distractions of Web 1.0, is it more of a cage for flighty thought? And will Web 3.0 be exponentially worse on consciousness than its predecessors?

Does the aroused brain seek out bad news? Outrageous news? Patently false news? Addicts seem to indicate this. Last week, there were 12 heroin overdoses in the same housing complex within 24 hours; three died. What was so convincing about the white whisper that the entire herd ran off the cliff?

* * *

A centuries-old demonstration of distraction called the thaumatrope shows that when two objects are flashed before the eyes in rapid succession, the combination of images forms a single impression, as if the two items were one concept. Paint a bird on one side of a card and a cage on the other, spin the card quickly, and you get a bird in a cage.

What does the brain think when it’s juxtaposing things too quickly? What happens when we see Ice T and then ISIS? The Dow Jones and Ashley Madison? Donald Trump and Red Bull? Grumpy Cat and Tim Cook? Katy Perry and Hillary Clinton?

What kind of lead-headed, rambling, punk-greyed new world is devolving from the litter of mash-ups like these happening 24–7–365?

What year is it? And whose Earth are we losing?

* * *

The absolute inevitability of the slide into digital technology and digital life makes resistance seem old-school, if not flat-out stupid and wrong. James McQuivey says as much in Digital Disruption: Unleashing the Next Wave of Innovation:

Digital tools allow digital disruptors to come at you from all directions — and from all ages, backgrounds and nationalities. Your competitors probably won’t come from within your industry — they could come from any industry, or from one that doesn’t exist yet. Equipped with a better mindset and better tools, thousands of these disruptors are ready to do better whatever it is that your company does. This isn’t just competitive innovation, it’s a fundamentally new type and scale and speed of competitive innovation. And it will totally disrupt your business, even if that business has nothing to do with digital. Because the mindset may be digital and the tools will definitely be digital, but that mindset and those tools can be used to disrupt any industry faster than an old disruption could. (7)

It’s why Google and Amazon and Apple get no competition from the news business they disrupted. It’s probably why the days of print newspapers are so numbered.

But as legacy … pre- or extra-digital — there may be something yet to bequeath …

* * *

It’s not that there aren’t ways to resist or slow the slide, ways in which resistance may be quite beneficial for journalism.

According to Levitin, human consciousness evolved on a hierarchy of attentional filters that help us to focus on just one thing at the moment. Highly successful — and wealthy — people often hire layers of people to handle everything but their most important task. The job of these people is, as Levitin puts it, “to narrow the attentional filter.”

Here are some unrelated ways such work can be done:

  • A new noise-filtering technology by Meyer Sound Laboratories is becoming a hit in restaurants, providing diners a meal without all the barking, laughing, crashing, tittering distraction from other tables. Diners are allowed to pay attention to only what counts most to them. Story in The New Yorker here.
  • Just as ad-blockers are coming into the fray to reduce some of the most annoying attentional noise in feature-rich browsers, other filters restricting peripheral attention-grabbers may find their way into much-simplified browsing platforms. Rather than all-in-one, one thing at a time: such limitations work exceedingly well in blog platforms like Medium.
  • Another thing brain researchers have discovered is that the brain, like many other organs, cycles through periods of activity and rest. One suggestion for getting optimal performance from the brain is to alternate periods of intense, focused activity (up to 90 minutes) with down-times of rest (2 to 15 minutes).
  • A slight variation on that is the cycle Levitin identifies between the attentive brain (this is how pyramids get built) and what he calls the “daydreaming mode,” where the mind wanders, making disparate connections, enabling creativity. He suggests that rather than switching endlessly between items calling for our next attention (fatiguing the brain), the day should be partitioned into periods of focused activity and equal-sized periods of mind-wandering, allowing the brain to recharge.
  • Digital tools can help with digital overload: email rules and filters, Google news alerts, and Twitter lists help automate repetitive tasks (a minimal sketch of one such filter follows this list). Learn how to conduct more specific Google searches. Slack is a messaging app that’s great for teams as well as for keeping track of projects. Use your LinkedIn network to work with the professionals you need to keep in touch with most often.
  • Get away from the wow-factory of browsing; try reading offline. Pocket (getpocket.com) is a great way to save stories from the Web to read at a later time; they can be organized by tags and printed out (if you can afford the paper) in a very reader-friendly format. (Often when there’s tons of clutter on a web page that has an article I like, the Pocket version is single-story focused.) Create a reading pile and work through it.
  • Slow it down. There’s nothing wrong with longform other than you can’t be doing a lot of things while you’re reading it. Getting through a 10,000-word article is like training for longer endurance tests. The brain stays focused, on target. Great piece by Michael Blanding for Nieman Reports here.
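
To make the email-filter idea above concrete, here is a minimal sketch, not from the original piece, of how such a rule could be automated with Python’s standard imaplib. It sweeps unread newsletter mail out of the inbox and into a read-later folder so it stops begging for attention; the server, credentials, folder name and sender addresses are placeholder assumptions.

```python
import imaplib

# Hypothetical account details and a read-later folder -- adjust to taste.
IMAP_HOST = "imap.example.com"
USER = "you@example.com"
PASSWORD = "app-password"
READ_LATER = "Read Later"
NEWSLETTER_SENDERS = ["newsletter@example.org", "digest@example.net"]

def sweep_newsletters():
    """Move unread newsletter mail out of the inbox into the read-later folder."""
    with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
        imap.login(USER, PASSWORD)
        imap.select("INBOX")
        for sender in NEWSLETTER_SENDERS:
            # Find unread messages from this sender.
            status, data = imap.search(None, "UNSEEN", "FROM", f'"{sender}"')
            if status != "OK" or not data[0]:
                continue
            ids = b",".join(data[0].split())
            imap.copy(ids, READ_LATER)              # file a copy for later reading
            imap.store(ids, "+FLAGS", "\\Deleted")  # and clear the originals
        imap.expunge()

if __name__ == "__main__":
    sweep_newsletters()
```

Run something like this from a scheduler once or twice a day and the newsletters become a reading pile instead of a steady drip of interruptions.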

* * *

Baby Boomers are the transitional generation where most of the collective cost and bleeding of disruption is to be found.

Certainly, the rules for success in the digital marketplace are vicious and exclusionary. Amazon may well rule the media world (as Facebook rules the platform by which most of us do our daily digital commutes), but there can only be one of anything on top of any digital world. What a difference between Amazon’s stock price (currently $512 a share) and legacy newspaper also-ran McClatchy, which recently went on a buying spree of its own stock and paid down some $23 million in debt in order to buoy its stock price up to $1.26 a share.

Maybe our present bloody evolution will eventually produce sustaining results for the species. Some people, for example, actually excel at multi-tasking — or, at least, their performance doesn’t seem to be harmed by it. Super-Taskers: why not? We have humans with extraordinary senses of taste or smell, and hyperthymesiacs who can recall, on a moment’s notice, what they were doing the morning of April 22, 1983. (Unfortunately, while they didn’t actually perform better than the control group, the brains of Super-Taskers ran much cooler going about the tasks, signaling higher neural capacity.)

Developments in hive-mind — networked consciousness — present another solution to the multi-tasking abyss by parceling out attentive tasks to individual brains in the network. The same idea has proven valuable for distributed computer networks. The SETI@home project uses Internet-connected computers (1.5 million to date) to help process data collected from radio telescopes scanning the sky — each computer taking a small slice — for signs of extra-terrestrial life. Distributed intelligence may be the answer to the human multi-tasking challenge as well, limiting each individual to a simple binary activity of yes/no. Not sure anyone would want to limit their attentional feed that way, but then maybe notions of the individual are part of cultural disruption. (Great piece on the challenges and opportunities of hive consciousness in this essay by Peter Watts.)

But for those of us who can’t (or simply won’t) keep up with the times — especially as they accelerate toward speeds faster than human intelligence — the new normal, I fear, will be obsolescence: what we used to call the merely digitally disrupted. Obsolete has such a ’50s ring to it, all of those fears of humans being replaced by automation when computing first hit the workplace. Only now, half a century later, does the term show real teeth in tearing apart a career.

How much different, really, is the cultural memory hole created by disruption from the individual affliction of Alzheimer’s, estimated to increase by 40 percent in the next ten years and affect some 7.1 million Americans? It’s the same sort of absolute removal from the mainstream that the permanently unemployed — copy editors over age 60, say — might feel.

Medical solutions may be on the horizon for sufferers of Alzheimer’s. Researchers have found a way to artificially stimulate the hippocampus to get it to recall memories. But are there equivalent solutions for recalling the digitally departed from the cultural memory hole?

Without aids like these to assist with Alzheimer’s, society could easily divide between “those who can recall who the President is and those who cannot,” writes Charles Leadbeater in his essay “The disremembered.” “People lucky enough to have a fully functioning memory find themselves [cast] into the roles of carers and keepers, controllers and jailers: it will not be pleasant for them either.”

Is that what we suffer on a grander scale now, with society dividing between the connected and the cut? And is it the culture’s responsibility to maintain a tether to the disrupted, to not let them simply float off into space?

Meanwhile, the mass and its media wallow along like that 15-ton fatberg discovered bumping along in the London sewers, ever picking up more junk for the trunk. Porn packages sometimes now top 1 terabyte in streaming downloads, and just wait until the Internet of Things joins the Streaming Flood in the bandwidth. Will such obesity eventually shatter all the benchmarks?

So much to pay attention to, so little time. Now more than ever, it would seem, focus is the irreplaceable commodity for avoiding the future no one wants to fall into. It is quite possible that we’ll get so caught up in the switching of tasks at hand, distracted so by the rhythmic sound of changing gears, that we begin to think that that sound is all there is to living. It will be, if we allow ourselves to be fooled or lulled into confusing digital tools with digitism, the philosophy that the only path to the future is through servitude to the laws of digital disruption.

In a powerful essay, “Among the Disrupted,” Leon Wieseltier says that resistance may be the true and perhaps only humanistic response to digital disruption:

All revolutions exaggerate, and the digital revolution is no different. We are still in the middle of the great transformation, but it is not too early to begin to expose the exaggerations, and to sort out the continuities from the discontinuities. The burden of proof falls on the revolutionaries, and their success in the marketplace is not sufficient proof. Presumptions of obsolescence, which are often nothing more than the marketing techniques of corporate behemoths, need to be scrupulously examined. By now we are familiar enough with the magnitude of the changes in all the spheres of our existence to move beyond the futuristic rhapsodies that characterize so much of the literature on the subject. We can no longer roll over and celebrate and shop. Every phone in every pocket contains a “picture of ourselves,” and we must ascertain what that picture is and whether we should wish to resist it. Here is a humanist proposition for the age of Google: The processing of information is not the highest aim to which the human spirit can aspire, and neither is competitiveness in a global economy. The character of our society cannot be determined by engineers.

Digital disruption is consuming us so fast that there may be no way to escape it. However, we can, as Wallace Stevens suggested, resist it “almost successfully,” producing some cultural distance from which to view it and still call it for what it merely is. How? Try reading this offline.

Go slow.

Resist.
