The evolution of the media landscape (and why you probably won’t finish this article)
By Joshua Leigh, Lead Visual Designer — Method London
At a time when many sectors are ripe for disruption, the publishing industry has been doubly affected. Not only is publishing itself being disrupted by new technology, new media and changing consumption habits, but the advertising industry, on which it has relied for so long as a business model, is also being disrupted. Failure to act quickly and foresee these changes has left publishing searching for a viable alternative model.
It’s a race to the bottom where eyeballs and clicks are the prizes. Fake news, bot-generated articles, echo chambers and walled gardens make up this new media landscape. How did we get here and how can we expect things to evolve next?
In the beginning…
When Gutenberg disrupted the scribing industry in 1440, he created the means for mass communication and paved the way for the Age of Enlightenment, the Renaissance and the scientific revolution. However, he could never have imagined the modern media landscape that we now inhabit.
While the printing press has benefited from improvements and technological advances over the years, from mechanization to digital presses, the most drastic innovation in publishing came with the advent of the World Wide Web. Where Gutenberg had made the replication of content possible, the Web democratized distribution, lowering the barrier to accessing information further than ever before.
In its infancy the breadth of content and the distribution channels were limited (remember the agonizing sluggishness of dial-up and the limited Geocities content?). However, we are now at a point where the Web is instant, pervasive and has more content than anyone could ever consume in a number of lifetimes. Yet the publishing industry is still wrestling with some fundamental questions: How do we profit from published content? How do we uphold quality and integrity amid a tidal wave of user (and bot) generated content? How do we ensure that readers can still have shared experiences rather than retreat into tailored, targeted, personalized bubbles?
This description of the impact of the printing press on society and education could just as easily describe the impact of the Web:
“The relatively unrestricted circulation of information — including revolutionary ideas — transcended borders, captured the masses … and threatened the power of political and religious authorities; the sharp increase in literacy broke the monopoly of the literate elite on education and learning and bolstered the emerging middle class”
However, we are now seeing this ‘unrestricted circulation of information’ threatened on a number of fronts. Corporate interests have the power to change both the web and how we access it. Net neutrality is now under threat and could pave the way to a tiered internet, with access to various kinds of content restricted depending on your location, subscription or service provider.
Monopolies also have the potential to change the way we consume content. GAFA (the tech giants Google, Apple, Facebook and Amazon) control a large part of how we consume content on the internet. They not only create their own walled gardens but filter and personalize our experiences, pandering to our tastes and serving us targeted advertising. To an extent, we see the content we want to see, based on the preferences our online behaviors imply.

People have always chosen to read news that supports their worldview, perhaps opting for a right-wing tabloid or a left-wing broadsheet. However, where once we could see what the other side was reading, tutting at headlines we disagreed with, this filtering of content now happens without our knowledge and behind closed doors. We can all read news stories via Facebook, but we have no idea what others’ newsfeeds look like, which stories they are reading and from which sources. Whatever the intention, it can’t be doubted that our personalized experiences lead to alternate realities for each of us. Should we really be surprised by unexpected election outcomes, or are they a symptom of our new distorted realities?
How have we arrived at this state of affairs?
Looking for a new business model
In the past, the business model of most newspapers revolved around the cover price plus the sale of advertising space and classifieds. Advertisers would pay a premium for a prominent position in the paper, safe in the knowledge that every reader would see the ad in the same place on the printed page.

The internet has since disintermediated published advertising. There is no cover price for online news (we’ll come to paywalls later), and programmatic advertising such as Google AdWords is predominantly controlled by the internet giants. Where once newspapers had an advertising sales department negotiating ad costs and placement, today most of that work can be automated. Ads are no longer sold on a publication-by-publication basis; rather, they are sold in packages that appear wherever their target demographic is reading on the web.

Newspaper sites can’t offer anything over and above what any other website can offer in terms of advertising real estate. In fact, advertisers would rather have their content nested in a site a user visits multiple times per day, such as Facebook, than rely on infrequent visits to a news site. News sites can’t compete with the social media giants on scale of traffic, and more often than not, content from all manner of sources is aggregated in a user’s social media feed anyway. On the web, users don’t read news in a linear fashion from one source, and advertising appears wherever and whenever a user is active.
Content is becoming more and more disposable, and nowhere is this more visible than in the advertising-driven sites vying for the eyeballs and clicks of the low-attention-span reader. The 24-hour news cycle has made us hungry for stories throughout the day; we demand an endless stream of news. With this constant barrage of content, quality and fact-checking inevitably suffer in the rush to output more and more stories. The fight for clicks is a race to the bottom in terms of quality, as can be seen in successful sites like Buzzfeed and the Mail Online, now the most popular news website in the world. The more eyeballs on a page, the higher the advertising revenue the site can generate. The extreme of this model can be seen in the advent of ‘fake news’: sites filled with bogus, salacious content, masquerading as news, designed purely to drive clicks and sell advertising.
The filter bubble
Gutenberg’s printing press allowed for the widespread dissemination of information. Multiple copies of the same work could be mass produced and readers could then have a discourse about a common point of information. The printed word was finite and unchanging and reading a book was a finish-able task.
Cut forward to today and we have access to more information than ever before. All of the world’s information is shared and accessible. Today the problem becomes one of curation and prioritization. With so much content being created, how do we know what to read and when?
GAFA have got this one covered. Their answer is to tailor content to each user based on their online behavior. While this may suit many users, its intention being to give users the best experience, it does raise some issues. Is it right to feed people with content that they desire? Are we supposed to enjoy reading the news? Are common points of reference important for society?
In recent years we have begun to see the effects of this targeting of content. 62% of US adults get news from social media, users often surround themselves with those of a like-minded outlook, and they prefer to read content that agrees with their worldview. This has created an echo chamber for many users, increased polarization between those of opposing views, and produced a personalized environment that shields them from certain realities. We can see the effects in recent elections in the US and UK which, for many, played out with surprising results. (See the books The Shallows and The Filter Bubble for more on this topic.)
Data driven content
With every click and scroll you make on the web, data points are generated and may be collected. Your actions are logged and services can then use these behaviors to tailor content toward you all over the web. Here in the EU we barely register the cookie warnings we click ‘Accept’ on, yet cookies are one of the means by which our behaviors are tracked. It’s not just your actions as an individual that are tracked but also those of the crowd. News sites can reorder and prioritize content based on what is proving most popular. In theory they can also use this data to create content they know will be popular: it will tell them the optimum article length, writing style, placement of images and more.
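To make the idea concrete, popularity-based reordering can be sketched in a few lines. This is a hypothetical illustration, not any site’s actual algorithm: the half-life value and the article data below are invented.

```python
import math

# Hypothetical popularity ranking: score each article by its click count,
# decayed exponentially by age, so fresh, heavily-read stories rise to the
# top of the page. The half-life and all article data are invented.
HALF_LIFE_HOURS = 6.0

def score(clicks: int, age_hours: float) -> float:
    """Click count halved for every HALF_LIFE_HOURS of article age."""
    return clicks * math.pow(0.5, age_hours / HALF_LIFE_HOURS)

articles = [
    {"title": "Election latest", "clicks": 900, "age_hours": 24.0},
    {"title": "Breaking: storm", "clicks": 400, "age_hours": 1.0},
    {"title": "Opinion piece",   "clicks": 300, "age_hours": 12.0},
]

# Highest score first: a day-old story with many clicks loses to a
# fresher story with fewer.
ranked = sorted(articles,
                key=lambda a: score(a["clicks"], a["age_hours"]),
                reverse=True)
print([a["title"] for a in ranked])
# → ['Breaking: storm', 'Opinion piece', 'Election latest']
```

Real systems would fold in many more signals (shares, dwell time, the individual reader’s history), but the principle of letting measured behavior reorder the page is the same.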
Video platform Netflix uses this kind of user data when creating original content. More than any entertainment studio before it, Netflix knows what type of content to produce, which director’s style will work, how a show should be edited and even how it should be scored. The question then becomes: should producers of entertainment, or of any type of content, be dictated to by the wisdom of the crowd?
Citizen journalism and user-generated content
The advent of the personal computer saw the introduction of desktop publishing. Users could for the first time create digital content and print it at home. However, it was not until the introduction of the web that users finally had the power to publish and distribute their content globally. At first this was through basic means such as newsgroups and bulletin boards, but as the web reached maturity we saw the rise of more advanced user-generated content platforms. At Wikipedia’s inception the concept of a user-generated encyclopedia seemed like a recipe for disaster, or at least a source of low-quality information. However, through the commitment of a dedicated number of contributors and editors (‘wikipedians’), the site is now the go-to source for information and often has more in-depth and up-to-date content than any traditional encyclopedia ever could.
Medium.com has become a go-to platform for writers of all backgrounds to share their articles. It provides tools for social annotation and commenting, and its community of users is a ready audience.
Established news sources and publishers aren’t necessarily going anywhere any time soon; they are still amongst the most trusted and widely read media out there. However, with a wealth of content being produced by amateurs as well as professionals, they are being squeezed from all angles and need to work hard to maintain their place. Some established players have embraced citizen journalism, offering space for readers to have their say, not just in below-the-line comments but with dedicated platforms of their own.
The web has now given all users the publishing power that once only journalists wielded. Anyone and everyone has the chance to make their opinion heard, for better or worse. How many of us now read social media posts and blog articles over professional journalism? Has this led to a lowering of our standards for critical writing? In the tsunami of content on the web, can we discern high-quality, well-researched and well-sourced journalism? In reading this article, you are reading the words of a blog post by a non-professional writer.
Bot-created and automatically generated content
The majority of industries will be subject to automation in some sense eventually and journalism is no exception. Artificial intelligence has made it possible for bots to compose content using Natural Language Generation. It’s quite likely that you’ve read a news summary or sports match report written by a bot and not known it.
Narrative Science and Automated Insights are just two companies that offer such services.
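The simplest form of such automated writing is template-based generation: structured data in, prose out. The sketch below is a toy illustration of the idea, with invented team names and scores; commercial systems layer far more sophisticated natural language generation on top of this data-to-text core.

```python
# Toy data-to-text generation for a sports match report. All names and
# figures are invented for illustration; real NLG services work from
# live structured feeds and much richer language models.

def match_report(data: dict) -> str:
    """Turn a structured match result into a one-sentence report."""
    margin = abs(data["home_score"] - data["away_score"])
    if data["home_score"] > data["away_score"]:
        winner, loser = data["home_team"], data["away_team"]
    elif data["away_score"] > data["home_score"]:
        winner, loser = data["away_team"], data["home_team"]
    else:
        return (f"{data['home_team']} and {data['away_team']} played out a "
                f"{data['home_score']}-{data['away_score']} draw at {data['venue']}.")
    # Vary the verb with the winning margin, a common data-to-text trick.
    verb = "thrashed" if margin >= 3 else "edged" if margin == 1 else "beat"
    return (f"{winner} {verb} {loser} "
            f"{max(data['home_score'], data['away_score'])}-"
            f"{min(data['home_score'], data['away_score'])} at {data['venue']}.")

report = match_report({
    "home_team": "Rovers", "away_team": "United",
    "home_score": 3, "away_score": 0, "venue": "City Stadium",
})
print(report)  # → Rovers thrashed United 3-0 at City Stadium.
```

Scale this across thousands of fixtures or earnings reports and the appeal to publishers is obvious: coverage that no human newsroom could afford to write by hand.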
The Washington Post (now owned by Amazon’s Jeff Bezos) has been experimenting with bot-written articles using an artificial intelligence technology they called ‘Heliograf’:
“The future of automated storytelling is the seamless blend of human reporting and machine generated content,” said Dr. Sam Han, director of data science at The Post.
Paywalls and the subscription model
If advertising sales are no longer a functional business model for newspapers and other publishers, then it seems that the subscription model is where many are headed.
This model is already used by other media companies: subscription-based services that aggregate content are the norm for both film and TV (Netflix) and music (Spotify). Individual publishers are now experimenting with different business models, and paywalls are now fairly common. Some services, such as Blendle, are trying to aggregate content from a number of news sources and charge a single subscription for access to all.
It remains to be seen whether paywalls and subscriptions to individual news sources can work in the long term. The implications of a ‘walled garden’ approach to news publishing are similar to those of the social media filter bubble. If news ends up moving away from the open web and if we all align ourselves to outlets that fit our worldview, are we not in danger of isolating ourselves from each other and splintering our realities further?
Who could afford to subscribe to a range of individual news sources from a variety of political outlooks as well as the many other online subscriptions we already have? Are we in danger of reaching peak subscription?
Authenticity, trust and provenance in publishing on the web
Although the platforms and methods of distribution of content have changed drastically over the past twenty years, the dominant sources of news are still those with most heritage and authority.
Readers trust professional news outlets more than social media but are still wary of biases inherent in each outlet.
When consuming news online the original source of a story can often be obfuscated, especially when an article is shared via social media or summarized elsewhere. On social media, headlines from a number of news sources may be placed alongside each other with no proportional prioritization for professional news outlets. Fake news can be given as much prominence as an article from the New York Times. In a recent UK study only 4% of users could correctly identify fake news stories.
This is something that platforms such as Facebook are now trying to solve. Facebook now allows news stories to be ‘disputed’ and flags them as such.
Authenticity on the web, particularly for news, is something that has yet to be fully solved. With an ever-growing quantity of content being generated, both by humans and their bot counterparts, verifying the quality and authenticity of news is more important than ever before.
Clarity and truth in the 24-hour news cycle
The speed at which we now consume news has also changed since the advent of the web. Cable news stations were first to bring us rolling 24-hour news. With more airtime to fill they generated more content and repeated cycles of stories until developments appeared.
Now with the web we can go one further. The web is always on: a story can break in any time zone and be reported globally, instantly. We don’t even need news outlets to report stories; a story is just as likely (if not more so) to break on Twitter or Facebook as on a news site. With most citizens now armed with a portable equivalent of a video camera and typewriter (otherwise known as a smartphone), anyone can report directly from the scene of a news story. We are all reporters. However, with this ability also comes a greater chance of misrepresenting the facts, be it unintentionally or otherwise.
‘Breaking News’ is a headline that is now an everyday occurrence, but it’s also something of which we should be wary. The facts rarely surface until the dust has settled, and panic and misinformation can now spread just as fast as information from a credible news source.
What approaches and systems can be put in place by news publishers to verify their sources and to keep a level of objective truth in the hectic environment of the 24-hour news cycle?
With the micro focus of the 24-hour news cycle short-term stories can be elevated in importance and the longer term macro stories can be lost. Are we losing sight of the forest for the trees?
Coda is an organization that provides a more long-term perspective for ongoing news stories. They put a team of journalists on a story and follow it even after the spotlight has moved away.
With our ever-decreasing attention spans, should the job of major news organizations be to provide oversight and a more long-term perspective on the news? Could the simpler, short-term reporting be left to bots, automation and citizen journalists?
Publishing is facing disruption on a number of fronts: technological advances, changing reader behaviors and a broken business model reliant on advertising (itself undergoing disruption). Each of these disruptive forces provides a chance to reconsider the future role of publishing and the news media. How can new technology allow for greater transparency and accountability? Could technology such as blockchain allow for better authentication of news stories and provide tools to prevent fake news?
Could new business models allow for reporting free from the corporate influence that advertising once held?
Will the automation of basic news reporting free journalists to think more deeply about the broader picture and provide valuable insight, giving meaning to the facts of a story?
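The blockchain idea above can be illustrated in miniature. The sketch below is a plain hash chain, not a full blockchain (it assumes invented article text and omits distributed consensus, timestamps and signatures), but it shows why chaining hashes makes published content tamper-evident.

```python
import hashlib

# Toy tamper-evidence sketch: chain each article's SHA-256 digest to the
# previous entry, so altering any published story changes its digest and
# every digest after it. Article text below is invented for illustration.

def chain(articles: list[str]) -> list[str]:
    """Return chained SHA-256 digests, one per article, in order."""
    digests, prev = [], ""
    for text in articles:
        h = hashlib.sha256((prev + text).encode()).hexdigest()
        digests.append(h)
        prev = h
    return digests

original = ["Story A: markets rise.", "Story B: storm warning."]
ledger = chain(original)

# A silent edit to an already-published story breaks the chain: anyone
# re-deriving the digests from the altered text will get different values.
tampered = ["Story A: markets fall.", "Story B: storm warning."]
print("tampering detected:", chain(tampered) != ledger)
```

The open question for news is not the hashing, which is trivial, but who maintains such a ledger and why readers should trust them, which is exactly where distributed approaches claim to help.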
Those that see these issues as opportunities to change, not threats, will be successful. Digital publishing is in its adolescence; how it matures will be shaped by our solutions to the problems highlighted above.