Hyper-compensation: Ted Nelson and the impact of journalism

Tow Center · 4 min read · Aug 6, 2014

NewsLynx is a Tow Center research project and platform aimed at better understanding the impact of news. It is conducted by Tow Fellows Brian Abelson, Stijn DeBrouwere & Michael Keller.

“If you want to make an apple pie from scratch, you must first invent the universe.” — Carl Sagan

Before you can begin to measure impact, you need to first know who’s talking about you. While analytics platforms provide referrers, social media sites track reposts, and media monitoring tools follow mentions, these services are often incomplete and come with a price. Why is it that, on the internet — the most interconnected medium in history — tracking linkages between content is so difficult?

The simple answer is that the web was never built to be *fully* connected. It’s an idiosyncratic, labyrinthine garden of forking paths, with no built-in way to navigate from one page to the pages that reference it.

We’ve spent the last few months thinking about and building an analytics platform called NewsLynx, which aims to help newsrooms better capture the quantitative and qualitative effects of their work. Many of its features are aimed at giving newsrooms a better sense of who is talking about their work. This seemingly simple feature, understanding the links among web pages, has taken up the majority of our time. The obstacle turns out to be a shortcoming in the fundamental architecture of the web. Without that shortcoming, however, the web might never have succeeded.

The creator of the web, Tim Berners-Lee, didn’t provide a means for two-way, contextual links in the specification for HTML. The world wide web wasn’t the only idea for networking computers, however. Over 50 years ago, an early figure in computing had a different vision of the web, one that would have made the construction of NewsLynx a lot easier today, if not completely unnecessary.

Around 1960, a man named Ted Nelson came up with an idea for linking pieces of information in a two-way fashion. Whereas links on the web today point only one way, to the place you want to go, pages on Nelson’s network would have a “What links here?” capability, so you would know all the websites that point to your page.
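To make the distinction concrete, here is a minimal sketch in Python of what two-way linking adds. The URLs are hypothetical, and the reverse index is our assumption about how a Xanadu-style system might store links, not a description of Nelson’s actual design.

```python
from collections import defaultdict

# What HTML gives us: each page knows only its outbound links.
outbound = defaultdict(set)
# What a two-way system adds: a reverse index answering "what links here?"
inbound = defaultdict(set)

def add_link(source: str, target: str) -> None:
    """Record a link in both directions."""
    outbound[source].add(target)
    inbound[target].add(source)  # this reverse entry is what today's web lacks

add_link("blog.example.com/post", "newsroom.example.com/story")
print(inbound["newsroom.example.com/story"])  # {'blog.example.com/post'}
```

On the one-way web, that `inbound` table simply doesn’t exist anywhere; rebuilding it is the crawling-and-indexing work that tools like NewsLynx, and companies like Google, have to do after the fact.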

And if you were dreaming up the ideal information web, this structure makes complete sense: why not make the most connections possible? As Borges writes, “I thought of a labyrinth of labyrinths, of one sinuous spreading labyrinth that would encompass the past and the future and in some way involve the stars.”

Nelson called his project Xanadu, but it had the misfortune of being both extremely ahead of its time and incredibly late to the game. Project Xanadu’s first and somewhat cryptic release debuted this year: over 50 years after it was first conceived.

In the meantime, Berners-Lee put forward HTML, with its one-way links, in the early 1990s, and it took off into what we know today. One of the reasons for the web’s success is its extremely informal, ad hoc functionality: anyone can put up an HTML page without hooking into, or caring about, a more elaborate system. Compared to Xanadu, what we use today is the quick-and-dirty implementation of a potentially much richer, but also much harder to maintain, ecosystem.

Two-way linking would not only make impact research easier but would also address a number of other problems on the web. In his latest book, “Who Owns the Future?”, Jaron Lanier discusses two-way linking as a potential solution to copyright infringement and a host of other web maladies. His logic is that if you could always know who is linking where, you could create a system of micropayments to make sure authors get proper credit. The idea has its own caveats, but it shows the kinds of systems that two-way linking might enable. Chapter Seven of Lanier’s book discusses some of the other reasons Nelson’s idea never took off.

The desire for two-way links has not gone away, however. In fact, the *lack* of two-way links is an interesting lens through which to view the current tech environment. By creating a central server that catalogs and makes sense of the one-way web, Google adds value with its ability to make the internet seem more like Project Xanadu. If two-way links existed, you wouldn’t need all of the features of Google Analytics. People could implement their own search engines, with their own PageRank-style algorithms, based on publicly available citation information.
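As a sketch of what that would look like, here is a toy PageRank computation in Python over a made-up three-page citation graph. With two-way links, the `links` table below would be public knowledge rather than something only a crawler could assemble.

```python
# Hypothetical citation graph: each page maps to the pages it links to.
links = {
    "a.example.com": ["b.example.com", "c.example.com"],
    "b.example.com": ["c.example.com"],
    "c.example.com": ["a.example.com"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively distribute rank along links until scores settle."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

print(pagerank(links))  # c.example.com ranks highest: two pages cite it
```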

The inefficiency of one-way links left a hole at the center of the web for a powerful player to step in and play librarian. As a result, if you want to know how your content lives online, you have to go shopping for analytics. To effectively monitor the life of an article, newsrooms currently use a host of services, from trackbacks and Google Alerts to Twitter searches and ad hoc scanning. Link-shortening services break web links even further: instead of one canonical URL for a page, you can have a bit.ly, t.co, j.mp, or thousands of other custom domains.
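One small piece of that monitoring work can at least be automated: collapsing shortened links back to a canonical address by following their redirects. Here is a sketch using Python’s `requests` library; the short URL shown is a placeholder, not a real link.

```python
import requests

def resolve(short_url: str) -> str:
    """Follow HTTP redirects and return the final URL a short link points to."""
    # Some servers reject HEAD requests, so a production version might
    # fall back to GET; this sketch keeps it simple.
    response = requests.head(short_url, allow_redirects=True, timeout=10)
    return response.url

# Different short links to the same article should resolve to one address,
# letting a monitoring tool count them as mentions of the same page:
# resolve("https://bit.ly/XXXXXXX") -> "https://newsroom.example.com/story"
```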

NewsLynx doesn’t have the power of Google. But we have been working on a core feature that would leverage Google features and other two-way-link-surfacing techniques to make monitoring the life of an article much easier. We’re calling them “recipes” for now (#branding suggestions welcome). In NewsLynx, you’ll add these recipes to the system and it will alert you to all pending mentions in one filterable display. If a citation is important, you can assign it to an article or to your organization more generally. We also have a few built-in recipes to get you started.
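To give a flavor of the idea, here is a hypothetical recipe expressed as a Python dictionary. The field names are illustrative only; they are not NewsLynx’s actual schema.

```python
# A hypothetical monitoring "recipe": a declarative description of one
# source to watch and how often to check it. Field names are made up
# for illustration, not taken from NewsLynx itself.
recipe = {
    "name": "Twitter mentions of our domain",
    "source": "twitter-search",
    "query": "newsroom.example.com",
    "schedule_minutes": 15,
}

# The platform would run each recipe on its schedule and funnel results
# into one filterable list of pending mentions, where an editor can
# approve a citation and assign it to an article or to the organization.
```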

We’re excited to get this tool into the hands of news sites and see how it helps them better understand their place in the world wide web. As we prepare to launch the platform in the next month or so, check back here for any updates.


Tow Center for Digital Journalism at Columbia Graduate School of Journalism