Deep Links Won’t Solve Dark Matter

Search engines solve a fundamental problem for humanity: how to find the information we are looking for. Almost everything is accessible because search is ubiquitous and information is universally searchable. Tell Google what you want, and Google will find it for you. Search engines have been effective in delivering on this promise, which is why, for the last 15 years, most online content has been designed to be found by a search engine.

This paradigm is being challenged today. Search engines are no longer the primary portal to the rest of the web. An increasing share of our information is no longer found by searching for it; it is more likely to be recommended by a friend or “discovered” in a feed. And that recommendation or discovery is more likely to happen on a smartphone than on a computer. The rise of the smartphone and of social networks has placed an enormous amount of information inside mobile apps, social networks, and modern web architectures, where it cannot be parsed by search engines and is therefore beyond our reach. This information is collectively called “dark matter,” and it is metastasizing rapidly. Dark matter threatens to take information off the open web and lock it inside private internet fiefdoms. It could spell the end for search, and for the internet as we know it.

Deep linking has been promoted as a fix for dark matter. A technology that has existed for more than a decade, deep linking has become the primary workaround that allows search engines like Google to index content locked inside a native app, a game, or a social network.

Deep linking is indeed Google’s own solution to dark matter. Its new “post-app” product, Android Instant Apps, communicates with the Google search engine using deep links. With Android Instant Apps, developers can send users to any part of their app via a web link, even if they haven’t downloaded the app (the feature is only available on Android devices). More recently, Snapchat announced plans to offer deep linking as well. Both moves came after Apple started offering deep linking in 2015.

The trouble is that deep linking is a flawed and inefficient solution to the existential problem of dark matter. It is a band-aid on an open wound, and one which may ultimately exacerbate the problem.

Deep links essentially create a “CliffsNotes” version of a site that’s indexable by search engines. In other words, they work much like metatags did for websites in the early days of the web. Back then, second-tier search engines like Excite and Lycos saved money by categorizing sites based on those metatags, descriptions in the header of an HTML file, instead of the webpage itself. That meant, for all practical purposes, that a nefarious website creator could provide “Local Georgia Barbecue” as the descriptor for a porn site.
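For readers who missed that era, a metatag is just a line in a page’s header that says whatever its author wants; nothing ties it to the page’s actual content:

```html
<head>
  <!-- the description a metatag-based engine would index, true or not -->
  <meta name="description" content="Local Georgia Barbecue">
</head>
```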

Like metatags, deep links violate what’s known as the “Single Source of Truth,” the practice of constructing information models so that every data element is stored exactly once. Deep linking runs contrary to this principle because it relies on a separate representation of the original source, the content in the app. This is why reputable search engines like Google eschewed the metatag approach, and why the engines that did beat out competitors like Lycos and Excite. Deep links are more an elaborate workaround than an elegant solution.

Why does this matter? While the internet’s linking structure is standardized, deep linking has no standard. Every app can structure deep links however it wants, and any search provider can choose which deep link formats it will accept. So, just as dark matter is dangerous because it opens up the possibility of information suppression by Facebook and Twitter, deep linking gives more authority to Google, Apple and Facebook, which each promote competing deep link solutions.
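For illustration, here is how one and the same article might be addressed under three of the competing conventions. The app (“examplenews”) and the exact URL shapes are hypothetical and simplified, but the fragmentation is real:

```
examplenews://article/42
    a custom URI scheme, structured however the app’s developers like

https://examplenews.com/article/42
    an Apple-style universal link: an ordinary web URL the installed app claims

android-app://com.examplenews/https/examplenews.com/article/42
    the same page in Google’s App Indexing notation
```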

Search engines need to know how to interpret deep links within an app. The corporations that control search engines and similar ecosystems (say, an app suite with search) ultimately determine what to do with different types of deep links. Each type of deep link requires application-specific support, and that support takes active effort to build and maintain. If a search engine decides to optimize only for the deep links of a certain set of apps, those deep links will be better indexed.
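To make that cost concrete, here is a minimal sketch, in Kotlin, of the per-app dispatch a crawler would need. The app names, schemes, and URL shapes are all hypothetical; the point is that every scheme the crawler has not been explicitly taught stays opaque.

```kotlin
import java.net.URI

data class IndexedContent(val appId: String, val contentId: String)

// Each supported app needs its own hand-written parser, because
// every app is free to structure its deep links differently.
val parsers: Map<String, (URI) -> IndexedContent?> = mapOf(
    "examplenews" to { uri ->
        // e.g. examplenews://article/42 -> content id "article/42"
        uri.host?.let { IndexedContent("examplenews", it + uri.path) }
    },
    "exampleshop" to { uri ->
        // e.g. exampleshop://item?id=42 -> content id "42"
        uri.query?.substringAfter("id=")?.let { IndexedContent("exampleshop", it) }
    }
)

fun index(link: String): IndexedContent? {
    val uri = URI(link)
    val parser = parsers[uri.scheme]
        ?: return null // unsupported scheme: whatever is behind it stays dark
    return parser(uri)
}

fun main() {
    println(index("examplenews://article/42")) // indexed
    println(index("itms-apps://app/id42"))     // null: this crawler doesn't "speak" iTunes
}
```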

For instance, a link into an app store (say, iTunes) is one type of deep link. You need iTunes to open that link, as the web browser doesn’t “speak” iTunes. So, in order for anything inside that link to be indexed, the search engine must be able to “speak” iTunes. In other words, without explicit support, the search engine cannot see what’s behind the link.
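As a rough illustration (the store ID is made up and the URL shape simplified), compare a store-scheme link with its web equivalent:

```
itms-apps://itunes.apple.com/app/id000000000
    opens only in the App Store app; a crawler sees an opaque string

https://itunes.apple.com/app/id000000000
    an ordinary web URL the same crawler can actually fetch
```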

What makes the web special is that it’s a common system for distributing content with links. No one controls it, even though, of course, companies like Google have amassed a lot of power by serving as intermediaries. In contrast, competing proprietary systems for deep links will hand more power over to the handful of companies that already provide the infrastructure for mobile computing.

The assumption that the entirety of human knowledge will be eternally accessible through Google is no longer a sure thing. In fact, Google is sparring with Apple over deep linking technology for apps. Since both have a vested interest in furthering their own ecosystem, it’s possible that no industry standard will emerge, and we’ll have competing solutions à la Google Maps and Apple Maps.

In a weird way, this mirrors our current political climate, in which each side clings to its version of the truth. But there’s only one real truth, right? Deep links won’t solve the problem of app discoverability; they’ll just make app discoverability a branding exercise.
