How Google Is Taking Search Outside the Box

The I/O conference had virtual reality, photos and electronic clothes. But as always, there was search.

Published in Backchannel · Jun 5, 2015

Every year Google holds an event for its developers called I/O. As any geek knows, that event gets its name from a computer term referring to input and output. To quote Wikipedia, it means “the communication between an information processing system (such as a computer) and the outside world, possibly a human or another information processing system.”

This is kind of an antiquated distinction, because the boundary between computers and “the outside world” has eroded dramatically, to the point where it’s often hard to identify the line at all. Virtually every announcement during last week’s I/O conference, attended by thousands of rabid engineers who began cheering at the term “app permissions” and never stopped, presented evidence of this. A typical product was something called Weave, a common language for connected devices running Brillo, Google’s operating system for the coming explosion of gadgets known as the Internet of Things.

Speaking of weaving, Google also announced a project where computing would be sewn into our clothes. Other announcements dealt with virtual reality and phones that you can put together yourself, as if constructed from a Lego kit.

Less discussed during the conference were Google’s announcements regarding search. You’d be forgiven for not noticing. In recent years, coverage of the company has focused on its more exotic ventures: Internet balloons, self-driving cars and that thing called Google Glass. But search is still the heart of Google, even though the division that once went by that name is now called “Knowledge.” This reflects an evolution of Google search from something that pointed users to relevant websites to an all-knowing digital oracle that often provides answers to questions instantly (or sooner!) from a vast corpus of information called the Knowledge Graph.

The whole ball of wax is threatened by the fact that the I/O of billions of users is now centered on mobile devices. In the last few years Google has worked hard to reinvent its search activities to reflect this. (You can read about it here.) The recent announcements have pushed this harder.

For instance, Google now says that it has expanded its app indexing program to Apple’s iOS platform. “App indexing” is the practice of Hoovering up the data that lives inside apps, the first step to making that information available through Google search — it’s analogous to crawling the web. Google has been doing this since 2013 for Android apps, essentially creating an index that lives on a simulation of a giant Android phone. And I do mean giant: there are 50 billion deep links indexed so far. (Deep links are those that take you directly to relevant information inside an app, as opposed to dropping you at the front door.)
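
To a developer, that deep link arrives as an ordinary Android VIEW intent carrying a URL, and the app routes it to the right screen. Here is a minimal sketch of that handling, assuming a hypothetical recipe app and the made-up URL pattern http://example.com/recipes/<id>; the class and helper methods are illustrative, not Google’s code.

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;

// Hypothetical Activity that receives deep links such as
// http://example.com/recipes/123. The manifest would declare a matching
// intent filter (ACTION_VIEW, CATEGORY_BROWSABLE, scheme/host for
// example.com) so that an indexed link can open this screen directly.
public class RecipeActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        Intent intent = getIntent();
        Uri data = intent.getData();  // null when launched from the app icon
        if (Intent.ACTION_VIEW.equals(intent.getAction()) && data != null) {
            String recipeId = data.getLastPathSegment();  // e.g. "123"
            showRecipe(recipeId);    // jump straight to the linked content
        } else {
            showRecipeList();        // normal, non-deep-link launch
        }
    }

    private void showRecipe(String id) { /* load and display one recipe */ }
    private void showRecipeList() { /* default screen */ }
}
```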

It’s hard to overestimate how important this effort is. Here is Scott Huffman, a Google VP of engineering:

Google's goal is the same as what Google's goal always is, which is people put up information and functionality wherever they wanna put it and in whatever form it is and we want to help users find it and get to use it wherever it is. So in some sense the way we're thinking about apps isn't really that different from a few years ago when we added maps and pictures and videos and stuff to our page. We said, Look, if people say “pictures of kittens,” like, they want some pictures of kittens and we should get them the pictures. We shouldn't just give them these blue link things. This is really the same thing. Today when someone says on their phone “SF Giants,” sure, maybe they want the web page for SF Giants, but if they have the ESPN app or the Bleacher Report app then there's great things inside those apps for users to find and we want to lead them to those things.

But app indexing is not just Google introducing another corpus into its search engine. The mobile app-sphere is where people live these days — not so much the web. Google must be there. Huffman knows this. “Google should be the premier place in users’ minds for finding apps, discovering great apps and finding the content and the capabilities inside of those apps,” he says.

Scott Huffman

The company faces challenges in doing this. For one thing, it had to figure out how to rank apps in search results. Google has endless experience ranking websites, but it has had to come up with new signals to identify the apps most likely to have the best information. (Apps with lots of downloads and high user ratings are more likely to have better information, and Google ranks the deep links within those apps more highly.)
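
Google hasn’t published those signals or how they are weighted, but the shape of the idea is easy to sketch. The snippet below is a deliberately made-up illustration: fold install counts and store ratings into a single score and sort candidate apps by it. None of the names, numbers or weights come from Google.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Toy illustration of ranking apps by popularity signals.
// The log-damping and the weights are invented for the example;
// Google's real signals and formula are not public.
class AppRankingSketch {

    static double score(long installs, double avgRating) {
        double popularity = Math.log10(installs + 1);   // dampen huge install counts
        return 0.6 * popularity + 0.4 * avgRating;      // arbitrary blend
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("BigCookingApp", score(5_000_000, 4.6));
        scores.put("SmallRecipeApp", score(80_000, 3.9));

        // The higher-scoring app's deep links would be favored in results.
        String ranking = scores.entrySet().stream()
                .sorted(Map.Entry.<String, Double>comparingByValue().reversed())
                .map(e -> String.format("%s (%.2f)", e.getKey(), e.getValue()))
                .collect(Collectors.joining(" > "));
        System.out.println(ranking);
    }
}
```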

Another potential hurdle is getting total buy-in from developers, who must not only allow Google to scrape their content, but actually do some work to make their apps integrate fully into Google’s scheme. This seems like a no-brainer. After all, if the data in your app surfaces in a Google search result, users are more likely to use that app. What’s more, Google has started to surface results from apps that are not installed on a user’s device. For instance, if you are searching for a recipe, Google might show you a deep link to a cooking app you don’t have. In those cases, there’s an opportunity to download the app. “So we actually are kind of promoting your app in line,” says Huffman.

Many developers will tell you that getting people to download their apps is their hardest problem: they pay huge sums to Facebook and Twitter to place ads with download buttons. So it seems developers would ford the world’s fiercest rivers to get into that Google index, which exposes their apps to new users at the very moment those users need them.

“That’s what we think,” says Huffman. Nonetheless, some developers haven’t gotten on board. Huffman says at one point Google made “a target list of a few hundred apps.” It asked every one of those developers to get with the program. “About half have done some work,” he says. (Google later specified “more than half.”) So while Google has indexed apps from cooperating developers like eBay and WalMart, there’s no index of the Amazon app. (When I asked him about iTunes, Huffman replied, “We haven’t looked at that one yet. It would be interesting.”) But plenty of other developers have said yes, and since Google automated the process, thousands of Android app-makers have volunteered — those 50 billion deep links are the evidence.

Now Google is starting the process with iOS by indexing a “handful of apps” (as its blog puts it) that include Eat24, Free Dictionary, Huffington Post, OpenTable, Pinterest, SeatGeek, Slideshare, Tapatalk, Yellow Pages, YouTube and Zillow.

Asked for reasons why every single developer isn’t rushing to cooperate, Huffman says that the program is still new, and it does require work from often overloaded teams. “We show up and say please index your app,” he says. “And they say, Ohh, we never thought of that, please leave the room so we can argue with each other about whether we actually want our users to go to the mobile web…”

Mainly, he thinks they just need time to understand it.

App indexing wasn’t the biggest search news at I/O. That distinction goes to a new product called Google Now On Tap.

It comes from the team behind what is arguably the most interesting service within Google’s Search/Knowledge group: Google Now. This is Google’s opt-in service that analyzes the information in its public indexes, its Knowledge Graph and your own personal data (like Gmail, Calendar and your search history) to deliver unbidden the stuff you need to know. It’s also the embodiment of a mobile search product, because it’s voice-based and makes maximum use of location.

Google Now is the company’s key weapon in making search ubiquitous, not just an activity that you practice when you “go” to Google. Aparna Chennapragada, who has been with Now since its early days, explains that the goal, in part, is to answer “implicit queries.” She says, “We should help you not just when you’re asking, we should proactively work for you all the time.” As of this month, Google Now provides “cards,” which show up just when you need them, without asking, for 110 apps, up from 40 earlier this year. For example, when you get to the airport, you may get a card for an Uber or Lyft ride. Or if you’re in the market for houses — Google will know this, of course — a Zillow card showing open houses may appear on your phone as you cruise your desired new neighborhood on a Sunday. Chennapragada says that eventually Google Now will open up its API and bring thousands of developers into the fold.

Three steps to an answer: (1) tapping and holding the home button while “Blurryface” plays on Spotify and (2) asking Google who’s singing lead will (3) surface the frontman of Twenty One Pilots. Note that at no point is the Google app or a browser involved.

The new twist to Google Now, called Google Now On Tap, builds on that philosophy in a somewhat orthogonal manner. It extends the kind of interaction users have with Google Now — a conversational pitter-patter that allows you to hone your requests without typing — to virtually any app. In other words, you don’t need to be within a Google app, or an app that contains the Google search field, to make a search or ask for more information. Chennapragada provides an example: you are in Spotify listening to Diana Ross, and you wonder when Ms. Ross got married. (Presumably scaling a mountain that wasn’t high enough to prevent such nuptials.) By tapping the home button and holding it down, you can ask, by your voice and not your thumbs, “When did she get married?” Google Now On Tap will understand who “she” is, comb through its knowledge and provide you with the dates of multiple marriages. It does this by analyzing what’s on the screen and then doing a good old Google search. (Worth noting: this approach, to be introduced in the upcoming M version of Android, requires the product to dig into the Android operating system. An iPhone version would require the unlikely cooperation of Apple. Good luck with that.)
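
On the developer side, Android M’s new Assist API is the kind of hook that digging relies on: when the user holds the home button, the foreground app can hand the assistant structured context about what’s on screen, alongside whatever the system reads off the display itself. A minimal sketch follows, assuming a hypothetical music-player screen; the URL and JSON payload are made up, and only the onProvideAssistContent callback and the AssistContent setters are actual Android API.

```java
import android.app.Activity;
import android.app.assist.AssistContent;
import android.net.Uri;

// Hypothetical "now playing" screen handing context to the assistant
// when the user long-presses home. The app, URL and schema.org payload
// are invented for illustration.
public class NowPlayingActivity extends Activity {

    @Override
    public void onProvideAssistContent(AssistContent outContent) {
        super.onProvideAssistContent(outContent);

        // A public web URL equivalent to the current screen.
        outContent.setWebUri(Uri.parse("https://example.com/artists/diana-ross"));

        // Schema.org JSON-LD describing what the user is looking at,
        // so a question like "When did she get married?" can be
        // resolved to Diana Ross.
        outContent.setStructuredData(
                "{ \"@context\": \"https://schema.org\","
              + "  \"@type\": \"MusicRecording\","
              + "  \"name\": \"Ain't No Mountain High Enough\","
              + "  \"byArtist\": { \"@type\": \"MusicGroup\", \"name\": \"Diana Ross\" } }");
    }
}
```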

Chennapragada explains Google Now On Tap.

Chennapragada says that Google Now On Tap works even better when it combines with app indexing — in those cases, she says, “the idea is that we give you some information about the things you’re looking at, but also where you’re likely to go next.” For instance, if you are asking for information about the Ferry Building in San Francisco, Google Now On Tap can take you straight to an app like Foursquare, without you having to painstakingly navigate several screens to find your way to that app and then open it. Another example: if you tapped and held the home button on a Gmail message in which someone suggested a dinner date at a good Mexican place, Google Now On Tap might well deliver an OpenTable reservation at a fine taqueria.

“It’s kind of, like, not the usual Google Now. It’s not even the usual Google Search,” says Chennapragada of Google Now On Tap. “This is saying, look, how do you break Google out of the search box? Why can’t you take Google with you? Because you are spending time in all of these apps, why can’t you just put the power of Google to work for you when you need it and then forget it when you don’t need it?”

Liberated from the search box, Google wants to be your constant companion, ready with a search result whenever you ask and even when you don’t. You know that phone you’re holding in your hand? It’s actually a search field. That is Google’s view of input and output in the second decade of the 21st century. As easy as breathing.

Photos courtesy of Google