Photo by Bettina Crawford

What if your apple could talk?

Danny DeRuntz
IDEO CoLab Ventures
Oct 3, 2016


When you walk into a grocery store and go to select an apple — a seemingly simple, everyday task — consider what you know. You see a sign that says McIntosh, a price of $1.49/lb, and a label showing its brand. You feel firm flesh, see it’s a vibrant red, and smell that it’s sweet, but probably not overripe. And yet, there’s far more that you don’t know.

You don’t know if it’s actually a McIntosh. You don’t know how long it will last, or how long it’s been stored. You don’t know how much fiber or sugar lies within it. You don’t know whether it still has any of the beneficial nutrients it had when it was picked from the tree. You don’t know how far it traveled, or whether it was grown on a farm that used pesticides or organic fertilizers. In reality, you take most of the information you might actually want for granted; we all do.

At IDEO CoLab, we’re interested in fundamental human challenges like understanding our food, and in how emerging technologies can help solve them. Recently, we’ve been exploring virtual and augmented reality, and we’ve seen that these technologies give rise to an entirely new set of senses we didn’t have before. When it comes to food, the stuff that fuels our bodies, this has powerful implications. These technologies allow data to be presented when it’s relevant and when it’s requested: data that was tracked long ago as well as data that’s captured and shared in real time. There’s already a tremendous amount of data created throughout the supply chain of most products we consume, including food, and these datasets are only growing as more smart sensors come online. Very little of this data makes its way to consumers, though.

Pickl started as a concept in the Food + Future CoLab earlier this year, demonstrating a shopping cart that could highlight only the foods in the store an individual would want, based on a personal profile. Over the last month, we’ve dug in deeper to show what augmented reality could do for a simple, specific task: shopping for an apple.
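
To make that idea concrete, here is a minimal, purely hypothetical sketch in TypeScript of the kind of matching logic a profile-aware cart implies. The type names, fields, and function below are our own illustrative assumptions, not part of the Pickl prototype.

```typescript
// Hypothetical sketch only: these type names and fields are illustrative, not Pickl's.
interface PersonalProfile {
  avoidAllergens: string[];      // e.g. ["peanut", "gluten"]
  maxSugarPerServingG?: number;  // optional nutritional ceiling, in grams
  preferOrganic?: boolean;
}

interface StoreItem {
  name: string;
  allergens: string[];
  sugarPerServingG: number;
  organic: boolean;
}

// Return only the items a given shopper's profile suggests should be highlighted.
function highlightFor(profile: PersonalProfile, items: StoreItem[]): StoreItem[] {
  return items.filter((item) => {
    if (item.allergens.some((a) => profile.avoidAllergens.includes(a))) return false;
    if (profile.maxSugarPerServingG !== undefined &&
        item.sugarPerServingG > profile.maxSugarPerServingG) return false;
    if (profile.preferOrganic && !item.organic) return false;
    return true;
  });
}
```

The point is simply that a personal profile plus per-item data is enough to decide what gets highlighted; the hard part is creating and surfacing that data at all.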

As you put on the Microsoft HoloLens headset, our prototype shows an apple deconstructed into its components: its nutrient content, its geographic journey from tree to store, the energy used to produce it, and the dollars and cents changing hands along the way. It’s a vast collection of information that surfaces new data points individuals might come to value.

Shot from the HoloLens
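
As a rough illustration of what “deconstructed into its components” could look like as data, here is a hypothetical record for a single apple. Every field name, unit, and value below is an assumption made for illustration; it is not the prototype’s actual schema or data.

```typescript
// Hypothetical record only: field names, units, and values are illustrative assumptions.
interface AppleRecord {
  variety: string;                                                   // e.g. "McIntosh"
  nutrients: { fiberG: number; sugarG: number; vitaminCMg: number };
  journey: { origin: string; stops: string[]; distanceKm: number };
  productionEnergyKWh: number;                                       // energy to grow, store, and ship
  costBreakdownUSD: { farm: number; distribution: number; retail: number };
  harvestedOn: string;                                               // ISO date, to estimate freshness
}

// Example values a headset could render as layers around the fruit (placeholders).
const exampleApple: AppleRecord = {
  variety: "McIntosh",
  nutrients: { fiberG: 4.4, sugarG: 19, vitaminCMg: 8.4 },           // roughly typical for a medium apple
  journey: {
    origin: "Hudson Valley, NY",
    stops: ["Packing house", "Distribution center", "Store"],
    distanceKm: 300,
  },
  productionEnergyKWh: 1.0,
  costBreakdownUSD: { farm: 0.35, distribution: 0.6, retail: 0.54 },
  harvestedOn: "2016-09-12",
};
```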

The reality is that the information available about our food today is incredibly sparse. It’s the information we never thought to ask for that we may find most valuable. Some people will want facts about origin; some will want to know vitamin levels; others will want to know which apples in the bin are best for baking. These queries simply haven’t been possible in the past.

There’s an enormous amount of data that needs to be created to make this possible, and that’s no trivial task. And while we’re having tremendous fun with the HoloLens, it’s still not realistic to imagine shoppers wearing these headsets in stores. But a critical part of this process is first finding out what people value most. In the spirit of building to learn, Pickl is a starting point. It lets people experience data they’ve never had before and asks “what if?” through a real, tangible experience. We’re excited to learn what apples really are, and how new technologies like augmented reality can bring us closer to truly understanding our food.

Clips illustrating the amount of data becoming accessible through technology like Illuminate, and how such data will power a distilled reality that subtracts irrelevant choices

We’ll be sharing the prototype described above at SXSL today alongside our friends from Target and the MIT Media Lab as part of our ongoing Food + Future collaboration to explore how emerging technology can help us understand our food better.

Co-written with Matt Weiss
