Would you eat that? The Economist’s first foray into augmented reality

Using tabletop AR to explore the future of food

Tom Standage
Jul 5, 2018 · 6 min read

What better way to celebrate the 175th birthday of a Victorian-era newspaper than by producing its first augmented-reality (AR) feature? That was one of the thoughts at the back of our minds as we embarked on an AR project earlier this year. Having seen other news organisations dabble with the technology, we wanted to take a closer look at it ourselves. Our New Product Development unit in New York duly teamed up with two digital designers with experience of AR, Michelle Hessel and Nitzan Bartov. Their mission: to evaluate the opportunities in AR, figure out how we might use it in our journalism and develop some prototypes.

What the future might look like, according to Magic Leap

Unlike virtual reality (VR), which transports you to another world, AR makes virtual, computer-generated objects appear in the real world. The best known example is Pokémon Go, a game in which players pursue virtual monsters that are drawn over a smartphone’s-eye view of the world. Smartphone-based AR is generally considered a mere stepping-stone towards a world in which smart glasses, like those being developed by Magic Leap, Microsoft and others, can seamlessly paint labels, data and objects into the wearer’s field of vision. AR headsets are still in their infancy, however. They are clunky and expensive, much like the giant mobile phones of the 1980s. So for the time being the focus is on smartphone-based AR. Publishers including the New York Times, Quartz and the BBC have already produced AR features: the first two added AR into their existing apps, and the BBC created an impressive standalone AR app to accompany its “Civilisations” series.

Having settled on smartphone-based AR, we began to explore possible topics and ideas. Being a global publisher, we wanted to create something that would work anywhere, which ruled out the possibility of making a "site specific" AR piece (such as a visualisation of what London or New York would look like if sea levels were to rise). We wondered about building an AR app that could identify satellites as they cross the sky, or make space junk visible. Or could we build an AR experience that shows what drone-filled skies might look like in future? Another area we thought about was data visualisation, one of The Economist's traditional strengths. Could we use AR to present data around a particular topic? Or should we create a board game, with the board printed in the newspaper, but with virtual counters and other objects projected onto it?

After an initial exploration we ended up focusing on three topics: the rise of space junk, and what to do about it; what the Earth would be like if the Moon had never existed; and the future of food. Our aim was to create some prototypes in the hope of launching something in July alongside our annual supplement that imagines future scenarios, “The World If”.


For the first topic, we started by creating a detailed model of the Earth that could be made to hover just above the floor when viewed through a smartphone. We used this as the basis for a 3D infographic about space junk, showing the steady build-up of orbital debris over the past 60 years with a timeline slider. By walking up to hotspots in the debris cloud, the user could trigger animations and commentary explaining various ways the junk might be cleaned up. For the "Earth with no Moon" idea, which is one of the ideas explored in "The World If" this year, we created a model of the Moon with hotspots that, when triggered, display information or play music. Had the Earth's lunar companion not existed, for example, Frank Sinatra would never have sung "Fly Me to the Moon". (Triggering actions simply by moving around, or by proximity to hotspots, rather than explicitly tapping things, is one of the emerging user-interface conventions for AR.)
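For readers curious how proximity triggering works under the hood, here is a minimal sketch of the idea in Python. The names (`Hotspot`, `update_hotspots`) and the frame-by-frame structure are our own illustrative assumptions, not code from the actual project: each frame, the AR runtime supplies the camera's position in world space, and each hotspot fires once when the user first steps inside its trigger radius.

```python
import math
from dataclasses import dataclass

@dataclass
class Hotspot:
    name: str
    position: tuple   # (x, y, z) in metres, world space
    radius: float     # trigger distance in metres
    inside: bool = False  # was the camera inside last frame?

def update_hotspots(camera_pos, hotspots):
    """Called once per frame with the camera's world position.
    Returns the hotspots the user has just walked into, so the
    app can start their animation or audio commentary."""
    fired = []
    for h in hotspots:
        now_inside = math.dist(camera_pos, h.position) <= h.radius
        if now_inside and not h.inside:
            fired.append(h)       # newly entered: fire once
        h.inside = now_inside     # remember for the next frame
    return fired
```

Comparing "inside now" with "inside last frame" is what makes the trigger fire only on entry, so lingering near a debris cluster does not restart its commentary every frame.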


At the same time, we began to explore the idea of food in augmented reality. One style of AR is known as “tabletop AR” because it involves projecting a scene, such as the battlefield for a video game, onto a table. By moving around the table, players can see the action from different angles, leaning in to get a closer look. But we wondered whether tabletop AR could be taken more literally: what better thing to project onto a table than the unusual foods we might find ourselves eating in 2050 in a warmer, more crowded world? The future of food is a very Economist topic: it’s global, it connects intensely personal choices with wider political, economic and environmental concerns, and it’s forward-looking. And given that we were founded in 1843 to campaign against the protectionist Corn Laws, it’s also in our wheelhouse.

As we started to investigate the feasibility of scanning futuristic foods in 3D, we made contact with Kabaq, a startup based in New York that specialises in combining food with AR to create detailed 3D models. (Its app, Kabaq AR Food, lets you preview foods from the menus of various New York eateries before ordering.) Its founders turned out to be Economist fans, and were keen to collaborate. They also suggested a way we could deploy our AR foods without requiring readers to download a special app: in the form of Snapchat “lenses”. Given the cost and complexity of modifying our existing apps, or creating and maintaining a new one, this approach had obvious appeal. And Snapchat lenses turn out to be capable of doing more than just plonking a 3D model into the world: they also allow for audio and limited interaction. That meant we could add audio commentary and pop-up infographics for each food, explaining its pros and cons.

By the beginning of April we had decided what we wanted to build: a handful of Snapchat lenses depicting various futuristic foods (edible insects, artificial meat, algae, meal-replacement foods and 3D-printed food) and showing information about their pros and cons. All this would allow readers to examine the foods up close and decide which of them they find most palatable. Together with Kabaq, we created two lenses during April (artificial meat and edible insects) and showed them at the Augmented World Expo, a big AR/VR fair in San Francisco. AR can’t capture taste or smell, so we gave away insect-protein bars for people interested in actually eating them.

Yum, insects!

We’ve now completed all five lenses, which meant working with Kabaq to get hold of the various foods, scan them in 3D and build the associated infographics together with our data team. Natural Machines, maker of the Foodini 3D printer, agreed to print something for one of our photo-shoots.

You can see the results, and find out more about the various foodstuffs, on this page. (If you’re reading this on a smartphone and have the Snapchat app, tap here to see the “Edible Insects” lens.) Check out all five, and share an image with the hashtag #econfood once you’ve decided which of these options for the future of food you find most appetising. Tuck in!

Severe Contest

Insights from The Economist’s digital playbook
