The Fire Phone at the farmers market

Can it see kale?


The first half of this tab is going to sound somewhat negative: the familiar stance of a humanistic critic unimpressed by the latest whiz-bang technology because it falls short in some essential way. The tab will, however, quickly resolve into geeky enthusiasm. Please don’t miss the enthusiasm.

The most provocative feature in Amazon’s new Fire Phone, by a wide margin, is something the company calls Firefly. The premise is potent: Point your phone at something, anything, and the phone will recognize it.

https://twitter.com/fmanjoo/status/479325524175892480

Among the “items” used as examples in the Fire Phone’s unveiling were a book, a song, a TV show, and a jar of Nutella.

This is, on one hand, quite magical, and on the other hand, totally depressing. Navneet Alang crystallizes the humanistic response:

https://twitter.com/navalang/status/479326894391693313

He’s on to something. With the exception of a few paintings, all of Amazon’s demo “items” were commercial products: things with ISBNs, bar codes, and/or spectral signatures. Things with price tags.

We did not see the Fire Phone recognize a eucalyptus tree.

CC-licensed photo from John Morgan: https://www.flickr.com/photos/aidanmorgan/2073661175

There is reason to suspect the Fire Phone cannot identify a goldfinch.

CC-licensed photo from nutmeg66: https://www.flickr.com/photos/rachel_s/3226596712

And I do not think the Fire Phone can tell me which of these “items” is kale.

This last one is the most troubling, because a system that greets a bar-coded bag of frozen vegetables like an old friend but draws a blank on a basket of fresh greens at the farmers market isn’t just a technical artifact. It’s a political one.

But here’s the thing: The kale is coming.

There’s an iPhone app called Deep Belief, a tech demo from programmer Pete Warden. It’s free. Upon launching the app, you’re greeted with a message inviting you to point the camera at “the thing you want to recognize.”

The abstraction delights me: “the thing you want to recognize.” It could be a chair; it could be a person; it could be the Eiffel Tower. In this case, let’s address my criticism of Firefly. Let’s teach this app to recognize kale.

Deep Belief guides you through the process. First, you spend about a minute pointing the camera at some kale, sort of orbiting the greens, getting in close, pulling farther away. When the meter at the top fills all the way…

…you begin the next phase. You teach the app what kale isn’t.

This is the fun part of the process, because you just pan the phone wildly, trying to capture as much of the non-kale world as possible. Chairs, people, Eiffel Towers, anything, everything. For just a moment, you perceive the entire universe as composed of kale and non-kale. Is that the sky? No, it is non-kale. Are those the stars? No—they are non-kale.

Whoa.
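For the technically curious: the standard recipe for this kind of fast, two-phase training (and, as best I can tell, roughly what Deep Belief does) is to run every camera frame through a big pretrained convolutional network, keep the feature vector it produces, and fit a tiny classifier on top that separates the kale vectors from the non-kale vectors. Here’s a rough Python sketch of that recipe, nothing more; the random vectors stand in for real CNN features, and all the numbers are invented:

```python
# A conceptual sketch, not Deep Belief's actual code: a frozen,
# pretrained network turns each frame into a feature vector, and a
# small classifier learns "kale" vs. "everything else" on top.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
FEATURE_DIM = 512  # stand-in for the size of a CNN's penultimate layer

# Phase one: a minute of orbiting the kale -> positive feature vectors.
kale_features = rng.normal(loc=1.0, size=(120, FEATURE_DIM))

# Phase two: panning wildly at the non-kale universe -> negatives.
world_features = rng.normal(loc=0.0, size=(300, FEATURE_DIM))

X = np.vstack([kale_features, world_features])
y = np.array([1] * len(kale_features) + [0] * len(world_features))

# A small linear classifier on top of frozen features fits in seconds,
# which is why two minutes of pointing the camera is enough.
classifier = LogisticRegression(max_iter=1000).fit(X, y)
```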

After you’re done with all that, Deep Belief goes into recognition mode, and you can wave the phone around like a Geiger counter, watching the app’s recognition meter rise and fall. Pointing the camera at the kale again, you get a full meter…

…and along with it, a bright pinging sound, like sonar. Ping! That’s the thing! I’m sure of it! Ping! Ping!

When you point the camera at a chair or a person or the Eiffel Tower, the recognition meter drops to zero. More impressively, when you point it at something that looks preeetty similar to kale, the app knows the difference.

The meter drops. No ping.
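Recognition mode, then, is just that little classifier running over and over, once per frame, with a threshold deciding when to ping. Continuing the sketch above (the threshold and the fake frames are, again, invented):

```python
# Continues the training sketch above: classifier, rng, and
# FEATURE_DIM are defined there. The threshold is a made-up cutoff.
PING_THRESHOLD = 0.9

def recognition_meter(features: np.ndarray) -> float:
    """Probability, between 0 and 1, that this frame is The Thing."""
    return classifier.predict_proba(features.reshape(1, -1))[0, 1]

# Wave the phone around: kale-ish frames ping, chair-ish frames don't.
for label, features in [("kale?", rng.normal(loc=1.0, size=FEATURE_DIM)),
                        ("chair?", rng.normal(loc=0.0, size=FEATURE_DIM))]:
    score = recognition_meter(features)
    print(label, "Ping!" if score > PING_THRESHOLD else f"meter at {score:.2f}")
```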

I should say that Deep Belief is far from perfect, still easily fooled. For example, the app is clearly on the fence about this broccoli.

But remember, this is just a tech demo, and it’s running on a phone, and it received two minutes of training. Remember, these techniques are improving all the time. I think it’s safe to say the broccoli will not be a challenge for long.

If Amazon’s Fire Phone could tell kale from Swiss chard, if it could recognize trees and birds, I think its polarity would flip entirely, and it would become a powerful ally of humanistic values. As it stands, Firefly adds itself to the forces expanding the commercial sphere, encroaching on public space, insisting that anything interesting must have a price tag. But of course, that’s Amazon: They’re in The Goldfinch detection business, not the goldfinch detection business.

If we ever do get a Firefly for all the things without price tags, we’ll probably get it from Google, a company that’s already working hard on computer vision optimized for public space. It’s lovely to imagine one of Google’s self-driving cars roaming around, looking everywhere at once, diligently noting street signs and stop lights… and noting also the trees standing alongside those streets and the birds perched alongside those lights.

Lovely, but not likely.

Maybe the National Park Service needs to get good at this.

At this point, the really deeply humanistic critics are thinking: “Give me a break. You need an app for this? Buy a bird book. Learn the names of trees.” Okay, fine. But, you know what? I have passed so much flora and fauna in my journeys around this fecund neighborhood of mine and wondered: What is that? If I had a humanistic Firefly to tell me, I’d know their names by now.

https://twitter.com/vitor_io/status/479392860756639744

Yep. That would be pretty great.

Here’s Pete Warden’s Deep Belief demo for iPhone. It is weird and powerful and flawed and fun and absolutely worth a few minutes of your day. Point it at something without a price tag.