Stress-testing iPhone 7 Plus’ Portrait Mode

As soon as the 7 Plus was announced with its dual lenses and depth effect (Portrait Mode), I knew I was going to have to upgrade from my normal-sized iPhone to the big guy. I love taking photos, and I own a dSLR, but I, like many others, find myself taking all of my photos on my phone. “The best camera is the one you have with you,” etc. etc.

A photo of my friend’s dog, Stan, before I left. (Notice the one blur error in the upper-left ear area.)

Luckily, a week or so after the 7 was released, I had a trip to Japan planned. It was most definitely a great place to test out what the phone had to offer. There are many more photos where these came from, but I will say I edit all of them using VSCO.

And now that iOS 10.1 is public, I thought I’d showcase a bunch of photos I took using Portrait Mode and point out its pros and cons.

Human (and Humanlike) Subjects

This is what Apple advertises and, sure enough, what it does best. Faces are crisp, backgrounds are blurred: the whole dSLR-like effect. The one thing it can snag on occasionally is hair wisps. That’s pretty understandable: anybody who’s tried to isolate a person in Photoshop or similar knows that hair is the biggest pain in the ass, and Portrait Mode struggles with it the same way. It either skips the wisps (so the elements behind the hair stay sharp too, no matter the depth) or overdoes it (and blurs the wisps away altogether). Sometimes it works perfectly though:

This one worked pretty well: look at the foreground blur on the snack, the crispness of my dad’s face, then the background (and his hair wisps too!). This looks like a dSLR shot to me. One small thing: the left side of his face has a small blur error.
A little blurry, but that’s my fault: clearly I was moving a bit, so this is a shutter-speed issue, not a depth one. But you can see some errors around the edges of the umbrella.
This one’s missing a little bit of the hair wisps into the blur, but generally doing a solid job. That’s Kyoto in the background there, and it’s blurred to the max!
Some minor errors around the hair wisps again and the straight strap of the purse, but generally okay.

Non-human Subjects

More hit than miss: I’m continually impressed by how the phone decides what to keep in focus and what to blur in Portrait Mode.

This has a lot going on (and no faces), yet the depth effect seems to have known what to focus on. Solid!
No humans in this shot! Still worked pretty well.
A collection of vending machine drinks. Some errors here and there on this one.

Handling Different Depths

I’m sure there’s a proper term for what I’m calling “stepping” (different depths of focus clearly defined, with more or less bokeh depending on how far each element sits from the focal plane), but the 7 Plus deals with it surprisingly well. The photos below show it:

This one really shows off the stepping of the blur effect: foreground completely crisp, background pup a little blurred, background setting completely blurred. Impressive!
It’s melons (blurrier and blurrier) all the way down.
Another stepping example. Juliana: perfectly crisp. The woman behind her: pretty out of focus. The guys behind that woman: even more blurred, and more still toward the back. If you look closely, though, her hair wisps leave the back of the train car crisp.
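Out of curiosity, here’s how a “stepped” depth-of-field effect like this could be simulated in principle. This is a minimal sketch with made-up numbers, not Apple’s actual pipeline: the idea is just that each depth layer gets a blur radius proportional to its distance from the focal plane.

```python
import numpy as np

def blur_radius(depth, focal_depth, max_radius=8, scale=2.0):
    """Blur radius grows with distance from the focal plane (capped)."""
    r = scale * abs(depth - focal_depth)
    return min(int(round(r)), max_radius)

def box_blur(img, radius):
    """Naive box blur by averaging a (2r+1 x 2r+1) window around each pixel."""
    if radius == 0:
        return img.copy()
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def render_stepped(img, depth_map, focal_depth):
    """'Stepping': quantize the scene into depth layers, blur each by its own radius."""
    out = np.zeros_like(img, dtype=float)
    for d in np.unique(depth_map):
        mask = depth_map == d
        out[mask] = box_blur(img, blur_radius(d, focal_depth))[mask]
    return out
```

So a pup one step behind the subject gets a small radius, and the far background gets the maximum: exactly the layered look in the shots above.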


Where It Falls Short

While a lot is super good, there are a couple of areas where Portrait Mode just ain’t gonna cut it. Complicated foreground and background combinations seemingly overwhelm it, and the blur edges get confused throughout the photo.

Portrait Mode doesn’t even begin to work successfully with translucent or shiny objects. That’s pretty understandable given the hardware: reflections and refractions confuse the two lenses’ views of a surface, so detecting depth on objects like these is genuinely hard. I’m not sure Apple will ever get past this one unless its machine learning becomes near-perfect.

This fruit tree’s complicated foreground elements mixed with the overhead train wiring don’t blend well here.
Remember the picture of the monkey I showed you earlier? Well, here it is with a rope. So close to being great, but look at that disappearing rope. And the rope’s blur errors above. But the colors!
The magic disappearing coffee cup.

In Conclusion

Portrait Mode is definitely not perfect! Narrow subjects in the foreground (think straws, tips of dogs’ ears, and individual hairs) are Portrait Mode’s greatest difficulty. Here’s my guess as to why this is:

First off, I’m assuming that the iPhone detects depth by stereo image analysis: comparing the slightly offset views from the two lenses and using the difference to estimate distance. Not unlike how VR works, but in reverse, or how Pathfinder’s cameras work.

Then, with this in mind, thin objects near the camera have the same effect as bringing a finger very close to your eyes: you see double and can’t really make it out.

Just a poorly-educated guess.
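For what it’s worth, that guess can be put in rough numbers. In stereo vision, depth = focal length × baseline / disparity, and a matcher can only search a limited pixel range for each point’s counterpart. The figures below (focal length in pixels, lens baseline, search window) are placeholders I made up, not the 7 Plus’s real specs, but they show why very close subjects fall apart:

```python
def disparity_from_depth(depth_mm, focal_px=2600.0, baseline_mm=10.0):
    """Classic stereo relation: disparity = focal_length * baseline / depth.
    focal_px and baseline_mm are invented stand-ins, not real 7 Plus specs."""
    return focal_px * baseline_mm / depth_mm

def depth_from_disparity(disparity_px, focal_px=2600.0, baseline_mm=10.0):
    """Inverse relation: depth (mm) from the pixel shift between the two views."""
    return focal_px * baseline_mm / disparity_px

# A stereo matcher only searches a limited window for each pixel's twin;
# subjects so close that they shift farther than this effectively "see double".
MAX_SEARCH_PX = 128

def depth_resolvable(depth_mm):
    """True if the subject's disparity fits inside the matcher's search window."""
    return disparity_from_depth(depth_mm) <= MAX_SEARCH_PX
```

With these made-up numbers, a subject at 2 m shifts by 2600 × 10 / 2000 = 13 px between the lenses and is easy to match, while a straw at 10 cm shifts by 260 px, blowing past the search window. Just like the finger in front of your eyes, the two views no longer line up.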

But, on the flip side, the effect is applied quickly, and seems to focus on exactly the plane I want it at. Plus, its “stepping” between objects in the foreground and background is really impressive.

People have asked me what camera I used to take these photos, which I’d call a good sign. I like how they look, and this is a camera I can keep in my pocket all day as I walk all over a beautiful country. That’s a win for me.

(See a bunch of other photos I took using the iPhone 7 Plus on Exposure)

Note: These were taken using the iOS 10.1 beta’s version of Portrait Mode. I haven’t messed around with the public version yet (though the feature is still considered a beta).

You can also follow me on Twitter.