This morning, I posted a review of the iPhone 7 Plus, which focused on the camera. Later in the day, I had a chance to install the public beta of iOS 10.1 and play around with the new Portrait Mode / Depth Effect, which was demoed at the iPhone launch event but isn’t available in the version of iOS 10 that ships on the new phones out of the box. This isn’t an in-depth review, but I’ll share some of my first pictures using the new mode. I’ll come back for some conclusions at the end, but in short it’s pretty impressive and I’m looking forward to using this a lot more.
I installed the beta late in the day, on a gloomy day, and so the lighting wasn’t great. In addition, my main subject for these portraits was my eleven-month-old son, who doesn’t tend to hold still for very long. As with my earlier review, I’ll note again that this feature uses the 56mm lens, and longer lenses always do worse in low light than shorter lenses, so these problems are more acute here than they would be if the feature used the 28mm lens exclusively.
With those as caveats, here are some of my favorite shots from this evening.
As you can see, the feature works as advertised, with the focus automatically locking onto a face when one is present, and doing a good job blurring the background. You will see some motion blur in these shots for the reasons I mentioned above, but otherwise they’re reminiscent of the kind of thing you’d get out of a DSLR.
Because I could only prevail on my son to stay in well-lit spots for so long, I decided to try some stationary objects as well. This also allows for playing around with the depth of field a little more, including focusing on objects in the middle distance. Here are some more sample shots. (We had two birthdays in our family today, so you’ll see more balloons and other decorations than we’d usually have in our house here.)
Those first four are all pretty straightforward pictures, with the focus on the closest object and increasing blurriness behind. By default, Portrait Mode seems to focus on the closest object if there’s no face in the frame, but you can select alternative focal points manually as in the regular camera.
That allows you to focus on objects in the middle distance as in these three shots below.
You see the effect best in the third picture, but it’s working in the other two as well. There might be a few quirks here and there in the relative blurring (and the noise from the low light doesn’t help matters), but in general it’s effective in mimicking what you’d get from a shallow depth of field at low f-stop numbers (larger apertures).
The one time I did see some errors in the software processing was in taking pictures of this balloon. It’s a tough object to focus on because it’s rounded and reflective, and you can see a little weirdness around the edges of the balloon in both the images below.
I’ll keep this brief. I’ve only played with this feature for a couple of hours, in low light, with a somewhat uncooperative subject, and while the feature is officially in beta. However, the promise is already obvious. This feature does what it’s supposed to most of the time, and when it doesn’t it’s not terrible either. No, these pictures aren’t exactly what you’d get out of a DSLR — that was always going to be impossible with such small lenses. But it’s remarkable how reminiscent of DSLR-style pictures these images are. And that’s all about the combined power of the two lenses and some very clever software. I’m really curious to see not just how this feature evolves but also what else Apple’s engineers can cook up along the same lines.