How Apple does camera

Rene Ritchie
6 min read · Aug 30, 2016

When you walk into a big box retailer, you find a bunch of television sets all competing for your attention and your money. To do that, the screens are often set in a display mode that crushes the blacks and boosts the saturation beyond anything you’d really appreciate, day in and day out, in your living room. It’s something that any expert, and numerous internet forums, will tell you how to turn off immediately. Luckily, almost all televisions let you do just that.

Cameras—and many phones are as much camera these days as phones—are a little different. Seldom are they all laid out competing for your attention. More often they’re optimized for the display technology on the device and tweaked for what the manufacturer thinks most customers will find most appealing. What that is, however, can vary. Sometimes widely. And it’s not always as easy to roll back as simply turning off a showroom profile.

The iPhone 7 may take a leap forward when it comes to pocket photography, with rumors of a dual-lens assembly, better low-light performance, and an event invitation that hints at significant depth-of-field effects.

We’ll get all the details next Wednesday, but there’s one aspect we can take for granted right now: The type of photography Apple is going to focus on.

Balancing acts

A few years ago I was on a podcast with an optical engineer and smartphone reviewer and we were discussing cameras. What he said stuck with me — that given the wide range of options when it came to everything from pixel count to pixel size, from aperture to image signal processor, Apple was making smart choices and achieving the best balance possible.

That’s proven to be true over time. We’ve seen cameras with too many pixels and too few, with the distortions of angles too wide and image signal processors too aggressive.

Apple, though, has steadfastly stuck with balance.

Rather than going for the highest megapixel number, even if it means shrinking the pixel size; rather than oversaturating and oversharpening, boosting shadows and exposure; rather than going for the hyper-real, Apple is obsessed with focusing on the really real.

As shown on 60 Minutes, Apple has a team of several hundred people working on the camera system. They look at hundreds of thousands of images of every scene type, in every scenario, to ensure everything from the sensor to the processor to the software is making the right decision every step of the way.

The goal is to capture an image as true-to-life as possible, with colors as natural as possible, and to make sure it looks accurate not just on the iPhone’s display, but on your friend’s or family member’s phone, on a computer’s display or a television’s display, and on any prints you may choose to make.

If you want to apply effects, if you want to crush the blacks or boost the sat, Apple believes that should be your choice, not theirs. You should be able to add to it freely and not have to worry about taking it away.
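
That neutral starting point is also what makes adding a look after the fact so straightforward. As a rough illustration (not anything Apple ships), here's a minimal Core Image sketch in Swift that layers a punchier, "showroom" style onto an already-captured image. CIColorControls and its input keys are real Core Image API; the specific values are purely illustrative:

```swift
import UIKit
import CoreImage

// Layer a punchier, "showroom" look onto an already-captured image:
// more saturation, more contrast (deeper blacks), a slight brightness lift.
// The values below are illustrative only.
func stylize(_ input: UIImage) -> UIImage? {
    guard let ciInput = CIImage(image: input),
          let filter = CIFilter(name: "CIColorControls") else { return nil }

    filter.setValue(ciInput, forKey: kCIInputImageKey)
    filter.setValue(1.4,  forKey: kCIInputSaturationKey)  // boost the saturation
    filter.setValue(1.15, forKey: kCIInputContrastKey)    // crush the blacks a little
    filter.setValue(0.02, forKey: kCIInputBrightnessKey)  // slight lift

    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

The point is the direction of travel: it's easy to add contrast and saturation to a neutral capture, and much harder to take them back out of an image that had them baked in.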

Renowned iPhone photographer and co-founder of the acclaimed Camera+ app, Lisa Bettany, shared the following:

The iPhone camera picture quality has dramatically improved over the past nine iterations. It's approaching a good enough quality that photographers can now use an iPhone as a working camera, instead of a tool to simply capture behind-the-scenes action.

The most recent versions of the iPhone have added a lot of catchy things like “deep trench isolation” to make clearer and more vibrant images and they have succeeded. Images are more true to life now.

According to Joshua Ho, senior mobile editor at AnandTech, there's a lot going into that:

Apple is clearly focusing on a balanced camera. It’s one thing to chase numbers but in cameras nothing is free. It’s very easy for manufacturers to “game” things like megapixel counts. I’m not going to name any names here, but the examples I’m thinking of tend to have almost a third of the photo out of focus relative to the center. Likewise, by increasing aperture, you’re going to inevitably increase issues with chromatic aberration and other forms of distortion.

Apple tends to avoid playing these marketing games. Across the board, Apple’s camera consistently produces images with natural post-processing, relatively high color accuracy, and competitive levels of detail. Other manufacturers definitely beat Apple in some places, but tend to fall short in areas like image processing or other areas where attention to detail is critical to good photos and videos, and good user experience.

After effects

Apple is in the enviable position of both making their own chips, the A-series, and being able to deploy those chips in every phone, in every region. That leads to remarkable control of the complete pipeline, and remarkable consistency in results across the board.

Some companies put bigger lenses up front. Some put entire server farms at the end. Apple wants to take what it can capture up front and give you the best result possible before it gets uploaded anywhere, if it gets uploaded anywhere at all.

If you choose to, you can add filters, effects, and tags from a variety of amazing third party apps. You can even add external lenses like wide angle, zoom, fisheye, and more. That lets you get the best of all worlds — a natural, true-to-life image that you can modify if and as desired. Or not. It’s entirely up to you.

When it makes sense and doesn’t introduce disruptive complexity, Apple does bring new features like easy-to-take panoramas, time-lapses, iCloud Photo Library, manual controls, Live Photos, and most recently, offline object recognition through artificial intelligence and machine learning.

As technology matures, and as long as Apple can maintain their balance and focus on naturalness, they increase megapixels or pixel size, widen the aperture … even add a second lens. If any of those things would cause inconsistency or distortions, even if the numbers look great on a spec comparison, Apple waits.

Shots on shots

No one camera, like no one phone, will appeal to everyone, and it’s important that we have many different companies trying many different approaches. That’s what gives customers options and pushes everyone to do better.

There are things I’d love to see Apple add to the iPhone camera. Silly as it sounds, I would love the ability to automagically save Live Photos, Bursts, or slide shows as videos or animated GIFs. Because, fun.
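
For what it's worth, a third-party app can already get part of the way there with the Photos framework, which exposes the paired movie behind a Live Photo as an asset resource. A minimal sketch, assuming you already have a PHAsset for a Live Photo and a writable destination URL (a separate step would handle GIF encoding):

```swift
import Photos

// Export the paired movie behind a Live Photo so it can be shared as a
// plain video (or handed to a GIF encoder). Assumes `asset` is a PHAsset
// whose media subtypes include .photoLive and that Photos access has been
// granted; error handling is kept minimal for the sketch.
func exportPairedVideo(from asset: PHAsset, to destination: URL,
                       completion: @escaping (Error?) -> Void) {
    let resources = PHAssetResource.assetResources(for: asset)
    guard let video = resources.first(where: { $0.type == .pairedVideo }) else {
        completion(nil) // no paired video found; a real app would surface this
        return
    }

    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true // fetch from iCloud Photo Library if needed

    PHAssetResourceManager.default().writeData(for: video,
                                               toFile: destination,
                                               options: options,
                                               completionHandler: completion)
}
```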

Lisa would like to see a way to further reduce or eliminate the blocky pixelation she still finds in images, especially in skin tones. Also:

The one major hurdle of the iPhone camera has always been the fixed aperture which hinders our creative control. I’d be thrilled to see adjustable aperture in the next iPhone version.

Josh, clearly trolling, thinks Apple could elect to just go with a bigger camera hump. More likely, though:

Apple is pretty close to the limit of what’s possible with a single conventional Bayer CMOS camera. To really push the envelope I’d be interested in seeing systems with dual cameras, new color filter arrays, and other emerging technologies.

One area where Apple is already doing exemplary work is accessibility. Where some might simply assume the blind wouldn’t need a camera, Apple realized everyone has family and friends they may want to share photos with. So, Apple made the Camera and Photos apps accessible to the blind as well. I’m curious how much further Apple can take that.

Keeping focus

At next week’s special event, Apple will announce its best iPhone camera ever. You can say that every year, but rumor has it this year will be something extra special.

Regardless, the end result is still going to be what matters most — that we’re able to capture the memories and moments we want to capture, accurately and easily, and in a way that we can save and share them quickly, whenever and with whomever we want.

As long as Apple stays true to that goal, whatever else the company may add, the camera won’t lose focus.
