Apple’s dual-camera phone will change photography as we know it, and here’s why:
Most of the chatter around Apple’s September 7th iPhone event has been about the removal of the headphone jack. What I’m focused on (no pun intended) is the dual camera system which will change photography forever. If this sounds like another incremental improvement that won’t change much, bear with me here.
Up until now, the main difference between a camera with a proper lens (like a digital SLR, rangefinder, or compact camera) and the tiny flat ones you find in any smartphone has been a property of a photo called depth of field (DOF from here on). An image with a shallow DOF is a more dramatic image. On a conventional camera, the DOF effect is created with the aperture. The aperture controls the amount of light that comes through the lens to the sensor. A secondary effect of opening and closing the aperture is the size of the focus field you create: open the aperture to let more light in and you shorten the focus field; close the aperture to let less light in and you lengthen it. Below is an example of an image taken with a wide aperture (low f-number) setting. The subject of the photo is a microphone, and the drums in the background are out of focus. Our brains are programmed to look to the focused object to understand what the subject of a photo is. Generally, a glass lens on a camera body that sits at some distance from the sensor can create this DOF because you focus on a specific point, and the physics of the glass blurs objects in front of and behind that point. A photo like this one is impossible to take with a camera phone with a single flat lens:
If your lens is flat and sits right in front of the sensor, as a camera phone’s does, you don’t have distance information, since the focus is effectively fixed at infinity (the technical details of photography are beyond the scope of this Medium post; if you’re interested in more, watch this video). This next photo is more like what a camera phone would produce. Most of the image is in focus and there is little depth or drama to it.
A flat lens right in front of a sensor doesn’t optically produce depth of field. I’m using this image in an illustrative way. I actually have no idea what camera was used to shoot it, but I hope you get my point: this image isn’t as dramatic as the first. Today’s camera phones don’t have the ability to measure distance, so they can’t digitally recreate the DOF drama that a conventional lens produces on its own.
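To put rough numbers on the aperture effect described above, here is a back-of-the-envelope sketch using the standard thin-lens approximation for total depth of field, DOF ≈ 2·N·c·s²/f² (valid when the subject is much closer than the hyperfocal distance). The lens and distance numbers below are illustrative, not measurements of any particular camera.

```python
def approx_dof_mm(f_number, focal_length_mm, subject_dist_mm, coc_mm=0.03):
    """Approximate total depth of field in millimeters.

    Thin-lens approximation: DOF ~ 2 * N * c * s^2 / f^2, where N is the
    f-number, c the circle of confusion, s the subject distance, and f
    the focal length. Only valid well inside the hyperfocal distance.
    """
    return 2 * f_number * coc_mm * subject_dist_mm**2 / focal_length_mm**2

# A 50 mm lens focused at 2 m: stopping down from f/2 to f/8
# quadruples the slice of the scene that stays in focus.
print(approx_dof_mm(8, 50, 2000))  # 768.0 mm in focus (deep DOF)
print(approx_dof_mm(2, 50, 2000))  # 192.0 mm in focus (shallow, dramatic DOF)
```

The takeaway matches the prose: the wider the aperture (the lower the f-number), the thinner the in-focus zone, which is exactly the dramatic look a flat phone lens can’t create optically.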
There is no correct way to shoot a photo. Sometimes you don’t need DOF. But not being able to produce DOF has been a huge issue with camera phones — and it’s the reason I still carry around my Leica Rangefinder from time to time! It’s also the reason why professional photographers often need some sort of lens to do their work. A camera phone can’t do something like this shot with my Leica:
Instead your shots will always look flat like this shot with my iPhone:
Until tomorrow that is.
Just like our two eyes can be used to detect depth, two lenses can do the same thing. By using the disparity of pixels between two lenses, the camera processor can figure out how far away parts of the image are. There are papers out there that go into depth about how this method works if you are interested.
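The geometry behind that disparity trick is simple enough to sketch. Under a pinhole stereo model, a point’s distance Z relates to its pixel disparity d between the two lenses by Z = f·B/d, where f is the focal length in pixels and B is the baseline between the lenses. The focal length and baseline values below are hypothetical, not Apple’s actual camera specs.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_mm):
    """Distance to a point from the pixel disparity between two lenses.

    Pinhole stereo model: Z = f * B / d. Closer objects shift more
    between the two images (larger disparity), so depth falls as
    disparity grows.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px

# Hypothetical numbers: 2800 px focal length, 10 mm lens separation.
# A point that shifts 40 px between the two images is 700 mm away.
print(depth_from_disparity(40, 2800, 10))  # 700.0 (mm)
```

Do this for every matched pixel and you get a depth map of the whole scene, which is the raw material the software needs.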
The magic is how software takes the information from the two lenses and processes it into an image. Between the extra data we can get from this new hardware, and the machine vision work currently going on, the results are going to be incredible. Depth of Field is one of the last features needed to complete the full migration from handheld camera to camera phone. Soon both amateur and professional photographers will only need to carry their mobile devices.
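One way that processing could work, sketched as toy logic rather than Apple’s actual pipeline: once you have a depth map, keep pixels near the subject’s depth sharp and blur everything else. The blur here is a deliberately cheap neighbor-averaging stand-in for a real lens blur.

```python
import numpy as np

def synthetic_dof(image, depth, subject_depth, tolerance=0.1):
    """Fake DOF: blur pixels whose depth is far from the subject's depth."""
    # Cheap box blur: average each pixel with its four wrapped neighbors.
    blurred = (
        image
        + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
        + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1)
    ) / 5.0
    in_focus = np.abs(depth - subject_depth) <= tolerance
    return np.where(in_focus, image, blurred)

# Tiny grayscale example: left half of the frame is the "near" subject
# (depth 1.0), right half is "far" background (depth 3.0).
image = np.array([[0.0, 0.0, 255.0, 255.0]] * 4)
depth = np.array([[1.0, 1.0, 3.0, 3.0]] * 4)
result = synthetic_dof(image, depth, subject_depth=1.0)
# The near half comes back untouched; the far half is softened.
```

A real implementation would scale the blur radius continuously with depth distance rather than using a hard threshold, but the principle is the same: depth data plus software gives you the DOF look a flat lens can’t produce optically.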
This isn’t the first time a camera manufacturer has put a dual-camera system into a camera phone, but with Apple’s software and app ecosystem behind it, I believe photography will change forever starting September 7th.
special thanks to Ben Basche for his proofing/editing help.