The Fascinating Science Behind Low-Light Photography In Smartphones

Pushing Boundaries in the Pixel 4, Huawei Mate 30 and iPhone 11

Sarvesh Mathi
Nov 1 · 10 min read

In the last decade, smartphones have given point-and-shoot cameras and DSLRs a run for their money. Literally.

Chart showing the relation between the rise of smartphone sales and the fall of camera sales over the last decade.

A quick explanation for the trend above: why should anyone who wants to take decent pictures spend thousands of dollars on a camera system and many hours learning how to use it and post-process its images, when the alternative is a small, portable and idiot-proof device that is already in their pocket at all times?

With a smartphone, everyone has a claim to the label ‘photographer’ — one that was previously reserved for those with big, bulky cameras. While smartphones hold their own against DSLRs in most respects, sometimes even beating expensive $5,000 models hands down in areas like videography, the one area where they have fared badly is low-light or night photography. That is, until now.

Photography 101: A camera lens takes all the light rays bouncing around and uses glass to redirect them to a single point, creating a sharp image. But by definition in low-light and night scenarios there is, well, less light. This makes low-light photography hard.

There are two ways to deal with this challenge:

  1. using a larger sensor
  2. allowing a longer exposure, i.e. keeping the shutter open for a longer time

Both approaches let the camera gather more light. But smartphones are designed to be portable, so housing a larger sensor (and the bigger lens to go with it) is undesirable. The second option, a long-exposure shot, requires the device to be held completely still until the image is captured, which can take anywhere from a few seconds to minutes depending on how much light is around. That is impractical handheld and requires a tripod.
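As a back-of-the-envelope rule, the light a camera gathers scales roughly with sensor area multiplied by exposure time. A minimal Python sketch of that trade-off (the numbers are purely illustrative, not taken from any particular phone):

def relative_light(sensor_area_mm2: float, exposure_s: float) -> float:
    """Relative amount of light collected, up to a constant factor."""
    return sensor_area_mm2 * exposure_s

# Doubling the sensor area or doubling the exposure time both double
# the light collected.
baseline = relative_light(sensor_area_mm2=24.0, exposure_s=0.033)
bigger_sensor = relative_light(sensor_area_mm2=48.0, exposure_s=0.033)
longer_exposure = relative_light(sensor_area_mm2=24.0, exposure_s=0.066)
print(bigger_sensor / baseline, longer_exposure / baseline)  # 2.0 2.0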

The Google Pixel and the iPhone have favoured the second option, compensating for shaky hands with some clever computational tricks, while Huawei's flagships have given more weight to the first, going with a larger sensor.

Google Pixel 4

Google’s revolutionary low-light photography mode in the Pixel range of phones dubbed Night Sight was announced in late 2018. The team behind the Pixel camera published a detailed blog post explaining the science behind it.

Light levels are often measured in lux: the amount of light arriving at a surface per unit area. Below is a chart from the Google AI Blog that shows relatable lux levels:

Chart showing different levels of lux, with the highest being 30,000 lux for a sidewalk lit by direct sunlight.
Source: Google AI Blog

Traditionally, smartphone photos begin to deteriorate below 30 lux, but some phones still perform down to 3 lux by using features like HDR, which captures multiple shots and merges them together. Google's Night Sight targets photos in the range of 0.3 to 3 lux, and the photos below show just how well it performs.

iPhone XS with SmartHDR (left), and Pixel 3 with Night Sight (right). Source: Google

The workflow of Night Sight:

When you’re taking a photo, Google will automatically recommend using Night Sight if the scene is below a certain level of lux. You can also manually choose this mode anytime.

In the normal mode, Google employs something known as zero-shutter-lag (ZSL). As soon as the camera app opens, the phone begins capturing frames, even before you press the shutter button. Once you press it, the most recent 9–15 frames are sent for processing. ZSL, however, requires exposures shorter than 66 milliseconds, which is too short for low-light photography. In Night Sight mode, Google instead uses positive-shutter-lag (PSL), meaning that capture starts only after the shutter button is pressed. This allows the camera to be generous with exposure times, but it requires the photographer to hold still for a few seconds after pressing the shutter.
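Here is a rough sketch of that decision in Python, using the 66 ms ZSL cap mentioned above; the function and field names are illustrative, not Google's actual implementation:

ZSL_MAX_EXPOSURE_MS = 66        # ZSL frames must stay below this exposure
NIGHT_SIGHT_FRAME_MS = 1000     # Night Sight frames can run up to ~1 s each

def plan_capture(night_sight_on: bool) -> dict:
    """Pick a capture strategy: ZSL in normal mode, PSL in Night Sight."""
    if not night_sight_on:
        # Normal mode: frames are already streaming before the shutter press,
        # each capped at a short exposure.
        return {"strategy": "zero-shutter-lag",
                "max_exposure_ms": ZSL_MAX_EXPOSURE_MS,
                "frames_used": "most recent 9-15"}
    # Night Sight: capture starts only after the shutter press, so each
    # frame can be exposed far longer than the ZSL cap allows.
    return {"strategy": "positive-shutter-lag",
            "max_exposure_ms": NIGHT_SIGHT_FRAME_MS,
            "frames_used": "captured after the press"}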

Increasing exposure time, however, comes with its own challenges. The camera becomes more susceptible to hand shake and to motion in the scene, both of which result in a blurred image. To combat this, Google tracks motion in the scene as well as movement of the hand, and chooses the number of frames and the exposure time of each frame accordingly. For example, a stable shot on a tripod might use 6 frames of 1000 ms each, for a total shot time of 6 seconds, whereas a handheld shot might use 15 frames of 70 ms each, for a total of around 1 second. Depending on the exact amount of motion and the amount of light available, anything in between is possible.
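A simple way to picture that trade-off is to interpolate between the two extremes quoted above (6 × 1000 ms on a tripod, 15 × 70 ms handheld). This is only an illustrative sketch, not Google's actual logic:

def plan_burst(motion_level: float) -> tuple[int, int]:
    """motion_level: 0.0 = rock steady (tripod), 1.0 = typical handheld shake.
    Returns (number_of_frames, exposure_per_frame_ms)."""
    motion_level = min(max(motion_level, 0.0), 1.0)
    frames = round(6 + (15 - 6) * motion_level)              # 6 .. 15 frames
    exposure_ms = round(1000 + (70 - 1000) * motion_level)   # 1000 .. 70 ms
    return frames, exposure_ms

print(plan_burst(0.0))  # (6, 1000) -> ~6 s total, tripod
print(plan_burst(1.0))  # (15, 70)  -> ~1 s total, handheld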

A high-dynamic-range (HDR) scene is one that contains both very bright and very dark regions, forcing the camera to trade off which to expose for. The HDR feature in recent smartphones captures multiple frames, typically including one in which the bright regions are exposed properly and one in which the dark regions are, and merges them into a better overall image.

A scene with HDR off (left) which focuses on the subject’s face and blows out the sky vs a scene with HDR on (right) which exposes all parts of the photo successfully. Source: Google
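As a toy illustration of the merging idea, here is a naive exposure fusion in Python that leans on the short exposure wherever the long exposure is blown out. Real pipelines align the frames first and use far more careful weighting; the 0.7 threshold below is arbitrary:

import numpy as np

def naive_hdr_merge(short_exp: np.ndarray, long_exp: np.ndarray) -> np.ndarray:
    """Blend a short exposure (good highlights) with a long exposure
    (good shadows). Both inputs are float RGB images in [0, 1]."""
    brightness = long_exp.mean(axis=-1, keepdims=True)
    # Where the long exposure is nearly blown out, weight the short one more.
    w_short = np.clip((brightness - 0.7) / 0.3, 0.0, 1.0)
    return w_short * short_exp + (1.0 - w_short) * long_exp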

In Night Sight mode, once the shutter is pressed and the frames are captured, they are sent to Google's legendary HDR processing pipeline: HDR+ on the older Pixel devices (1 and 2) and Super Res Zoom on the newer ones (3 and 4). This is where most of the magic happens. The frames are aligned and averaged to reduce noise and produce a sharper image. Details missing in one frame are drawn from another, and less sharp pixels in one are replaced with sharper pixels from another. Information about the color of each individual pixel is also more reliable when the camera has multiple frames to derive it from.
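At its heart, the merge step averages aligned frames so that random noise cancels while the real signal remains. A minimal sketch of that core idea (the actual HDR+/Super Res Zoom pipeline also aligns frames, rejects pixels that moved, and merges far more robustly):

import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of already-aligned frames to reduce noise.
    Random noise falls roughly with the square root of the frame count,
    so averaging N frames cuts it by about sqrt(N)."""
    stack = np.stack(frames).astype(np.float32)
    return stack.mean(axis=0)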

Once Google merges the frames, it has to adjust the color temperature so colors appear accurate. The human eye is good at color constancy, which lets us judge the color of an object regardless of the lighting around it, but we cannot judge the colors in a photo the same way: a photo taken under warm light will look yellowish. Smartphones normally correct for this by shifting colors in a process called auto white balancing (AWB), so the scene appears lit by neutral, white light. But AWB fails in low-light shots. To overcome this, Google trained its AWB algorithm on numerous shots whose white balance had been corrected by hand, until the algorithm learned to do the correction by itself.

Night shot without color balancing (left) vs with learning-based AWB (right). The colors in the right image are more accurate representations of the actual colors. Source: Google
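To see what white balancing does mechanically, here is the classic gray-world heuristic, which scales each channel so the scene averages out to neutral. It is exactly this kind of simple baseline that breaks down in very low light, which is why Google trains a model instead:

import numpy as np

def gray_world_awb(image: np.ndarray) -> np.ndarray:
    """Gray-world white balance on a float RGB image in [0, 1]:
    scale each channel so the scene's average color becomes neutral."""
    means = image.reshape(-1, 3).mean(axis=0)          # per-channel averages
    gains = means.mean() / np.maximum(means, 1e-6)     # push averages to gray
    return np.clip(image * gains, 0.0, 1.0)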

The final step in Google's process involves making the image look as if it was taken at night and not during the day. The human eye has rod and cone cells, but in low light only the rods function. Rods are far worse than cones at perceiving color and also have lower spatial acuity (sharpness). Night Sight is deceptively good in that it can produce photos with colors and sharpness the human eye cannot see in the dark. Taken to the extreme, this can make an image look like it was shot in daylight and therefore fake. To prevent this, Google uses an S-curve in tone mapping, which lets viewers perceive that the picture was in fact taken at night or in low light.
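One common way to get such an S-shaped curve is a logistic function rescaled to the [0, 1] range. Google has not published its actual curve, so the sketch below, including the strength value, is purely illustrative:

import numpy as np

def s_curve_tone_map(x: np.ndarray, strength: float = 8.0) -> np.ndarray:
    """Apply a sigmoid-shaped (S-curve) tone map to an image in [0, 1].
    Shadows stay dark and highlights stay bright, preserving the feel
    of a night scene instead of flattening it into daylight."""
    s = 1.0 / (1.0 + np.exp(-strength * (x - 0.5)))
    # Rescale so that 0 maps to 0 and 1 maps to 1.
    lo = 1.0 / (1.0 + np.exp(strength * 0.5))
    hi = 1.0 / (1.0 + np.exp(-strength * 0.5))
    return (s - lo) / (hi - lo)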

As soon as the photo is captured, if you jump into your camera roll fast enough, you can watch Google's computational tricks at work as the photo develops into the final picture.

In addition to Night Sight, the Pixel 4 also has an astrophotography mode that can capture even darker night-time scenes, such as stars in the sky, by allowing exposures as long as 16 seconds per frame and as many as 15 frames, for a total shot time of 4 minutes. This, however, requires the phone to be propped against an object or mounted on a tripod.
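The arithmetic behind that 4-minute figure:

frames = 15
exposure_per_frame_s = 16
total_s = frames * exposure_per_frame_s
print(total_s, "seconds =", total_s / 60, "minutes")  # 240 seconds = 4.0 minutes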

Huawei Mate 30

Huawei's flagships are some of the most underrated phones. Huawei was the first company to introduce a dedicated night mode that actually worked: the P20 Pro could take incredible low-light shots well before the world saw the Pixel's Night Sight. And the Mate 30 series, introduced in September this year, takes low-light photography a giant leap forward. At least in theory.

Earlier, I mentioned that smartphones generally do not house bigger sensors because of their small bodies. Well, Huawei found a way to do so.

Photo captured on the Huawei Mate 30 Pro (left). Source: Huawei

The Mate 30 has four camera lenses, but the one that matters for low-light performance is the primary camera with its 40MP 1/1.7" sensor.

There are two main features that make this sensor excel in low-light:

Camera sensors capture the colors in a scene by using a color filter array overlaid on top of the image sensor. The sensors in the iPhone, the Google Pixel and just about every other flagship use a red, green, green, blue (RGGB) filter. The Mate 30 goes against years of tradition by using a red, yellow, yellow, blue (RYYB) filter instead. How does this make a difference?

The RGGB filter has been used for many years because human vision is most sensitive to red, green and blue. By replacing green with yellow, which passes both red and green wavelengths, the RYYB filter lets the Mate 30's sensor capture more light; Huawei claims 40% more than an RGGB filter. But getting accurate colors from RYYB is harder than from RGGB, and Huawei relies on new AI-based image processing techniques to do it.
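To make the two layouts concrete, here are the repeating 2×2 tiles of each color filter array as a small sketch (purely illustrative):

import numpy as np

# The 2x2 tiles that repeat across the sensor in each color filter array.
# In the RYYB layout, the yellow cells pass both red and green wavelengths,
# which is why the sensor collects more light than with green cells.
RGGB_TILE = np.array([["R", "G"],
                      ["G", "B"]])
RYYB_TILE = np.array([["R", "Y"],
                      ["Y", "B"]])

def build_cfa(tile: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Tile the 2x2 pattern across a sensor of the given size."""
    return np.tile(tile, (rows // 2, cols // 2))

print(build_cfa(RYYB_TILE, 4, 4))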

In addition to being more sensitive to light, the Mate 30’s sensor (1/1.7") is also bigger than the ones you find in the Pixel 4, iPhone 11, and Galaxy S10 — all of which feature a smaller 1/2.55" sensor.
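For a rough sense of the difference, here is the area comparison using dimensions commonly quoted for these sensor formats. The "1/x inch" labels are nominal type designations, so the millimetre figures below are approximations I am assuming, not official specs for either phone:

def area_mm2(width_mm: float, height_mm: float) -> float:
    return width_mm * height_mm

mate_30_primary = area_mm2(7.6, 5.7)    # approx. for a 1/1.7"-type sensor
pixel_4_primary = area_mm2(5.8, 4.3)    # approx. for a 1/2.55"-type sensor
print(f"{mate_30_primary / pixel_4_primary:.1f}x the area")  # roughly 1.7x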

RYYB, in tandem with the larger sensor, allows the Mate 30 to capture far more light than other flagships. This gives it the highest ISO (the sensitivity of the sensor to light) of any smartphone: Huawei claims ISO 204800 on the Mate 30 and ISO 409600 on the Mate 30 Pro, against an industry standard of 6400, putting it on par with expensive DSLRs.

Pushing the physical boundaries of smartphone sensors has allowed Huawei to achieve excellent low-light photography even without its dedicated night mode, that is, without the computational tricks of multiple frames and long exposures found in Google's Night Sight and Apple's iPhone.

But the phone does also have a dedicated night mode, which layers software tricks on top of the impressive hardware to deliver even better low-light performance. The dedicated mode is also useful with the ultra-wide-angle and telephoto lenses, which do not have the same hardware advantages as the primary lens.

If all this Huawei tech isn't fascinating enough, the Mate 30 Pro, the higher-end variant, offers not only excellent low-light photography but also industry-first low-light videography. This is made possible by an additional ultra-wide-angle sensor (1/1.54") that is even larger than the 1/1.7" RYYB sensor.

iPhone 11

Unsurprisingly, Apple has been vaguer about the science behind Night mode in its latest line-up of devices, offering only an overview of the process on the product page. But if you understand how Google's Night Sight works, you understand most of how it works on the iPhone 11.

Shot on iPhone 11 Pro in Night mode by Austin Mann

Unlike the Pixel, the iPhone 11 does not have a night mode you select yourself; instead, it turns on automatically when the scene is dark enough. Once Night mode is activated, the remaining steps are similar to Google's Night Sight.

While the Pixel doesn’t let you change the total shot duration, the iPhone does offer some flexibility. Tapping on the night mode icon will allow you to manually increase time, although the maximum time depends on the availability of light and stability of the device. This can range from 1–30 seconds with the higher range only possible when on a tripod. This is lower than the maximum 4-minute interval possible in the Pixel in astrophotography mode.

For shots taken in medium-to-low light, the iPhone uses another fascinating computational trick known as Deep Fusion. You can read more about it here and here.


While Google’s Pixel 4, Apple’s iPhone 11 and Huawei’s Mate 30 series are the industry's best for low-light photography, other flagships like the Samsung Galaxy S10 and OnePlus 7 Pro also offer dedicated night modes. The latter two work very similarly to Google’s Night Sight in principle, but the quality of the output differs based on each company’s image processing algorithms. You can see how they compare here and here.

Low-light photography on smartphones has reached a point where the camera is able to see more colors and sharper images than the human eye, and with flagships continuing to push forward, the future of smartphone photography is looking brighter than ever.
