The TrueDepth camera system might help with better night selfie portraits.

Bhavesh Rawat · Mac O’Clock · Jan 16, 2021 · 3 min read

Apple has long been known for its innovations, while also following a simple “if it ain’t broke, don’t fix it” philosophy.

Credit: Apple

Be it introducing the multi-touch display in a smartphone back in 2007, or bringing biometric authentication into the mainstream with Touch ID.

On the other hand, Apple was courageous enough to remove the headphone jack so that the AirPods it was about to launch would make much more sense to users, and most people weren’t keen on carrying a separate Lightning-to-3.5mm adapter anyway. The removal also freed up internal space, helping the battery while keeping the iPhone thin.

Speaking of AirPods, they changed the way people listen to music: Apple made listening more fun and more accessible, with no more fiddling with wires.

In 2017, Apple launched the iPhone X, and with it came an all-screen design and an upgrade from Touch ID: Face ID.

That tiny strip called the notch packs a set of hardware that, coupled with Apple-designed neural networks, makes Face ID reliable, secure, and easy to use. And since then, Face ID has done nothing but improve.

Now, let’s start by understanding how Face ID actually works.

  1. The proximity sensor and ambient light sensor help the TrueDepth camera system determine how much illumination will be needed for face recognition;
  2. The flood illuminator produces infrared (IR) light, part of the electromagnetic spectrum that’s invisible to the naked eye, to illuminate your face;
  3. The dot projector produces more than 30,000 dots of invisible IR light to create a three-dimensional map (for area and depth) of your facial landscape;
  4. The infrared camera captures images of the dot pattern and the IR light that’s been reflected from your face.
YouTube video for better illustration | Credit: Tech Insider
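That same TrueDepth pipeline is already exposed to third-party apps, which is part of why this idea feels plausible. Here is a minimal Swift sketch (my own illustration of the public AVFoundation API, not Apple’s internal Face ID code) showing how an app can stream depth data from the TrueDepth camera; session configuration and error handling are simplified.

```swift
import AVFoundation

// Minimal sketch: streaming depth frames from the front TrueDepth camera.
// Illustrative only; not Apple's Face ID or Portrait pipeline.
final class TrueDepthReader: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The front TrueDepth camera is available on iPhone X and later.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        let input = try AVCaptureDeviceInput(device: device)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        depthOutput.isFilteringEnabled = true   // smooth noise and fill holes in the depth map
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth.queue"))
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every depth frame the TrueDepth system delivers.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let depthMap = depthData.depthDataMap   // CVPixelBuffer of per-pixel depth/disparity
        // ...hand `depthMap` off to whatever post-processing you want (e.g. a portrait mask)
        _ = depthMap
    }
}
```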

Now, what I was thinking is: what if Apple made use of Face ID’s hardware, the TrueDepth camera system, to improve night portrait shots, just like it did with the rear camera by adding a LiDAR scanner?

LiDAR uses active sensors that emit their own illumination. The emitted pulses hit objects, and the reflected energy is detected and measured by the sensors. Distance to the object is determined by recording the time between the transmitted and backscattered pulses and using the speed of light to calculate the distance traveled (half of the round trip, since the light goes out and comes back).

Video demonstration, timestamp 0:41 | Credit: Mark Spurrell
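To make the “speed of light” part concrete, here is the back-of-the-envelope math as a tiny Swift snippet (my own illustration; the round-trip time is a hypothetical reading):

```swift
import Foundation

// Time-of-flight distance: the pulse goes to the object and back,
// so the one-way distance is (speed of light * round-trip time) / 2.
let speedOfLight = 299_792_458.0        // metres per second
let roundTripTime = 13.3e-9             // seconds (~13.3 ns, a hypothetical reading)

let distance = speedOfLight * roundTripTime / 2
print("Object is roughly \(distance) metres away")   // ≈ 2 metres
```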

Just as LiDAR emits its rays, the dot projector, coupled with the flood illuminator, would shoot infrared light to illuminate the subject and project more than 30,000 dots of IR light, creating a three-dimensional map (for area and depth) of the subject and its boundaries. And just as LiDAR receives the bounced-back rays, this is where the infrared camera comes into play: it would capture images of the dot pattern and the reflected IR light and send that data to the ISP, which would calculate the depth between subject and background to create an accurate bokeh even in night shots, presumably as part of post-processing.

Credit: Apple
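To sketch what that post-processing step could look like in practice, here is a rough Core Image example in Swift. The thresholding, the blur radius, and the fakeNightBokeh function itself are all my own assumptions for illustration; Apple’s actual Portrait rendering is far more sophisticated.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Rough sketch: use a TrueDepth depth map to blur the background of a selfie.
// The threshold/radius values are arbitrary; this is not Apple's Portrait pipeline.
func fakeNightBokeh(photo: CIImage, depthMap: CIImage, subjectDepth: CGFloat) -> CIImage {
    // Build a binary mask: white where the scene is farther than the subject
    // (background), black where it is nearer (the face we keep sharp).
    let backgroundMask = depthMap
        .applyingFilter("CIColorThreshold", parameters: ["inputThreshold": subjectDepth])

    // Variable blur driven by the mask: bright (background) pixels get blurred,
    // dark (subject) pixels stay sharp, giving a bokeh-like falloff.
    let blur = CIFilter.maskedVariableBlur()
    blur.inputImage = photo.clampedToExtent()
    blur.mask = backgroundMask
    blur.radius = 12
    return blur.outputImage?.cropped(to: photo.extent) ?? photo
}
```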

Well, now the question arises: is the A14 Bionic powerful enough to do this in real time, and is the A14 Bionic’s ISP capable enough to be integrated with the Face ID hardware?

Well, I’ll leave the integration part to Apple’s engineers. But I’m positive the A14 Bionic is capable enough: it already encodes HDR Dolby Vision video in 4K, and it handles real-time LiDAR data alongside the camera for AR and the Measure app, so the ISP has already shown what it can do.

So, that’s been it. This was just a hunch (or call it an idea) that’s been on my mind for a while, and I couldn’t think of a better way to handle it than sharing it with the tech community.

Hope you enjoyed it. 😉
