ARKit 911 — Light estimation

Andy Jazz
Published in Mac O’Clock · Jul 24, 2020

In ARKit, getting a well-tracked, properly lit, and high-quality rendered model involves three main stages: Tracking, Scene Understanding and Rendering. Today we’ll talk about one component of the second stage: light estimation. In RealityKit, light estimation is automatic, but here, in ARKit, we have to calculate ourselves how much light falls on a shader’s surface at any given moment. In this story we’ll discuss light estimation for the world tracking scenario as well as for face tracking.

• Tracking
Camera Tracking
Object Tracking
• Scene Understanding
Plane Detection
Ray-Casting
Scene Reconstruction
Light Estimation
• Rendering
Physically-based Shading
Raytraced Shadows
Depth Map Shadows
HDRI Lighting

Light estimation for World Tracking

To properly apply a light estimation algorithm when running a World Tracking configuration, we need two gettable properties: ambientIntensity and ambientColorTemperature.

@available(iOS 11.0, *)
open class ARLightEstimate: NSObject {
    open var ambientIntensity: CGFloat { get }
    open var ambientColorTemperature: CGFloat { get }
}

Let’s see what Apple documentation says about these instance properties:

Ambient Intensity: “The value is based on the internal exposure compensation of the camera device, and scaled to be appropriate for use in rendering architectures that use realistic lighting metrics. A value of 1000 represents neutral lighting.”

Ambient Color Temperature: “The value is based on the internal white balance compensation of the camera device, and scaled to be appropriate for use in rendering architectures that use realistic lighting metrics. A value of 6500 represents neutral (pure white) lighting; lower values indicate a “warmer” yellow or orange tint, and higher values indicate a “cooler” blue tint.”

We need to retrieve both values from ARFrame’s lightEstimate property. And, as you can see, this property is optional and of type ARLightEstimate.

let session = ARSession()
session.currentFrame?.lightEstimate?.ambientIntensity
session.currentFrame?.lightEstimate?.ambientColorTemperature

Keep in mind that intensity and color temperature are updated 60 times per second.

Now it’s time to paste these lines into our code, but first we should create a model for testing (a green sphere) and enable the light estimation feature.

import ARKit
import UIKit
import SceneKit

class ViewController: UIViewController {

    @IBOutlet weak var sceneView: ARSCNView!
    let ambientNode = SCNNode()

    private func modelAndMaterial() {
        let sphereNode = SCNNode()
        sphereNode.geometry = SCNSphere(radius: 0.25)
        sphereNode.position.z = -2.0

        let material = SCNMaterial()
        material.lightingModel = .lambert
        material.diffuse.contents = UIImage(named: "greenTexture")
        sphereNode.geometry?.materials = [material]

        sceneView.scene.rootNode.addChildNode(sphereNode)
        sceneView.scene.rootNode.addChildNode(self.ambientNode)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        sceneView.delegate = self
        sceneView.scene = SCNScene()

        let config = ARWorldTrackingConfiguration()
        config.isLightEstimationEnabled = true
        sceneView.session.run(config)

        self.modelAndMaterial()
    }
}

Now let’s estimate the light’s intensity and its color temperature.

extension ViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let lightEstimate = sceneView.session.currentFrame?.lightEstimate
        else { return }

        self.ambientNode.light = SCNLight()
        self.ambientNode.light?.type = .ambient
        self.ambientNode.light?.intensity = lightEstimate.ambientIntensity * 1.15
        self.ambientNode.light?.temperature = lightEstimate.ambientColorTemperature

        print(lightEstimate.ambientIntensity)
        print(lightEstimate.ambientColorTemperature)
    }
}

Light estimation is based on camera imagery only and sometimes real-world lighting and virtual lighting don’t match.
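Because the estimate is recomputed every frame as the camera’s exposure changes, the values can visibly jitter. One common trick (this is not an ARKit feature; the type and names below are my own) is to run the intensity through a simple exponential moving average before assigning it to the light:

```swift
import Foundation

/// Exponential moving average that damps frame-to-frame jitter in
/// ARKit's per-frame light estimates. `alpha` is in 0...1 — higher
/// values react faster, lower values smooth more.
struct LightSmoother {
    private(set) var value: Double
    let alpha: Double

    init(initial: Double = 1000, alpha: Double = 0.1) {
        self.value = initial
        self.alpha = alpha
    }

    /// Blends a new per-frame sample into the running average.
    mutating func update(with newSample: Double) -> Double {
        value += alpha * (newSample - value)
        return value
    }
}
```

In `renderer(_:updateAtTime:)` you could then pass `lightEstimate.ambientIntensity` through `update(with:)` and assign the smoothed result to the ambient light instead of the raw value.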

Compile your AR app and test it.

Partially cover the camera lens with your index finger.

Hooray! It works, doesn’t it?!

Light estimation for Face Tracking

When you run a Face Tracking session the scenario is slightly different: the properties you need live in the ARDirectionalLightEstimate class.

@available(iOS 11.0, *)
open class ARDirectionalLightEstimate: ARLightEstimate {
    open var primaryLightIntensity: CGFloat { get }
    open var primaryLightDirection: simd_float3 { get }
    open var sphericalHarmonicsCoefficients: Data { get }
}

And again, let’s take a look at Apple documentation:

Primary Light Intensity: “The estimated intensity, in lumens, of the strongest directional light source in the scene.”

Primary Light Direction: “A vector indicating the orientation of the strongest directional light source in the scene.”

Spherical Harmonics Coefficients: “Data describing the estimated lighting environment in all directions.”
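Apple’s documentation describes this Data blob as 27 32-bit floats: three non-interleaved sets of 9 second-order spherical harmonics coefficients, one set per color channel. A small sketch of unpacking it (the function name is my own, not an ARKit API):

```swift
import Foundation

/// Unpacks ARKit's sphericalHarmonicsCoefficients blob into per-channel
/// coefficient arrays. Per Apple's docs the Data holds 27 32-bit floats:
/// three non-interleaved sets of 9 second-order SH coefficients,
/// one set each for red, green and blue.
func decodeSHCoefficients(_ data: Data) -> (r: [Float], g: [Float], b: [Float])? {
    let count = 27
    guard data.count == count * MemoryLayout<Float>.size else { return nil }
    // Reinterpret the raw bytes as an array of Floats.
    let floats = data.withUnsafeBytes { Array($0.bindMemory(to: Float.self)) }
    return (r: Array(floats[0..<9]),
            g: Array(floats[9..<18]),
            b: Array(floats[18..<27]))
}
```

You could call this on `faceLight.sphericalHarmonicsCoefficients` inside the delegate method to inspect the environment estimate instead of printing the raw Data.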

To implement it, use the session(_:didUpdate:) delegate method inside a ViewController extension that conforms to ARSessionDelegate.

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let faceLight = frame.lightEstimate as? ARDirectionalLightEstimate
    else { return }

    // lightNode is an SCNNode added to the scene beforehand,
    // just like ambientNode in the world tracking example.
    self.lightNode.light = SCNLight()
    self.lightNode.light?.type = .directional
    self.lightNode.light?.intensity = faceLight.primaryLightIntensity
    self.lightNode.simdEulerAngles = faceLight.primaryLightDirection
    print(faceLight.sphericalHarmonicsCoefficients)
}

The primaryLightDirection property represents the averaged direction of the directional light sources detected in the scene. This vector is normalized and expressed in world coordinate space.
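Note that assigning the raw vector to simdEulerAngles treats its x, y, z components as rotation angles, which is only a rough approximation. A more direct mapping derives pitch and yaw from the vector itself. The function below is my own geometry sketch, not an ARKit API; it assumes the vector points the way the light travels (flip its sign if your convention is the opposite):

```swift
import Foundation

/// Derives SceneKit-style euler angles (pitch around X, yaw around Y,
/// roll = 0) that aim a node's forward (-Z) axis along a normalized
/// world-space light direction. SceneKit's directional lights shine
/// along the node's negative Z axis.
func eulerAngles(pointingAlong x: Float, _ y: Float, _ z: Float)
    -> (pitch: Float, yaw: Float, roll: Float) {
    let (dx, dy, dz) = (Double(x), Double(y), Double(z))
    // SceneKit applies euler rotations in roll → yaw → pitch order, so a
    // node's forward (-Z) axis ends up at
    // (-sin yaw, cos yaw · sin pitch, -cos yaw · cos pitch).
    let yaw = atan2(-dx, (dy * dy + dz * dz).squareRoot())
    let pitch = atan2(dy, -dz)
    return (Float(pitch), Float(yaw), 0)
}
```

In the delegate you could then build the node’s angles from the components of `faceLight.primaryLightDirection` and assign them to `lightNode.simdEulerAngles` instead of the raw vector.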

Of course, session’s configuration must be ARFaceTrackingConfiguration.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    sceneView.session.delegate = self

    let config = ARFaceTrackingConfiguration()
    config.maximumNumberOfTrackedFaces = 2
    config.isLightEstimationEnabled = true
    sceneView.session.run(config)
}
Canonical face mesh with Light Estimation applied

Light estimation in RealityKit 2.0

As I said earlier, RealityKit’s light estimation algorithm is on by default. If you want to turn it off, use the following code:

let arView = ARView(frame: .zero)
arView.renderOptions.insert(.disableAREnvironmentLighting)

Tip: Previously, in RealityKit 1.0, we used the .disableAutomaticLighting type property, which is now deprecated.

If you need more info about lighting in RealityKit, please read this story.

That’s all for now.

If this post is useful for you, please press the Clap button and hold it.

¡Hasta la vista!
