Amazing Physically Based Rendering Using the New iOS 10 SceneKit

How to simply use the newly introduced iOS 10 Physically Based Renderer and get great looking results without the complexity of any advanced graphics engine.

Avihay Assouline
7 min readJul 27, 2016
Fitbit model rendered with iOS 10 Physically Based Renderer | Rendered on my iPhone 6 at 60 FPS | Model courtesy of Tal Negrin

No doubt, Apple is starting to get it when it comes to rendering on their mobile devices. Recent advancements in real-time rendering, together with advanced hardware, have really opened up the opportunity to achieve high-end results without putting a $2000 price tag on a device. Although the team at Apple introduced many advancements at WWDC '16, one improvement really caught my eye: Physically Based Rendering (PBR) support in SceneKit. This rendering technique has gained a lot of traction in the past few years and has become the de facto industry standard for engineers and 3D artists alike.

In this short article, I will try to demonstrate how to simply use the newly introduced iOS 10 Physically Based Renderer and get great looking results without the complexity of any advanced graphics engine.

Before we go ahead, I want to give an honest disclaimer: this article oversimplifies some concepts to give you an idea of the key elements of PBR.

Physically Based Rendering

iOS 10 Physically Based Rendering on my iPhone 6 | Materials courtesy of freepbr.com | Rusted Iron, Wood and Plastic. Cool, huh?

There are many articles on the web describing the PBR shading model. Remember when I said that Apple is starting to get it? Well, at WWDC '16, Amauri Balliet from the SceneKit team gave a really good presentation on this subject. As a matter of fact, it is one of the best presentations I know on PBR shading that fits a broad audience. If you have already seen this presentation, feel free to jump right ahead to the coding section.

Image Based Lighting (IBL)

Illuminating a three-dimensional scene is not an easy task. If you take a look around, you will see that many objects contribute to the illumination of your room. Before this technique, rendering engines used various light models such as directional lights or spot lights. In some cases the result was a nice approximation of reality, but it always seemed too mechanical. Moreover, due to the loss of fine detail, our brains recognized that the objects were not realistic.

Image Based Lighting takes a different approach. Oversimplifying, the idea goes a bit like this: imagine that around your object there is a large sphere, and on that large sphere there is an image. Well … that’s your light source!

Now, what happens if we use an image taken directly from the real world, usually known as an Environment Map? As you can see below, the shading becomes much more realistic and accurate.

Image Based Lighting | Left: The image we use as light source | Middle: The direction of light to the center of the model | Right: The final result | Image courtesy of 3delight

It is important to understand that different environment maps create very different renderings, so choose your environment map carefully. Choosing the right environment map is an art by itself, and is usually done by experienced 3D artists.

The Spherical Environment map used in this example | Courtesy of LugherTexture.com

Material Definition for Physically Based Rendering

Materials are crucial for great PBR shading. They provide the basic information for our shaders to make these amazing visualizations.

In the past, defining materials was very difficult. There were a million parameters to tweak, and it required quite a lot of black magic and experience to get it right. Moreover, material definitions were tightly coupled with lighting, which made material creation a problem. With PBR shading, the story becomes far simpler. As a matter of fact, it is very easy to reach visually pleasing ceramics, plastics, or metals using only two float numbers (see below). You could have reached the same look and feel before, but an artist would have had to work for hours to get the fine look they were after.

The three main parameters for PBR materials are called: Albedo, Roughness, and Metalness.

The Albedo component is the base color of the object, and is directly related to the diffuse component of other shading models. The albedo value is usually an RGB color defined by three floating-point numbers.

Roughness and Metalness - These values describe how rough or metallic a surface is. The image below explains quite nicely the effect of changing either value on the rendering result, using float values in the range 0.0–1.0.

The rendering effect of changing either the roughness or metallic value in PBR materials. The albedo component is just a simple gray color — If you travel around this matrix you can find nice looking aluminum or gray plastic materials
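To get a feel for these two numbers, here is a minimal sketch that generates the (roughness, metalness) pairs behind such an N × N matrix. The function and the `sphereNodes` name in the usage comment are my own illustrations, not part of the demo project:

```swift
// Sketch: generate the (roughness, metalness) pairs for an N x N
// material matrix like the one pictured above. Pure Swift; in a real
// scene each pair would be assigned to one sphere's SCNMaterial.
func pbrGrid(steps: Int) -> [(roughness: Float, metalness: Float)] {
    guard steps > 1 else { return [(0, 0)] }
    var pairs: [(roughness: Float, metalness: Float)] = []
    for row in 0..<steps {         // rows vary roughness from 0.0 to 1.0
        for col in 0..<steps {     // columns vary metalness from 0.0 to 1.0
            pairs.append((Float(row) / Float(steps - 1),
                          Float(col) / Float(steps - 1)))
        }
    }
    return pairs
}

// Hypothetical SceneKit usage (needs a Metal device, so commented out):
// for (i, p) in pbrGrid(steps: 5).enumerated() {
//     let material = sphereNodes[i].geometry!.firstMaterial!
//     material.lightingModelName = SCNLightingModelPhysicallyBased
//     material.roughness.contents = p.roughness
//     material.metalness.contents = p.metalness
// }
```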

To reach pro results, you will need to work with textures that will define the different PBR components for each point on your model. Luckily, most 3D artists are already familiar with this shading model and are publishing free material definitions that can help jump start any project.

Moreover, iOS 10 PBR materials also support other maps, such as Normal maps, which are optional but important for achieving great-looking results.

For the purpose of this demo, I used some materials from freepbr.com. Make sure you check it out if you want to test the code (below) with different materials.

Coding Time

We will create together the rotating orbs example as seen below.
No need to worry — All code is published on GitHub.

Rotating Orbs — The demo we will create | Try to find alternative materials online for your own usage

First, a few prerequisites you need to be aware of:

  • Get the latest macOS, iOS and Xcode beta versions available — You don’t want to get stuck due to bugs already solved by Apple.
  • Don’t use the simulator — Always use a Metal supported device.
  • This is beta software — Bugs may and will arise

Ready to see how simple it is? Let’s go.

I assume you have some experience writing Swift code and that you have already created a SceneKit view once. If you haven’t, just start with the Game template and choose SceneKit as your game technology. The template code you get should suffice for this exercise (I actually used it here…)

First step: Create a scene:

let scene = SCNScene(named: "sphere.obj")!

As you can see, it is fairly simple to init a scene directly from an OBJ file. No complex C++ file parser needed. We will use a simple sphere object created in 3D Studio Max that already contains UV coordinates so we can texture it later (if you are unfamiliar with the concept, read about it now).

Second step: Create a camera and position it:

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
scene.rootNode.addChildNode(cameraNode)
cameraNode.position = SCNVector3(x: 0, y: 0, z: 1.5)

This is pretty straightforward if you are familiar with how a scene is represented via a Scene Graph (hence the Node notion).

Third step: Add environment lighting:

let environment = UIImage(named: "IBL.png")
scene.lightingEnvironment.contents = environment
scene.lightingEnvironment.intensity = 2.0

The first two lines are pretty self-explanatory, but you may wonder why we increase the intensity of the environment map. The answer: it just looks very good with the setup of my scene! Yep, it may be hard to adjust, but just like real-world filmmakers do a bit of unexplained black magic to get their shot great, so will you on computer-generated scenes. There is no shame in that :)

An important note: the API makes some assumptions about your image based on its size. Honestly, I find this a bit of a bad coding practice, more suitable for hardware engineers and somewhat inconsistent with the rest of the APIs, but that’s life. Make sure you go over Apple’s documentation and understand the inputs Apple’s APIs expect. In our example, the API infers that we provided a spherical map from the fact that: image_width = 2 * image_height
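As a sketch of that inference, here is a pure Swift helper of my own (not an Apple API). The 2:1 spherical rule is the one used in this example; the 6:1 / 1:6 cube-strip ratios are an assumption based on how Apple’s docs describe cube maps supplied as a strip of six square faces:

```swift
// Sketch of the size-based inference described above. Names are mine;
// only the 2:1 spherical rule comes from this example, while the
// 6:1 / 1:6 cube-strip ratios are an assumption from Apple's docs.
enum EnvironmentMapLayout {
    case spherical   // equirectangular: width == 2 * height
    case cubeStrip   // six square faces in a horizontal or vertical strip
    case unknown
}

func inferLayout(width: Int, height: Int) -> EnvironmentMapLayout {
    if width == 2 * height { return .spherical }
    if width == 6 * height || height == 6 * width { return .cubeStrip }
    return .unknown
}
```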

Fourth step: Add spherical background:

let background = UIImage(named: "IBLBlurred.png")
scene.background.contents = background

The background image we will use is our exact environment map, only blurred with a simple Gaussian Blur filter. We do this only for the user experience, as it’s quite cool; assigning the background has no effect on lighting. As a side note, according to Apple’s documentation, if you have a PBR material in your scene and no environment map is defined, the PBR shader should use the background if it exists. I tried it. It still doesn’t work. Again: this is beta software.
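In practice you would blur the map offline or with Core Image’s CIGaussianBlur filter. Just to make the idea concrete, here is a minimal sketch (my own helper, not part of the demo) of the normalized 1-D Gaussian kernel such a blur is built from:

```swift
import Foundation

// Sketch: a normalized 1-D Gaussian kernel. A Gaussian blur convolves
// the image with this kernel horizontally, then vertically.
func gaussianKernel(radius: Int, sigma: Double) -> [Double] {
    let weights = (-radius...radius).map { i in
        exp(-Double(i * i) / (2.0 * sigma * sigma))
    }
    let sum = weights.reduce(0, +)
    // Normalizing so the weights sum to 1 keeps overall brightness unchanged.
    return weights.map { $0 / sum }
}
```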

Fifth step: Setup your material:

let material = firstNode.geometry?.firstMaterial
material?.lightingModelName = SCNLightingModelPhysicallyBased
material?.diffuse.contents = UIImage(named: "albedo.png")
material?.roughness.contents = UIImage(named: "roughness.png")
material?.metalness.contents = UIImage(named: "metalness.png")
material?.normal.contents = UIImage(named: "normal.png")

I love self-explanatory code :), and as you can see above, there are no magic tricks here. We find the material object we want to edit and assign the correct maps we got from our artist. For more information on assigning contents to these material properties, look it up in Apple’s documentation.

Sixth step: Connect the scene to your SCNView:

let scnView = self.view.subviews[0] as! SCNView
scnView.scene = scene

C’est tout! Your first PBR application is up and running.

I Love It! What’s next?

There are many other topics to cover such as HDR lighting, camera tricks and effects, light probes, Model I/O framework, working with external tools (e.g. 3DSMax, Blender, Maya), modifying shaders and much more. Travel around Apple’s documentation and videos to find these gems. Some of them can save you tens or hundreds of unnecessary hours of coding. Don’t re-invent the wheel.

If you would like to dive deeper into the math and physics of PBR shading, I suggest watching this great lecture by Naty Hoffman from SIGGRAPH 2015.

If you like what you read and would like to keep these coming, please tap ♥ below — It really fuels the next post

Avihay Assouline

Deep Learning @ Snap | xCTO @ Realdrift | xEng. Mgr. @Autodesk