Augmented Reality 911 — USDZ Schemas

Andy Jazz
Published in Mac O’Clock · Jun 15, 2021

Foreword

Universal Scene Description, or USD for short, is a popular file format in the 3D industry that stores a hierarchy of geometry, materials, lights, animations, physics and sound. The zero-compression USDZ file format was specifically tailored to the needs of the iOS AR ecosystem. Pixar's USD family of formats was conceived with Schemas in mind, and Schemas have long been shining in VR. Apple and Pixar's collaboration, however, brings us a new concept: AR Schemas.

An AR USD Schema is a custom extension that lets you pythonically, in other words through the higher-level API, define a new scene containing shaded models with behaviours and their corresponding types of AR anchors.

Of course, you have to write Python scripts and work with the human-readable ASCII version of the USD format, USDA. Before working with AR USD Schemas, make sure you have installed USDZ Tools and properly set up your zsh resource file.

To discover more about USD concepts, visit pixar.com and sidefx.com.

How to create USDA file from scratch

I should say that at the moment I'm running macOS Monterey with USDPython v0.66 installed. To create a .usda file, launch Terminal and execute the following Python script. To start a Python interpreter (REPL) in Terminal, just type python and press Return.

from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew('/Users/swift/Desktop/arScene.usda')
xformPrim = UsdGeom.Xform.Define(stage, '/Transform')
spherePrim = UsdGeom.Sphere.Define(stage, '/Transform/Sphere')
stage.GetRootLayer().Save()

We've just created a polygonal sphere model. Let's open the new USDA file in the TextEdit app. All you'll see there is just a few lines:

#usda 1.0

def Xform "Transform"
{
    def Sphere "Sphere"
    {
    }
}

However, those few lines are enough to fulfil the task of creating a white sphere primitive. You can look at that sphere if you select the arScene.usda file on the Desktop and press Spacebar (the shortcut that opens a Quick Look preview window).

Sphere primitive in arScene.usda file
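By the way, you don't have to edit the USDA by hand to tweak this sphere. Here's a small sketch (reusing the file path from the script above) that reopens the stage in the Python REPL and authors a radius and a display colour:

from pxr import Usd, UsdGeom

# Reopen the stage we saved a moment ago
stage = Usd.Stage.Open('/Users/swift/Desktop/arScene.usda')

sphere = UsdGeom.Sphere.Get(stage, '/Transform/Sphere')
sphere.GetRadiusAttr().Set(0.1)                        # default radius is 1.0
sphere.GetDisplayColorAttr().Set([(0.9, 0.9, 0.9)])    # light grey

stage.GetRootLayer().Save()

Quick Look will pick up the new values the next time you preview the file.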

Modifying USDA file

Let's create a cube painted a grey metallic colour. First we have to compose a metallic shader. It isn't rocket science: we just need the diffuseColor, metallic and roughness parameters.

def Xform "Texturing"
{
    def Scope "Shaders"
    {
        def Material "Metal"
        {
            token outputs:surface.connect = </Texturing/Shaders/Metal/SurfaceShading.outputs:surface>

            def Shader "SurfaceShading"
            {
                uniform token info:id = "UsdPreviewSurface"
                color3f inputs:diffuseColor = (0.75, 0.75, 0.75)
                float inputs:metallic = 1.0
                float inputs:roughness = 0.2
                token outputs:surface
            }
        }
    }
}
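If you prefer to stay in Python, here's a rough sketch of how the same preview-surface material could be authored through the UsdShade API. The file path and prim names simply mirror the USDA above, and the exact connection calls may differ slightly between USD releases:

from pxr import Usd, UsdShade, Sdf, Gf

stage = Usd.Stage.CreateNew('/Users/swift/Desktop/metalMaterial.usda')

material = UsdShade.Material.Define(stage, '/Texturing/Shaders/Metal')
shader = UsdShade.Shader.Define(stage, '/Texturing/Shaders/Metal/SurfaceShading')

shader.CreateIdAttr('UsdPreviewSurface')
shader.CreateInput('diffuseColor', Sdf.ValueTypeNames.Color3f).Set(Gf.Vec3f(0.75, 0.75, 0.75))
shader.CreateInput('metallic', Sdf.ValueTypeNames.Float).Set(1.0)
shader.CreateInput('roughness', Sdf.ValueTypeNames.Float).Set(0.2)

# Expose the shader's surface output and connect the material to it
surfaceOutput = shader.CreateOutput('surface', Sdf.ValueTypeNames.Token)
material.CreateSurfaceOutput().ConnectToSource(surfaceOutput)

stage.GetRootLayer().Save()

Binding this material to a mesh from Python is then a matter of UsdShade.MaterialBindingAPI(prim).Bind(material), which authors the same rel material:binding you'll meet in the next listing.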

With the release of USD, Pixar introduced a concept called the prim. Prims are like RealityKit's entities: nodes in the hierarchical tree of objects that define a stage (the resulting scene graph of prims). For instance, a material is a prim, a mesh is a prim, a light is a prim, etc. An Xform prim stores a transform matrix that is applied to its child prims. def is the keyword that defines a new prim (over is the keyword for overriding an already defined prim).
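To make that hierarchy tangible, here's a tiny Python sketch (the file name is arbitrary) that defines an Xform prim with a child cube; the translation authored on the Xform is inherited by its child:

from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew('/Users/swift/Desktop/primDemo.usda')

# 'def' in USDA is what Define() authors on the stage
xform = UsdGeom.Xform.Define(stage, '/Parent')      # transform-only prim
cube = UsdGeom.Cube.Define(stage, '/Parent/Box')    # child geometry prim

# The parent's transform op moves /Parent/Box as well
xform.AddTranslateOp().Set(Gf.Vec3d(0.0, 0.1, 0.0))

stage.GetRootLayer().Save()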

Now we are ready to delete the sphere and build a cube instead.

#usda 1.0
(
    defaultPrim = "Primitive"
    endTimeCode = 300
    startTimeCode = 1
    upAxis = "Y"
)

def Xform "Primitive" (
    prepend apiSchemas = ["Preliminary_AnchoringAPI"]
)
{
    uniform token preliminary:anchoring:type = "plane"
    uniform token preliminary:planeAnchoring:alignment = "vertical"

    def Cube "SilverBox"
    {
        # Transform and Material
        double size = 0.25
        rel material:binding = </Texturing/Shaders/Metal>
        color3f[] primvars:displayColor = [(0.5, 0.5, 0.5)]
        float3 xformOp:translate = (0.0, 0.1, 0.0)

        # Basic Animation
        float xformOp:rotateY:spin.timeSamples = { 1: 0, 300: 1800 }

        uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:rotateY:spin"]
    }
}

I should give some explanation. "SilverBox", for example, is the name of the prim that will appear in the scene tree as /Primitive/SilverBox. The USDA list-editing keyword prepend in prepend apiSchemas (the opposite of the habitual append) places the anchoring API ahead of any 3D content. We've also applied a spin animation about the Y axis, listing both the translate and the spin ops in a single xformOpOrder.
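For completeness, here's a rough Python sketch of how the same anchoring metadata could be authored programmatically. The paths reuse the example above; note that CreateAttribute, as written here, produces custom varying attributes rather than the uniform tokens of the hand-written USDA:

from pxr import Usd, Sdf

stage = Usd.Stage.Open('/Users/swift/Desktop/arScene.usda')
prim = stage.GetPrimAtPath('/Primitive')

# Mirror the USDA statement: prepend apiSchemas = ["Preliminary_AnchoringAPI"]
prim.SetMetadata('apiSchemas',
                 Sdf.TokenListOp.Create(prependedItems=['Preliminary_AnchoringAPI']))

# Author the anchoring attributes (declared as uniform tokens in the USDA above)
prim.CreateAttribute('preliminary:anchoring:type',
                     Sdf.ValueTypeNames.Token).Set('plane')
prim.CreateAttribute('preliminary:planeAnchoring:alignment',
                     Sdf.ValueTypeNames.Token).Set('vertical')

stage.GetRootLayer().Save()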

When we open the Quick Look viewer we'll see reflections on the metallic surface of the animated cube. These reflections are generated by Pixar's built-in HDR image.

The Preliminary_AnchoringAPI class definition has the following content.

class "Preliminary_AnchoringAPI" (
    inherits = </APISchemaBase>
    customData = {
        token apiSchemaType = "singleApply"
    }
)
{
    uniform token preliminary:anchoring:type (
        allowedTokens = ["plane", "image", "face", "none"]
    )
    uniform token preliminary:planeAnchoring:alignment (
        allowedTokens = ["horizontal", "vertical", "any"]
    )
}

As we can see, at the moment we can use the none, plane, image or face anchor types. I've chosen plane here; for RealityKit this anchoring type is equivalent to AnchorEntity(.plane(..)).

Press the Save shortcut.

Converting in Terminal

I must declare that our .usda file is ready. Within a minute, we will convert it into a .usdz file. For this, open the Terminal app and run this simple command:

usdzconvert /Users/swift/Desktop/arScene.usda

The command works in bash and zsh.
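If you'd rather stay in Python for this step too, USD ships a packaging helper in the UsdUtils module. A hedged sketch, assuming your USDPython build exposes UsdUtils:

from pxr import Sdf, UsdUtils

# Package the USDA (and any assets it references) into a zero-compression .usdz
UsdUtils.CreateNewUsdzPackage(
    Sdf.AssetPath('/Users/swift/Desktop/arScene.usda'),
    '/Users/swift/Desktop/arScene.usdz')

# There is also UsdUtils.CreateNewARKitUsdzPackage(...), which additionally
# checks the package for ARKit compatibility.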

Coding in Xcode

Open a new AR App storyboard template in Xcode 13, then drag the arScene.usdz file and drop it into the Navigator area (the left pane).

Then replace the code in the ViewController.swift file with this code:

import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let usdz = try! ModelEntity.loadAnchor(named: "arScene", in: nil)
        arView.scene.addAnchor(usdz)

        // Animation
        print(usdz.availableAnimations.count)    // prints 1

        let animation = usdz.availableAnimations[0].repeat()
        usdz.playAnimation(animation)
    }
}

Animation doesn’t play automatically in RealityKit. We must start it using the playAnimation(_:) instance method.

Compile!

After all, it would be nice to see the entire scene hierarchy of our .usdz file. It can be printed in Xcode's console.

print(usdz)

Now let's make sure that the anchoring target is a plane.

print((usdz.children[0].anchor?.anchoring.target)!)

It’s true.

// plane(AnchoringComponent.Target.Alignment(rawValue: 2),
//       classification: AnchoringComponent.Target.Classification(rawValue: 18446744073709551615),
//       minimumBounds: SIMD2<Float>(0.0, 0.0))

Now the AR app is able to automatically detect and track a vertical plane and stick our animated silver cube to it.

Cool. We’ve got it. That’s all for now.

If this post is useful for you, please press the Clap button and hold it.

¡Hasta la vista!
