How to Use Material Nodes in Reality Composer Pro’s Shader Graph

Mark Robinson
10 min read · Jun 23, 2023


One of the coolest new features I saw during the WWDC 2023 session videos was the addition of a node-based shader system in the new Reality Composer Pro app, and how you can use it in spatial experiences with RealityKit. Because I’ve worked in node-based renderers (like Redshift) and have a little experience creating custom shaders (in Three.js via GLSL), I was super excited when I saw Niels Gabel’s session on materials in Reality Composer Pro. Not only can you create custom surface shaders (similar to a fragment shader) in the node-based system, you can also create geometry modifier shaders (similar to a vertex shader).

This new node-based way of working really opens the gates for creatives who are intimidated by writing their own Metal code (me lol) so we can make and experiment with custom shaders. All it takes is a little bit of know-how and curiosity to get started…

I plan to dive deeper into working with this node-based system to create some slightly more complex shaders in future posts, but for now I wanted to just give a quick intro into how to get started.

This first post walks you through creating a boilerplate project in Xcode, opening Reality Composer Pro, and assigning your first node to drive the color on a material. If you want to go further, I encourage you to try my tutorial on YouTube where I re-create the macOS Monterey surface shader in Reality Composer Pro.

What you’ll need:
- Xcode 15 Beta 2 (must be on a Mac running macOS Ventura to install)
- The visionOS SDK (add it via Xcode when you open it, or download it separately)
- Some experience with other 3D DCCs (Cinema 4D, Maya, etc.)
What’s helpful to know:
- Basic understanding of shaders.
- Experience with a node-based interface.
- Some math knowledge.

Background
What’s a shader? If you really want to know, do a search and find somebody more knowledgeable than me. But if you can get by with a cursory explanation from an amateur…
A shader is essentially a tiny program that tells your computer how to display something on screen. Shaders are assigned to a 3D object’s geometry and make up the ‘material’ of the object. These materials are actually made of two parts (two shaders).

The first is called a vertex shader (in our Reality Composer Pro context, a geometry modifier). It calculates where the object is in 3D space and how to translate that position into the 2D image you see on your screen. Given the coordinates of each piece of geometry in your 3D scene, the computer uses the vertex shader to work out whether your object is even visible from the current view (maybe something is in front of it, or it’s behind the camera), how to convey depth in the 2D image (objects farther from the camera should appear smaller), and how the camera’s field of view affects the object’s appearance (a wide field of view distorts the sides of the image). The vertex shader is essentially where you can manipulate the shape of the object.

The second part is a fragment shader (in our Reality Composer Pro context, a surface shader). It takes the information handed to it by the vertex shader (i.e., everything that appears on-screen) and figures out how to color each pixel. The computer looks at one pixel, figures out what piece of geometry it’s rendering (via the vertex shader), and then does calculations to determine the pixel’s final color. This can be as simple as saying: this is a red box, so wherever we see this box, it’ll be red. However, that alone wouldn’t give you enough depth to make out the edges or faces of the box. So in addition to color, a fragment shader will also calculate how the object at this pixel is lit by lights in the scene, or, if the object is reflective, how its color should be influenced by the objects around it.

Both types of shaders are topics you can go down the rabbit hole on, and the only limitation will be your knowledge of math and physics. The reason renders in VFX films look realistic is that brilliant minds have been working to encode the rules of how we perceive the real world into generating these images. But besides making photo-realistic renders, we can also use these same tools to display things in a creative way with just some basic math.
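If it helps to see that in code: below is a tiny CPU-side sketch in Swift of the classic Lambert diffuse calculation, the kind of per-pixel math a surface shader runs. It’s purely illustrative (not anything Reality Composer Pro generates), and the function and parameter names are my own.

```swift
import simd

// Illustrative only: the kind of math a fragment/surface shader runs for
// every pixel — here, the classic Lambert diffuse term.
func shadePixel(normal: SIMD3<Float>,
                lightDirection: SIMD3<Float>,
                baseColor: SIMD3<Float>) -> SIMD3<Float> {
    // How directly this bit of surface faces the light
    // (0 = edge-on or facing away, 1 = facing the light head-on).
    let lambert = max(0, simd_dot(simd_normalize(normal), simd_normalize(lightDirection)))
    // Scale the flat base color by that factor; this is what gives a
    // single-colored box visibly distinct edges and faces.
    return baseColor * lambert
}

// A red surface tilted 45° from the light renders at ~71% brightness (cos 45° ≈ 0.71).
let shaded = shadePixel(normal: [0, 1, 1],
                        lightDirection: [0, 0, 1],
                        baseColor: [1, 0, 0])
```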

Okay, let’s dive in!

1. Boilerplate App and Launching Reality Composer Pro
The goal here is to get into Reality Composer Pro as quickly as possible, using Xcode only to set up a boilerplate visionOS app so we can view our scene in the simulator. Once Reality Composer Pro has been installed you can open it directly, but then you’d have to coordinate bringing your scene into Xcode yourself to see it in the simulator. Leaning on Xcode from the start is the easier way to get something into the simulator as fast as possible.

  • Open Xcode 15 Beta 2
  • Choose ‘Create New Project’
  • Go to the ‘visionOS’ tab and select ‘App’. Click Next.
  • Give it a name. Set the initial Scene to ‘Volume,’ and the Immersive Space to ‘None’. This means that our app will launch with a boilerplate 3D scene in a ‘Volume’ window, and we can just edit this scene via Reality Composer Pro without having to write any code.
Name the project, choose Volume for Initial Scene and None for Immersive Space.
  • You’ll be presented with the boilerplate app. If you select the ‘ContentView’ file from the navigator in the left pane, it will show a preview of our app (currently just a floating gray sphere with a button).
The default visionOS Volume App, with boilerplate AR scene.
  • The 3D scene presented in the content view is automatically added by Xcode based on the assets found inside a RealityKit package (the template code that loads it is sketched just below). To edit the scene, navigate to the Package file inside the Packages/RealityKitContent/ directory in the navigator. Once you select this file, you’ll see a view of our scene (our gray sphere). In the upper-right corner you’ll see a button labeled ‘Open in Reality Composer Pro’. Click it and Reality Composer Pro will launch.
Navigate to the Package in the hierarchy, which shows the default scene. Click ‘Open in Reality Composer Pro’ to launch RCP.
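For reference, the wiring that displays this scene is just a few lines in ContentView. Your generated file may differ a bit between Xcode betas, but the core of the template looks roughly like this: a RealityView that loads the ‘Scene’ entity from the RealityKitContent bundle and adds it to the view.

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Load the "Scene" entity from the RealityKitContent package —
            // the same scene we'll edit in Reality Composer Pro.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```

Because the app simply loads whatever is in the package, the edits we make next in Reality Composer Pro show up without any code changes.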

2. Working in Reality Composer Pro
The goal here is to give a quick rundown on how to work in Reality Composer Pro, and move on to using the Shader Graph nodes to tweak our material.

  • When Reality Composer Pro launches, it will look familiar to anybody who’s worked in another 3D DCC before. The main viewport shows the same scene we saw in Xcode, a pane on the left shows the scene’s hierarchy, and a pane on the right shows an inspector where you can adjust parameters for whatever is currently selected.
  • The bottom pane contains four workspaces you can toggle via the tabs in the middle:
  • A. Project Browser: an overview of the assets in the rkassets folder that Xcode generated (to view it in Finder, right-click ‘RealityKitContent.rkassets’ under Project and select ‘Open in Finder’).
  • B. Shader Graph: starts empty, but this is where we’ll edit our material shortly.
  • C. Audio Mixer: starts empty; we won’t use it in this tutorial.
  • D. Statistics: where you can see your scene’s stats to address performance issues and optimize.
  • You can move around your scene by clicking and dragging in the viewport. There are six buttons at the bottom left of the viewport that help you navigate:
      - Selection Mode: has a similar feel to other DCCs. Clicking and dragging orbits, clicking selects objects, and the middle mouse/scroll wheel dollies the camera.
      - Look Mode: the camera stays still; clicking and dragging pans/tilts the camera.
      - Pan Mode: clicking and dragging moves the camera (it actually tracks and pedestals up/down. Not panning lol, bad Apple, bad Apple!).
      - Orbit Mode: clicking and dragging does the same thing as Selection Mode, except you can’t select objects.
      - Dolly Mode: clicking and dragging dollies in and out.
      - Reset Camera: resets the camera to a default position.
The different camera modes for navigating the viewport.
  • Great, that’s enough to get you started. Let’s delete the sphere and the placeholder material from the scene. Just select each in the hierarchy and delete.
Remove the default content from the scene.
  • Let’s import our little landscape model (any USD/USDZ model you have handy will work). Go to File -> Import, select your model, and check the ‘Add to current scene’ checkbox.
Add your model by importing it into the project and add it to the current scene.
  • Great, you’ll notice that because there are no materials in our scene, the model has that pink striped look. Let’s create a material. Click the + icon at the bottom of the hierarchy and select Materials -> Physically Based.
Add a new physically based material.
  • Now select your material in the hierarchy, and click on the Shader Graph in the bottom pane. Click ‘Create Material’ and two nodes will be added — a MaterialXPreviewSurface node and an Outputs node.
Select the new material. Open the Shader Graph pane and click ‘Create Material.’ Two nodes will be populated for you.
  • Assign the material to your geometry so we can see it instead of those pink stripes: Select the geometry in the hierarchy and look at the ‘Material Bindings’ parameters in the inspector pane. Under Binding, select our new material. Your object should now take on the default material look in the viewport.
After selecting your object’s geometry in the hierarchy, the inspector will show options for material bindings. Choose your new material from the ‘Binding’ dropdown.
  • Let’s turn our attention to the Shader Graph area. What are these two nodes? The Outputs node receives the data that is fed to the renderer to determine how our object is rendered, and it has two inputs: Custom Surface and Custom Geometry Modifier (aka fragment shader and vertex shader). We’re just going to work with the Custom Surface data, and we’ll leave the Custom Geometry Modifier port empty. The MaterialXPreviewSurface shader has a series of properties that can simulate real-life materials (color, roughness, metalness, opacity, etc.). You can either set these values to a constant in the Inspector (for instance, you could give the entire object a roughness value of 1.0 for a completely rough surface), or you can drive these values by piping data into the ports on the node (for instance, a roughness texture loaded in a texture node and fed into the roughness port). You can even drive promoted inputs from Swift at runtime; see the sketch after this walkthrough.
Anatomy of a node.
  • Since we’re just trying to get the hang of the shader graph, let’s start by adding a simple flat color to drive the material’s Diffuse Color. Click the ‘+ New Node’ button in the top right.
To add a new node, click the + New Node button, or just double click in an empty space of the shader graph window.
  • A dialogue pops up showing all the available nodes. Search for color. Select ‘Color3 (Float)’, and the node will be added somewhere on the shader graph. You can click and drag the node to reposition it if you like.
Search ‘Color’ in the dialogue and select the Color3 (Float) node.
  • Now click on the Color3 (Float) node (it will be labelled ‘Constant’ by default, but you can rename it as you see fit). When you click on it, a color picker appears in the inspector next to ‘Value’.
When you select a node, its available properties will be displayed in the Inspector.
  • Click it and pick a color. You’ll notice our object didn’t update in the viewport. This is because while we have the color data available to us now, we aren’t using it anywhere.
Pick a color. After editing the value input, you won’t see the color reflected on your model yet.
  • To use the color we created, click on the output port (the little gray dot on the right side) of the Color3 node and drag to the ‘Diffuse Color’ input on the MaterialXPreviewSurface node.
To connect nodes, click on the output of one node and drag to the appropriate input on the receiving node.
  • Ta-Da! Our model is colorful!
Now we can see that the input value from the Color3 (Float) node is driving the diffuse color of our material!
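A peek ahead: values like this don’t have to stay hard-coded in the graph. If you promote a node input to a material input in Reality Composer Pro, you can drive it from Swift at runtime via RealityKit’s ShaderGraphMaterial. Here’s a minimal sketch; the material path ‘/Root/MyMaterial’ and the input name ‘BaseTint’ are placeholder names for whatever you use in your own project.

```swift
import RealityKit
import RealityKitContent
import CoreGraphics

// Load the Shader Graph material from the RealityKitContent package and set
// a promoted input from code. Names here are hypothetical: "/Root/MyMaterial"
// is the material's path inside Scene.usda, and "BaseTint" is a Color3 input
// promoted to a material input in the Shader Graph.
func makeTintedMaterial() async throws -> ShaderGraphMaterial {
    var material = try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)
    // Override the promoted input instead of the constant set in the graph.
    try material.setParameter(name: "BaseTint",
                              value: .color(CGColor(red: 1.0, green: 0.4, blue: 0.2, alpha: 1.0)))
    return material
}
```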

3. View it in AR via the Preview window or the Vision Pro simulator. To send the edits you made in Reality Composer Pro back to Xcode, all you have to do is save the scene. Once you do, the Preview of your ContentView will be updated with the changes. If you want to see your Volume in the Apple Vision Pro simulator, just build and run the Xcode project and the simulator will launch. Select your app from the home screen in the simulator, and voila! Your scene is available to view in AR!

4. Next Steps: In the follow-on posts in this series, I’ll show how to use nodes to make slightly more complex shaders, like the one below, which is colored based on the position of the object and changes color over time. Until then, happy experimenting!

In future posts, I’ll go over how to use the nodes in the shader graph to create more custom shaders, like this one that utilizes the position of the model for color and changes over time.

If you have any questions or notice an error, please let me know by leaving a comment! And if you found this helpful please let me know by applauding or following me.

Happy noding!
