Reviewing Nuke’s Virtual Reality Stitch Tool
At First Glance
Beta-testing new software often requires infinite patience and a good sense of humor. Sometimes the button you hit doesn’t work the way it seems like it should. Or doesn’t work at all. Or sometimes, as with the nodal pan settings in the Camera Solver, it works, but only if you put the settings through twice.
I try not to let this discourage me; in fact, I find it entertaining that on very similar shots the Camera Solver will sometimes produce very different settings. Or, if you set the focal length to ‘optimize single’ and bring in a stitched, PTGui-solved image, it seems to overwrite the settings rather than use the data as expected.
One of the simpler challenges I faced was getting one shot stitched that was very similar to another shot but just would not solve. Coming from the compositing world of hacks, I initially tried ideas like warping the images before putting them through the solver to see if it would give me a better solve. I don’t recommend this.
Here’s a different tactic.
I learned, however, that if you have solved a shot with the same cameras and are having trouble on a different shot, you can manually input the distortion and camera-position settings into the Cameras tab, and it will often magically work. Well, not magically. I’m struck, with Ocula and VR in general, by how mathematical solving challenges can be.
If some area isn’t lining up, the Camera Solver node will let you export cameras so you can see whether they match the rig you used. If they don’t, it’s back to the drawing board. If the shot is lining up but there are some minor warping issues, both the Camera Solver and the Monostitcher will let you export STMaps, which, if pre-rendered on a single frame, will drastically cut render times and RAM use on the farm. The Camera Solver also lets you export separate VR_Transforms for serious noodling of the plates.
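To see why a pre-rendered STMap is so cheap on the farm, here is a minimal sketch, in plain NumPy rather than Nuke’s implementation, of what applying one amounts to: each output pixel just looks up a pre-computed (u, v) position in the source, so no solving or warping math runs per frame.

```python
import numpy as np

def apply_stmap(src, stmap):
    """Warp src through an STMap: each output pixel's (u, v) in [0, 1]
    says where in the source image to sample (nearest-neighbour here)."""
    sh, sw = src.shape[:2]
    # u maps to source x, v to source y; clamp to valid pixel indices
    xs = np.clip((stmap[..., 0] * (sw - 1)).round().astype(int), 0, sw - 1)
    ys = np.clip((stmap[..., 1] * (sh - 1)).round().astype(int), 0, sh - 1)
    return src[ys, xs]

# Identity STMap: u, v are each pixel's own normalised position,
# so warping through it returns the source unchanged.
src = np.arange(12.0).reshape(3, 4)
v, u = np.mgrid[0:3, 0:4]
stmap = np.dstack([u / 3.0, v / 2.0])
assert np.array_equal(apply_stmap(src, stmap), src)
```

The per-frame cost is one array lookup per pixel, which is why baking the solve into an STMap once and reusing it beats re-running the solver every frame.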
The Colour Matching tool
The Colour Matching tool seems to work in some situations but not in others. On a darkly lit shot of mine it seemed to make the problem worse, but I saw it solve a daytime shot beautifully on another commercial.
The VR_Sight node and I have a love/hate relationship. For painting, this node uses the alpha channel to calculate where the image pulls through and re-projects, but it is very buggy and doesn’t always hold up when doing patches; it is best used with the RotoPaint node only. Also be aware of how much RAM it takes to render this node. Multiple VR_Sight nodes will eat fast render times for breakfast.
The VR_Sight node also doesn’t seem to output a full latlong. I calculated that it was giving a 1.9:1 ratio instead of a 2:1 ratio when flattening and projecting a plate for stabilization. It hasn’t mattered on my current project, but I could see it becoming an issue in the future because VR is so mathematical in nature.
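For context on why 2:1 matters: an equirectangular latlong maps 360 degrees of longitude across the width and 180 degrees of latitude down the height, so equal angular sampling forces width = 2 × height. A tiny sanity check (the ~1.9 figure being the one measured above):

```python
def is_proper_latlong(width, height):
    # 360 deg of longitude over the width and 180 deg of latitude over
    # the height sample at the same angular rate only when width == 2 * height
    return width == 2 * height

assert is_proper_latlong(4096, 2048)       # a true 2:1 latlong
assert not is_proper_latlong(3892, 2048)   # ~1.9:1, like the VR_Sight output
```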
The VR Transform Node
This node is wonderful for undistorting plates and flattening out horizon lines, to name a few of its features. You can go from fisheye to spherical, and then from spherical to rectilinear, but you must chain two transform nodes to do it.
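As a rough sketch of the math behind the fisheye-to-spherical step, here is the idealised equidistant fisheye model (radius from centre proportional to the angle off the lens axis) in plain Python. Nuke’s actual lens model is more sophisticated, so treat this only as an illustration of the mapping:

```python
import math

def fisheye_to_direction(x, y, fov_deg=180.0):
    """Idealised equidistant fisheye: distance from the image centre is
    proportional to the angle from the optical axis. x, y are in [-1, 1],
    with the image circle at radius 1 and +z along the lens axis."""
    r = math.hypot(x, y)
    theta = r * math.radians(fov_deg) / 2.0   # angle off the lens axis
    phi = math.atan2(y, x)                    # angle around the axis
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def direction_to_latlong(dx, dy, dz):
    """Convert a view direction to longitude/latitude in degrees."""
    lon = math.degrees(math.atan2(dx, dz))    # -180..180
    lat = math.degrees(math.asin(dy))         # -90..90
    return lon, lat

# The centre of the fisheye lands on the centre of the latlong
assert direction_to_latlong(*fisheye_to_direction(0.0, 0.0)) == (0.0, 0.0)
```

Going on to rectilinear is a second, separate projection of those same directions onto a flat image plane, which is why two chained transforms are needed.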
Spherical Transform Node
You can also transform a latlong map into ZXY space, for example with a rotation of RX -90, in order to paint out holes in the floor of your stitch. Then you can rotate it back, and it will re-project your fixes into place.
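The trick above is just a rotation of the underlying view directions. A small NumPy sketch, assuming a Y-up, Z-forward axis convention (Nuke’s own convention may differ): RX -90 brings the nadir onto the forward axis, so the floor sits in the easy-to-paint middle of the frame, and the inverse rotation puts the fix back.

```python
import numpy as np

def rx(deg):
    """Rotation matrix about the X axis."""
    a = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

# Straight down (the nadir) in a Y-up world
down = np.array([0.0, -1.0, 0.0])
# RX -90 maps the nadir onto +Z, i.e. the centre of the latlong frame
forward = rx(-90) @ down
assert np.allclose(forward, [0.0, 0.0, 1.0])
# Rotating back with RX +90 returns the painted fix to the floor
assert np.allclose(rx(90) @ forward, down)
```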
VR Blender Tool
If you want to separate out transforms for manual movement and rotation, be sure to add a VR Blender tool (or put the result through the VR_Monostitcher, though only if your computer can handle the GPU acceleration) after the join views, or there won’t be any change at all. It’s one of those gotchas that isn’t immediately obvious!
Features I’m looking forward to
One big feature request would be a plate-distortion analysis tool for fisheye lenses. Although the methodology of using a VR_Transform to undistort the plate is good in theory, it doesn’t account for the nuances of real-lens distortion that can make exact lineups difficult.
Despite its quirky personality, its many bugs, and my wish-list items, I would have to say that I enjoy the flexibility Nuke affords its artists. Traditional warping and pan-and-tile workflows are time-consuming and frustrating. Additionally, having the power of Nuke to add correctly aligned CG, as opposed to using Kolor or Video Stitch, makes it a great choice for compositing in VR. I would much rather have it than not.