Logistics and Technical Skills Refresher
With the great impromptu short-film submission of a kids' retelling of The Princess Bride behind me, I figured it would be good to get back to Water Hazard between other editing work: a sizzle reel for a comedy group and finalizing a past project.
For some logistics and collaboration on the next shoot, I started playing with Gorilla, Scenechronize, Yamdu, StudioBinder, and Celtx, all studio support software that gives you one place to keep track of scripts, storyboards, shot lists, script breakdowns (who knew how important those were?), call sheets, schedules, master catalogs, and budgets. Overall, Celtx was the best of those I tried, although I liked the look of StudioBinder more and Gorilla is likely more feature-complete. Celtx does more than is necessary for what I'm dealing with, and the script breakdown was incredibly useful for keeping track of all the items called out specifically in the script. You can also share this information with your crew and actors, which in turn helps them keep apprised of what is happening.

In one of the shots, there is a close-up of a thermostat with specific temperatures on it. Rather than doing anything practical, I figured that would be a great chance to do a shot with some level of digital replacement (because hey, why not, it's a chance to improve skills in that area). I've done a few things like monitor replacements before, and they took me a long time and weren't very good, but I wanted to see if I could resurrect that particular skill.

So, I started by filming a short sequence on my iPhone with a tracking marker in it, then looked at the various options for tracking. After Effects has always been able to do it; however, I'm not a guru, so I personally like to spend as little time as possible in AE (that's much more Chad's area of expertise). I have done some playing with planar tracking in Mocha Pro and camera tracking in PFTrack (an early generation), but decided to try some of the newer solutions.
I wanted to see how SynthEyes would work, since it seemed fairly accessible and very capable. It was pretty easy to track the marker on the fridge (I just did it by hand rather than use the auto-track features), and I think it did a decent job of the camera solve, based on the movement I see in the views.
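
Just to ground the idea, here's a minimal sketch of what that kind of hand-started 2D marker track boils down to, using OpenCV's pyramidal Lucas-Kanade optical flow. This isn't what SynthEyes does internally, and the clip name and start coordinates are hypothetical:

```python
# Minimal 2D marker-tracking sketch: follow one hand-picked feature point
# frame to frame with pyramidal Lucas-Kanade optical flow.
import cv2
import numpy as np

cap = cv2.VideoCapture("fridge_clip.mp4")  # hypothetical filename
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Starting position of the tracking marker, picked by hand (x, y in pixels).
pts = np.array([[[412.0, 233.0]]], dtype=np.float32)

track = []  # per-frame 2D positions of the marker
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    nxt, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
    if status[0][0] == 0:
        break  # lost the marker; in practice you'd re-acquire it by hand
    track.append(tuple(nxt[0][0]))
    pts, prev_gray = nxt, gray

if track:
    print(f"tracked {len(track)} frames, last position {track[-1]}")
```

A real camera solve then triangulates many such 2D tracks into a 3D camera path, which is the part the dedicated tools are for.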

Then I exported the track (in the demo version, you get minimal export and no save capability) and tried to load it into AE. That sort of worked, but it wasn't obvious what it did, so I figured I'd go back in time to the tool I used to use, Shake. Thankfully, I managed to get it up and running even on the newest OS X.

It's still a pretty awesome tool for compositing, although I think Fusion is a better replacement for modern times. However, whether it was the way SynthEyes exported the Shake script or my foggy memory of Shake, I couldn't get the end result to composite well with the tracking information. I couldn't make the export work with After Effects compositions either: it loaded fine and has a camera definition, but the camera seemed static rather than match-moving with the tracking info. It might be that I continue to lack skill in both of these tools, or that the limits put on the trial version of SynthEyes mean I wouldn't actually see an effective camera solve in either tool (and I lack the knowledge to verify it was done correctly).

So, with that, since I wanted to refresh some skills, I turned to Mocha Pro and AE and did a simple roto, track, and replacement. Nothing too hard, but it still took me an hour or two to get sorted (pros can do this in minutes), and even taking the time, it's far from perfect: no colour match, slight jitter on the edges, and not enough motion blur on the composited piece. At least it's confirmation I've not completely forgotten everything.
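
For anyone curious what the planar part of that workflow amounts to, here's a rough sketch in the spirit of a corner-pin replacement: estimate a per-frame homography for the planar region via feature matching, then warp a replacement image through it. This is not Mocha's actual algorithm, and the filenames and corner coordinates are made up:

```python
# Rough planar-replacement sketch: per-frame homography + corner pin.
import cv2
import numpy as np

cap = cv2.VideoCapture("fridge_clip.mp4")       # hypothetical filename
ok, ref = cap.read()
ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
insert = cv2.imread("thermostat_plate.png")     # hypothetical replacement art

# Corners of the planar region in the reference frame (hand-picked),
# and the matching corners of the replacement image.
ref_corners = np.float32([[400, 200], [560, 205], [555, 340], [395, 335]])
src_corners = np.float32([[0, 0], [insert.shape[1], 0],
                          [insert.shape[1], insert.shape[0]],
                          [0, insert.shape[0]]])
plate_to_ref = cv2.getPerspectiveTransform(src_corners, ref_corners)

orb = cv2.ORB_create(1000)
kp_ref, des_ref = orb.detectAndCompute(ref_gray, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp, des = orb.detectAndCompute(gray, None)
    matches = matcher.match(des_ref, des)
    src = np.float32([kp_ref[m.queryIdx].pt for m in matches])
    dst = np.float32([kp[m.trainIdx].pt for m in matches])
    # In practice you'd mask the features down to just the planar region.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # ref -> this frame

    # Corner pin: warp the insert through both transforms and composite.
    size = (frame.shape[1], frame.shape[0])
    warped = cv2.warpPerspective(insert, H @ plate_to_ref, size)
    mask = cv2.warpPerspective(
        np.full(insert.shape[:2], 255, np.uint8), H @ plate_to_ref, size)
    comp = np.where(mask[..., None] > 0, warped, frame)
    cv2.imshow("comp", comp)
    if cv2.waitKey(1) == 27:
        break
```

The jitter and missing motion blur I mentioned are exactly the things this naive version also suffers from; the commercial tools earn their keep on those details.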

So, a very simple planar tracking replacement, which won't actually work for the thermostat 3D object insertion I'm envisioning. I'll have to have some further conversations with Chad and leverage his skills, or alternatively, *gasp*, make it a practical effect.
[ Update: 26 February 2017 ]
I've had a few conversations with the owner of the company that built SynthEyes to help sort out why I wasn't seeing what I expected to see. Chad and I looked at the exported data and found exactly what we saw in After Effects: there was no changing camera data being exported. That led us to the conclusion that when SynthEyes exports the first 6 frames of your camera solve data (to test the import into your 3D package), it means the first 6 frames of your original footage and not the footage subselection you were working on (I was dealing with frames 180–345 of a 0–2167 total sequence). We verified that, and once I rolled the range back to frames 0–345, I got what I was looking for: correct camera data for the first 6 frames, which I could then use to match move.
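
In hindsight, a quick script would have caught this immediately, just by checking whether the exported camera values change at all across the exported frames. This sketch assumes a hypothetical CSV dump of the keyframes (the real SynthEyes and AE export formats differ, but the idea is the same):

```python
# Sanity check: does the exported camera actually move?
# Assumed CSV columns: frame, tx, ty, tz, rx, ry, rz (hypothetical format).
import csv

with open("camera_solve.csv") as f:
    rows = [{k: float(v) for k, v in row.items()} for row in csv.DictReader(f)]

for axis in ("tx", "ty", "tz", "rx", "ry", "rz"):
    values = [r[axis] for r in rows]
    spread = max(values) - min(values)
    flag = "STATIC?" if spread < 1e-6 else "moving"
    print(f"{axis}: range {spread:.6f} -> {flag}")
```

A near-zero range on every axis is the "static camera" symptom I was seeing in AE, which in my case meant the export covered the wrong frame range, not that the solve had failed.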

I then downloaded the learning editions of 3DEqualizer4 and PFTrack. Once I watched a tutorial, PFTrack was pretty awesome for rapidly tracking, refining, and orienting the camera in the world. 3DE also seems quite good based on a tutorial showing tracking of both the camera and an object. I assume I can get there with SynthEyes as well, but at the outset it seems not quite as intuitive, and the same exercise of picking user trackers seems to jump off the rails more than with PFTrack. I just haven't played with it enough yet; there's likely something I'm missing.
