Kartaverse Journeys
6DoF VP (Virtual Production) Learning Resources for the Rest of Us
Created By:
- Andrew Hazelden (andrew@andrewhazelden.com)
Overview
Hi. If you are interested in reading through the Kartaverse reference material about the creation of a next-generation XR/virtual production ecosystem, you can dive into the links below.
Google Docs | Kartaverse 6 Project Onboarding Notes
Kartaverse Roadmap Docs
Google Docs | Kartaverse x LexhagVFX | Cloud to Camera VP Workflows
Google Docs | Bidirectional Guided Pathtracing for XR and VP
Kartaverse 6 “Valkyrie Engine” PDF Roadmap
Kartaverse NeRFian Portable 6DoF Capture Rig
A long-term, Kartaverse-led effort is underway to build a new automated camera system “open reference design” focused on NeRF-like output. The goal is to enable low-cost experimentation with 6DoF VP (virtual production) digital environment capture.
Kartaverse 5 Pipeline Guides
A “Kartaverse Workflows | Immersive Pipeline Integration Guide” learning resource was made available in November 2022.
The free guide covers the primary steps required to establish a working content creation pipeline from scratch for artists working at a freelance or small boutique-studio scale of operation. Additional expanded chapters cover broader Resolve/Fusion page usage concepts that will help you take your compositing and 360VR content creation skills to the next level.
The guide is available in two formats:
The (free) Creative-Commons licensed “Immersive Pipeline Integration Guide” is filled to the brim with useful knowledge. It was created to provide a single-stop resource that unifies a lot of separate abstract ideas that cross many domains of skill and specialization in the XR post-production sector.
Kartaverse 6 for Assimilate LiveFX Port
Kartaverse is working with Assimilate’s Mazze Aderhold and Peter Huisma to create a new real-time VP stage plugin integration for Kartaverse <> Assimilate LiveFX. This R&D is happening in LiveFX using an OpenFX-based plugin interface approach.
The third-party plugins will receive the per-frame, on-set ZEISS NCAM camera tracking data as a 4x4 transform matrix delivered via the OpenFX plugin image metadata. This camera tracking data allows third-party image generators to run on top of the active framebuffer “texture surface” in LiveFX.
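As a sketch of how a plugin might consume such per-frame metadata: assuming the tracking data arrives as 16 row-major floats (the metadata layout here is an illustrative assumption, not documented LiveFX behavior), the values can be packed into a 4x4 matrix and applied to 3D points in homogeneous coordinates:

```python
def matrix_from_metadata(values):
    """Pack 16 row-major floats (e.g. from a per-frame metadata
    property) into a 4x4 nested-list camera transform matrix."""
    assert len(values) == 16
    return [list(values[r * 4:r * 4 + 4]) for r in range(4)]

def apply_transform(m, point):
    """Transform a 3D point by a 4x4 matrix using homogeneous coords."""
    x, y, z = point
    p = (x, y, z, 1.0)
    out = [sum(m[r][c] * p[c] for c in range(4)) for r in range(4)]
    w = out[3]
    return [out[0] / w, out[1] / w, out[2] / w]

# Identity rotation plus a translation of (1, 2, 3) in the last column.
meta = [1, 0, 0, 1,
        0, 1, 0, 2,
        0, 0, 1, 3,
        0, 0, 0, 1]
m = matrix_from_metadata(meta)
print(apply_transform(m, (0.0, 0.0, 0.0)))  # -> [1.0, 2.0, 3.0]
```

The same unpack-then-transform step would run once per frame inside the plugin's render callback, keyed to the incoming NCAM tracking sample.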
On-Set DIT Video Village Tech
The Kartaverse VP Tools use the Khronos Group “OpenXR” framework to connect to an on-set DIT video-village hosted passive stereo 3D monitor. This live-stereo review functionality is based around using OpenXR + the Monado framework + the Kartaverse OpenDisplayXR toolset.
Google Docs | OpenDisplayXR Project Update
Monado Based OpenXR Virtual Devices
- Google Docs | OpenDisplayXR Windows Guide
- Google Docs | OpenDisplayXR Linux Guide
- Google Docs | Using a 3D PluraView Display with an M1 MacBookAir
On-Set DIT-Centric Passive Stereo 3D Display Hardware
The primary on-set DIT display system I am testing Kartaverse 6 against is a Schneider Digital “VR PluraView” passive stereo 3D monitor that uses lightweight polarizer glasses, along with a “3Dconnexion SpaceMouse Enterprise” input device.
Schneider Digital Website | VR PluraView Product Page | Biomedical Use Cases
YouTube | Schneider Digital | VR PluraView — Passive 3D Desktop Medical Monitor
Schneider Digital Website | VR PluraView Product Page | CAD & DCC App Use Cases
https://www.3d-pluraview.com/en/application-field/vr-pluraview-in-cax-applications
YouTube | Schneider Digital | 3D PluraView — Passive 3D stereo monitor for CGI and DCC applications
Kartaverse Cloud
Kartaverse 5 has an existing Amazon AWS deployment guide to help XR users make a personally managed cloud instance. This deployment process is being upgraded for Kartaverse 6.
Google Docs | Kartaverse 5 on Amazon EC2 Cloud Deployment Guide
Kartaverse 5 is cloud-ready for efficient multi-view workflow automation.
Scalable VP Edge Compute in the Cloud
Kartaverse 6’s R&D team has pre-qualified the Amazon AWS EC2 “g5.48xlarge” machine type and the Google Cloud “C3D” machine type as best-in-class offerings for next-generation compute infrastructure.
Edge-compute hardware allows for rock-solid reliability when launching AAA-level media projects using best-in-class, 6DoF lightfield-driven volumetric virtual production technology.
Kartaverse Cloud-based XPU render-node systems are typically hosted in the nearest data center region that can stream with ultra-low latency. The live rendered imagery is passed from the data center’s internal network over to the internet backbone.
This happens using ultra-high-speed fiber connections, with network link speeds that operate between 10 Gbps and 100 Gbps. This high-throughput networking allows numerous parallel data streams to arrive directly at the VP stage.
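To put those link speeds in context, a back-of-the-envelope calculation shows how many uncompressed video streams a given link can carry. The UHD/30 fps/24 bits-per-pixel figures below are illustrative assumptions, not a Kartaverse specification:

```python
def streams_per_link(link_gbps, width, height, fps, bits_per_pixel):
    """Return how many uncompressed video streams fit in a network link,
    plus the bandwidth one stream consumes in Gbps."""
    stream_gbps = width * height * fps * bits_per_pixel / 1e9
    return int(link_gbps // stream_gbps), stream_gbps

# Uncompressed UHD 3840x2160 @ 30 fps at 24 bits/pixel ~= 5.97 Gbps.
count_10, per_stream = streams_per_link(10, 3840, 2160, 30, 24)
count_100, _ = streams_per_link(100, 3840, 2160, 30, 24)
print(count_10, count_100)  # -> 1 16
```

In other words, a single 10 Gbps link barely carries one uncompressed UHD stream, while a 100 Gbps backbone link can carry sixteen in parallel, which is why compressed transport or higher-capacity links matter for multi-camera VP stages.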
Distributed Pathtracing
Distributed rendering allows for the rapid ideation of design concepts by using an XPU-driven render cluster to accelerate multi-channel CGI image generation tasks.
This is an example of 1830 GHz of aggregate distributed pathtracing compute using a bucket (tile) rendering approach. The image was generated interactively in 2019 with an XPU-powered render cluster of 11 quad-socket Supermicro AMD Opteron 6276 rack-mounted servers.
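The quoted aggregate figure can be sanity-checked, and the bucket approach sketched, in a few lines. This assumes 16 cores per Opteron 6276 socket running at a 2.6 GHz boost clock (the assumption that reproduces the 1830 GHz figure; the base clock is lower), and a hypothetical 256-pixel bucket size:

```python
def aggregate_ghz(servers, sockets_per_server, cores_per_socket, clock_ghz):
    """Sum the clock rate across every core in a homogeneous cluster."""
    return servers * sockets_per_server * cores_per_socket * clock_ghz

def tile_buckets(width, height, tile=256):
    """Split an image into (x, y, w, h) bucket tiles that can be
    farmed out to render nodes independently."""
    return [(x, y, min(tile, width - x), min(tile, height - y))
            for y in range(0, height, tile)
            for x in range(0, width, tile)]

print(aggregate_ghz(11, 4, 16, 2.6))   # ~1830.4 GHz across 704 cores
print(len(tile_buckets(1920, 1080)))   # -> 40 buckets for an HD frame
```

Each bucket is an independent work unit, so a scheduler can hand the 40 HD-frame tiles out across the cluster's nodes and stitch the finished tiles back into one frame.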
Building Cloud Instance Based Render Clusters
The CPU/GPU performance characteristics provided by Amazon AWS EC2 “g5.48xlarge” cloud computing hardware are exceptional.
This is a cost estimate for a single Amazon AWS EC2 G5 system launched as a spot instance. This system has the balanced hardware needed to allow “thin client”-style edge rendering workflows to happen on-set in a next-gen virtual production environment.
Benchmarking the Kartaverse Cloud
Listed below are the results from common off-the-shelf performance benchmarking apps. You can also run these same benchmarks on your existing on-premise workstation hardware to see how your local gear compares to Amazon and Google Cloud gear.
Google Cloud C3D Benchmarks
Google Docs | Dual Socket AMD EPYC Genoa Server with 360 vCPUs on a Google Cloud Spot Instance
Amazon AWS EC2 Benchmarks
Lightfield Virtual Production
Here are a few thoughts about “prior art” that exists in the lightfield imaging domain from the earlier 2014–2018 period of XR/VR research.
Kartaverse Journeys
A new collection of production case studies from Kartaverse-backed immersive media projects is being assembled and rights-cleared for release.
Exploring the Puerto Rico Caveverse Project
This Medium article explores a long-term Kartaverse-backed initiative called “Enter the Caveverse”. It was a multi-year project with the goal of advancing underground digital-twin mapping via LiDAR scanning in Puerto Rico.
You can explore a real-time, mobile/tablet-optimized 6DoF point-cloud version of the Las Cabras, Puerto Rico cave entrance using the SketchFab interactive 3D model viewer:
SketchFab | Explore the Las Cabras Cave in Puerto Rico
Stay tuned for more information as development work continues on building out the technology behind the Kartaverse 6 software launch in early 2024! 🤘