Behind the Network (BTN): May 20th, 2022

A deeper look at some of the highlights we missed in the last GTC update

Render Network

In this week’s Behind the Network (BTN), we take a deeper look at Render founder Jules Urbach’s keynote talk at NVIDIA’s GPU Technology Conference (GTC) 2022. In the presentation, Jules discussed the future of the Render Network and some of the technological transformations coming to rendering, filmmaking, and Web3 over the next decade. This deep dive explores the emerging opportunities for a Web3 metaverse powered by holographic rendering on the Render Network.

In Part 1 of 2, we focus specifically on the rendering roadmap discussed in the talk. Part 2 will focus on the future of Web3 and the open, spatial metaverse.

A Recap of the Mission and Long History at GTC

Jules recounted his long history at GTC, where he has been one of the leading voices in the shift toward GPU rendering in the visual effects, film, and gaming industries, and shared his vision for bringing the magic of holographic rendering to everyone.

One of the main trends Jules discussed was the convergence of offline and real-time rendering for next-generation immersive cinematic experiences. Driving this convergence is standardization across workflows and hardware, which provides the foundation for an open metaverse where 3D content can move frictionlessly between authoring, publishing, and monetization.

The talk then highlighted recent features in OctaneRender 2021 and 2022 that are pushing forward GPU rendering across a wider variety of 3D applications and workflows. Jules also discussed the key roadmap initiatives for Octane, which included highlights such as:

Interoperability

  • Native USD support inside Octane, along with Maxon C4D noises integrated into all Octane DCC plugins, enables scene files to move frictionlessly between 3D apps.
  • A new Hydra Render Delegate makes Octane available for any app that supports Hydra, including an Octane/Render Network connector to NVIDIA’s Omniverse via the Hydra delegate (more on that later in this post).
  • Arnold Standard Surface support enables Arnold materials to render in Octane, providing integration into one of the largest 3D render engines — with full support for all parameters including Arnold Subsurface Scattering and Random Walk.
  • Adobe Standard Surface support, alongside MaterialX and the Hydra scene delegate, provides frictionless connectivity with the Adobe ecosystem.
  • Support for Blender inside Octane Standalone, providing a bridge to the largest open-source 3D ecosystem.
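
To make the interchange idea above concrete, here is a minimal hand-written `.usda` scene: USD's text format is plain ASCII, so the same file can be opened by any USD-aware app. This is an illustrative sketch using only the Python standard library; real pipelines would author scenes through the `pxr` USD API rather than raw strings.

```python
# Write a minimal USD text-format (.usda) scene by hand to show that the
# interchange format is plain, portable text. Illustrative only; production
# tools use the pxr USD API instead of string templates.
from pathlib import Path

scene = """\
#usda 1.0
(
    defaultPrim = "Scene"
)

def Xform "Scene"
{
    def Sphere "Ball"
    {
        double radius = 0.5
    }
}
"""

path = Path("ball.usda")
path.write_text(scene)

# Any USD-aware DCC can now open ball.usda and find the same prims.
```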

Expanded 3D Apps and Services

  • A Metafaces library of facial scans was released, providing premium data sets for digital character creation natively within the Octane ecosystem.
  • A GPU-based post-processing stack, which includes GPU-based AI filters and LensFX, provides a host of new compositing and processing tools in Octane.
  • Brigade real-time path tracing in Octane provides cinematic rendering for real-time applications in gaming and Mixed Reality (MR).

Community Roadmap Features

The Octane development roadmap is also heavily driven by the community of Octane artists, who can request and rank new features for future releases. Some new community-driven features are:

  • Increased support for Ampere features, including faster particle rendering, 6x faster motion blur, and 2x better denoising.
  • Hardened out-of-core geometry support, providing more stability for large scenes.
  • A Time Node, which is helpful for ORBX export and animation sequences.
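
To illustrate what a time node does conceptually, the sketch below models a node that outputs scene time for the current frame and a downstream parameter that consumes it. All class and function names here are hypothetical and do not come from Octane's actual API.

```python
# Hypothetical sketch of a "time node" driving an animated parameter.
# None of these names are real Octane API; they only illustrate the concept.

class TimeNode:
    """Outputs the scene time (in seconds) for the frame being rendered."""
    def __init__(self, fps=24):
        self.fps = fps

    def evaluate(self, frame):
        return frame / self.fps

def rotation_for_frame(frame, degrees_per_second=90.0, fps=24):
    # A downstream node consumes the time value to animate a parameter,
    # here rotating an object 90 degrees per second, wrapped to [0, 360).
    t = TimeNode(fps).evaluate(frame)
    return (degrees_per_second * t) % 360.0

# Evaluating frames 0..47 at 24 fps yields two seconds of animation,
# e.g. for export as an ORBX animation sequence.
angles = [rotation_for_frame(f) for f in range(48)]
```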

New Render Initiatives for an Expanding Web3 Rendering Ecosystem

  • Multi-Stream rendering provides bi-directional rendering between local hardware and an off-premises server. This abstracts authoring workflows away from specific hardware constraints and provides interoperability across hardware ecosystems. For example, with Multi-Stream rendering you can link Octane processes across Windows and iOS, effectively putting an NVIDIA card to work for an iPad via remote GPU processing. A bi-directional rendering workflow like this brings cloud 3D authoring on the Render Network to tens, if not hundreds, of millions of new mobile devices.
  • Multi-Render brings any render engine into the core of Octane and the Render Network, expanding the network's potential user base. Through the Hydra Render Delegate, Octane and the Render Network will be able to support Arnold, Redshift, and Blender's Cycles, reaching all of the largest 3D ecosystems. Additionally, the network has developed partnerships with Google and Azure, providing integrations with the largest cloud providers.
  • Multi-Engine embeds entire applets inside the Octane node system, making 3rd-party tools seamless to use within Octane. New tools include EmberGen, Sculptron, LiquidGen, and WorldCreator. Also on the roadmap is full support for Cinema 4D from Octane, with the ability to send renders to the Render Network without dependencies on third-party plugins. Finally, the team is working with Epic Games on fully integrating Unreal Engine 5 into the Octane core. Support for more toolsets and applications makes the 3D creation ecosystem on the Render Network more robust.
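
The bi-directional flow behind Multi-Stream rendering can be sketched with two queues standing in for the upstream channel (scene edits from a thin client) and the downstream channel (frames from a remote GPU node). This is a toy model, not the actual Octane or Render Network protocol:

```python
# Toy model of bi-directional streaming: a thin client (e.g. a tablet) sends
# scene edits upstream; a remote GPU render node streams frames back down.
# Not real Octane/Render Network code; the queues stand in for the network.
import queue
import threading

upstream = queue.Queue()     # client -> render node: scene edits
downstream = queue.Queue()   # render node -> client: finished frames

def render_node():
    # The remote node consumes edits and returns frames until told to stop.
    while True:
        edit = upstream.get()
        if edit is None:
            break
        downstream.put(f"frame rendered with {edit}")

worker = threading.Thread(target=render_node)
worker.start()

# The thin client sends edits and displays each frame as it arrives.
for edit in ["camera move", "material tweak"]:
    upstream.put(edit)
frames = [downstream.get() for _ in range(2)]
upstream.put(None)   # signal shutdown
worker.join()
```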

Vision for the Next Decade in GPU Technology: Looking Forward to the 2020s

A critical section of the talk previewed technologies the team is working on for the next decade in GPU computing. These technologies inform the future-facing roadmap of Render, including:

  • Scene streaming and Meshlets enable unlimited scene streaming to VRAM, removing constraints on scene size and memory. To enable scene streaming, the team is developing Meshlets (akin to Unreal Engine's Nanite, but for final-frame rendering), which stream geometry directly from a hard drive, bypassing VRAM and RAM limits. As a result, ultra-high-resolution textures and meshes (for example, 128K × 128K textures) don't need to be pre-processed, letting artists create rich and detailed scenes frictionlessly. These scenes can then be sent to the Render Network for final-frame rendering, increasing the complexity of work rendered on the network.
  • Neural AI Objects: Jules introduced a new AI-ready asset type, the Neural Object. Artists will be able to submit a render or photographs to the Render Network, where they are processed and returned as an AI-ready asset. This Neural Object output contains all of the properties of a traditional 3D scene, but it is also an AI system. With a Neural Object processed on the Render Network, artists can create neural-network-based AI filters on top of a 3D asset or render. Artists can also collaborate on building AI training sets, running 3D experiences from these shared training models and 3D renders processed on the network.
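
The out-of-core idea behind Meshlets can be sketched as a small cache that keeps only a budgeted number of geometry chunks resident, loading chunks on demand and evicting the least recently used ones. All names here are illustrative assumptions, not Octane's actual implementation:

```python
# Conceptual sketch (not Octane code) of out-of-core geometry streaming:
# a mesh is split into small "meshlet" chunks on disk, and only the chunks
# needed right now stay resident, capped by a memory budget (LRU eviction).
from collections import OrderedDict

class MeshletCache:
    def __init__(self, load_chunk, budget=4):
        self.load_chunk = load_chunk   # callable: chunk_id -> geometry data
        self.budget = budget           # max chunks resident at once
        self.resident = OrderedDict()  # LRU order: oldest first

    def fetch(self, chunk_id):
        if chunk_id in self.resident:
            self.resident.move_to_end(chunk_id)    # mark as recently used
        else:
            if len(self.resident) >= self.budget:
                self.resident.popitem(last=False)  # evict least recently used
            self.resident[chunk_id] = self.load_chunk(chunk_id)
        return self.resident[chunk_id]

# Simulate a renderer touching chunks as the camera moves through a scene:
cache = MeshletCache(load_chunk=lambda cid: f"geometry-{cid}", budget=4)
for cid in [0, 1, 2, 3, 0, 4, 5]:
    cache.fetch(cid)
# Only the 4 most recently used chunks remain resident.
```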

A Connected Creation Ecosystem for the Future and Beyond

In summary, over the past year, some of the biggest rendering advances have been in creating interoperability across 3D ecosystems, with support for Arnold, Redshift, Blender, and others coming soon. This is an important building block for an open metaverse, where scenes and apps move frictionlessly from authorship to distribution and monetization. The convergence between offline and real-time rendering will also be essential for the next generation of mixed reality and mobile 3D content creation, which has the potential both to bring cinematic rendering to the gaming ecosystem and to embed 3D pipelines into hundreds of millions of mobile devices for the first time. Scene streaming technologies will further remove VRAM and memory hardware constraints, providing near-unlimited complexity within GPU-based rendering architectures, with final-frame rendering then processed in the cloud on the Render Network.

Finally, as we look to later in the 2020s, AI-based rendering and creation will be increasingly widespread, with deep layers of scene intelligence embedded into source 3D assets.

Upcoming Work

In Part 2 of this GTC deep dive, we will recap new Render Network technologies and the network's roadmap for Web3, holographic Mixed Reality, and the open metaverse.

Join us in the Rendering Revolution at:

Website: https://render.x.io
Twitter: https://twitter.com/rendernetwork
Knowledge Base: https://know.rendernetwork.com/
Discord: https://discord.gg/rendernetwork
Render Network Foundation: https://renderfoundation.com/
