GTC 2023 Recap + Stable Diffusion + ORBX Exporter 2.0 (Behind-The-Network [BTN])

Looking back at this year’s GTC 2023 talk and recent Render news

Render Network
7 min read · Apr 5, 2023


GTC 2023 was an important event in computing history, taking place alongside the launch of GPT-4, which ushered in the rise of consumer AI as the next era of the information revolution. Over the years, GTC has introduced technologies that have shaken the industry, some of them previewed by OTOY and the Render Network’s own Jules Urbach. This year he once again took the (virtual) main stage for a keynote presentation titled “Rendering the Open Metaverse: Real-Time Ray Tracing, AI, Holographic Displays, and the Blockchain,” and joined a panel of 3D luminaries, “The Future of GPU Raytracing.”

The talk showcased emerging Render Network technologies at the convergence of AI, holographic computing, and blockchain, presenting a unique vision for how AI will intersect with the future of immersive 3D media. While you can view the talk in its entirety here (a free account is required), this blog focuses on key Render Network updates, which will be expanded on with more comprehensive deep dives in future posts.

Multi-Render: Expanding the Network

Over the last year, the Render Network has continued to expand its offerings, integrating third-party render engines and software to broaden the toolsets supported on the network’s decentralized GPUs. Last year saw the announcement of a long-term partnership with Maxon to bring Cinema 4D (C4D) and Redshift to the Render Network. That work has since begun to bear fruit, with tests running C4D and Redshift natively on the Render Network now being previewed.

Showcase of Third Party Tools like C4D Running Natively on Render

On top of the working additions of Redshift and C4D, work has been underway to bring NeRFs (covered in an earlier BTN) to the Render Network. NeRFs and light fields will be created on the Render Network backend and will be editable in Octane on the frontend. These holographic assets, rendered on the Render Network and tokenized as intelligent media objects, can become the primitives of the next generation of the spatial web as the industry moves toward AR and holographic computing.

Example of a NeRF being interacted with in Octane

Augmenting these additions, work on adding Stable Diffusion to the Render Network has been accelerating; a preview of the web app interface for creating Stable Diffusion jobs on the Render Network is shown below. While the feature is not yet running on decentralized nodes, it is a major initiative to give users the ability to scale AI jobs across thousands of nodes.

Testing version of Stable Diffusion on the Render Network

Stable Diffusion jobs will be able to use previously rendered jobs in a user’s Render Network account as inputs, and artists will be able to edit and remix the Stable Diffusion output using Photopea, an in-app image editor. These integrations are the first of potentially many involving AI: the Render Network could expand to include large language model programs like ChatGPT, or allow users to opt their work into training data for large AI models and receive compensation in return, paid directly through the Render Network interface using high-throughput micro-transaction royalty flows. Once again, the combination of generative AI media and holographic formats will help define the future of immersive computing, and perhaps entirely new modes of human creativity.
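At its core, scaling a batch of AI jobs across many nodes means partitioning the work among available workers. The sketch below is a minimal, hypothetical illustration of round-robin partitioning; the names and structure are ours, not the Render Network’s actual scheduler or API.

```python
# Hypothetical sketch: partition a batch of Stable Diffusion prompts
# across render nodes round-robin. Names are illustrative only and do
# not reflect the Render Network's actual API.

def partition_jobs(prompts, num_nodes):
    """Assign each prompt to a node index, round-robin."""
    assignments = {node: [] for node in range(num_nodes)}
    for i, prompt in enumerate(prompts):
        assignments[i % num_nodes].append(prompt)
    return assignments

batch = [f"frame {i:04d}" for i in range(10)]
plan = partition_jobs(batch, num_nodes=4)
# Node 0 receives frames 0000, 0004, 0008; node 3 receives 0003 and 0007.
```

In practice a production scheduler would weight assignments by node GPU capability and availability, but the fan-out principle is the same.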

In line with expanding the Render Network toolset for artists, the network’s SDK has also continued to grow, providing a platform for developers and creators to publish services on the network. Together, these updates showcase the Render Network’s continued efforts to push the boundaries of decentralized GPU computing and next-generation immersive media experiences.

Render Network NFTs

Screen grab from Jules’ talk featuring Render Network NFT creators Beeple and Pak

The Render Network has emerged over the last few years as a leading platform for artists to create 3D NFTs. Unlike any other render farm, the Render Network lets all works be natively tokenized, with the underlying scene graph data encoded into the job and the NFT. At GTC, Jules previewed some applications of NFT technology on the Render Network that move 3D NFT creation beyond static media objects like .jpeg and .mov files to fully immersive, interactive, or novel 3D generative works.

At a basic level, every job created on the Render Network could be made into an NFT through high-throughput minting, tokenizing each individual frame produced on the network on-chain. However, simply minting NFTs through the Render Network has never been the end goal. As previewed last year, NFTs can also be interactive, with fully immersive or interactive streaming NFTs running on the Render Network, powered by RNDR tokens. For example, these NFTs could be consumable 3D experiences paid for in RNDR, similar to a location-based entertainment experience.

Preview of Streaming NFTs built on the Render Network

As previously showcased in a prototype with Beeple, audiences inside these immersive streaming experiences can mint unique works or remix scenes (almost like a 3D painting tool), creating custom works within a 3D scene.

Beta of Render Network’s NFT Remix Suite

Future BTNs will provide a comprehensive discussion of NFT technologies developed on the Render Network alongside the technologies presented at GTC.

RNP-002 and RNP-003 Open for Snapshot Voting

On Monday at 3PM ET / Noon PT, RNP-002: Layer 1 Network Expansion and RNP-003: Resource Acquisition and Allocation for Core Team and Grants opened for a 72-hour community snapshot vote, ending Thursday 4/6 at 3PM ET / Noon PT. Please read both proposals for a detailed discussion of the RNPs.

As a summary, RNP-002 examines the need for the Render Network to move to a higher-throughput Layer 1 blockchain, and proposes Solana as the most viable option due to features like high-TPS (transactions per second) execution that will enable the network to build a unique micro-economy based on the network’s scene graph technology. Additionally, Solana’s NFT compression technology enables emerging use cases, such as tokenizing the final frames and the entire scene graph of render jobs, which, as discussed in the NFT roadmap at GTC, is a vital element for creating next-generation NFTs that move beyond static media files.
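The key idea behind NFT compression is that a single Merkle root stored on-chain can commit to thousands of off-chain leaves, such as per-frame hashes of a render job. The sketch below illustrates that concept only; it is not Solana’s actual concurrent Merkle tree implementation.

```python
# Conceptual sketch of Merkle-tree commitment, the idea underlying
# compressed NFTs: only the 32-byte root needs on-chain storage, while
# the per-frame leaves live off-chain. Not Solana's actual implementation.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash leaves pairwise up to a single root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

frames = [f"frame-{i}".encode() for i in range(1000)]
root = merkle_root(frames)            # 32 bytes commits to all 1000 frames
```

Any individual frame can later be proven to belong to the committed set with a logarithmic-size proof, which is what makes frame-level tokenization economical at scale.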

Meanwhile, RNP-003 proposes a funding mechanism for the Render Network Foundation to accelerate implementation of the Burn-and-Mint Equilibrium (BME) model that was ratified with RNP-001. As presented in RNP-003, accelerating the BME implementation is an important priority for enabling the network to scale to meet user needs across the spectrum of network stakeholders.
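At a high level, a burn-and-mint model destroys the tokens users pay for work and mints a predefined emission to node operators each epoch, so supply stabilizes when burn matches emission. The toy simulation below uses illustrative numbers, not the actual parameters ratified in RNP-001.

```python
# Toy Burn-and-Mint Equilibrium simulation. All parameters are
# illustrative and are NOT the actual values from RNP-001.

def simulate_bme(supply, emission_per_epoch, burns):
    """Apply per-epoch burns (tokens paid for work, destroyed) and a
    fixed mint (distributed to node operators); return supply history."""
    history = []
    for burned in burns:
        supply = supply - burned + emission_per_epoch
        history.append(supply)
    return history

# Burn below emission inflates supply, equal burn holds it flat, and
# burn above emission deflates it.
trajectory = simulate_bme(supply=1_000_000, emission_per_epoch=1_000,
                          burns=[500, 1_000, 1_500])
# -> [1_000_500, 1_000_500, 1_000_000]
```

The equilibrium in the model’s name is the point where demand-driven burn equals the emission schedule, leaving net supply unchanged.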

Voting for both RNP-002 and RNP-003 closes on Thursday 4/6 at 3PM ET / Noon PT and you can access the votes at the links below:

Improved Cinema4D ORBX Exporter

Based on feedback from the creator community, a major priority for the Render Network has been to increase the robustness of the Cinema4D ORBX exporter, the main pathway for users to access the network. A host of improvements has been released, increasing usability, stability, and speed; these are recapped in a set of Knowledge Base guides and tutorials linked below. With an improved exporter, it will be easier for a wider set of creators to leverage the Render Network’s near-unlimited GPUs.

The improvements made to the Cinema4D exporter will continue to be expanded on, creating a more robust user experience for C4D creators, and will be distributed more broadly across the DCC software ecosystem that the Render Network supports, which includes over 20 plugin integrations spanning gaming and VFX to archviz and design.

Best of the Metaverse

  • Corridor Digital, one of the leading, boundary-pushing 3D studios and YouTube creators, cited the Render Network as a critical part of their workflow
  • Beeple’s follow-up to Human One, the ever-evolving 3D holographic installation s.2122, was acquired by the Deji Art Museum, showing the future of next-generation immersive, real-time rendered holographic NFTs. The poignant work visualizes a future of humanity’s climate displacement into deep sea clusters, where humanity adapts to the reality of rising sea levels in a uniquely Beeple-esque vision of the future.
  • Refik Anadol’s “Unsupervised” Closes at MoMA, concluding the first exhibition of NFT artworks at one of the world’s leading modern art museums. Bringing this blog full circle, Refik sat down with MoMA curators to discuss “Unsupervised”, the future of immersive AI artwork, and next generation NFT technologies at GTC.

What’s Next?

Next week we will be recapping the snapshot votes on RNP-002 and RNP-003. Please remember to vote before Thursday 4/6 at Noon PT / 3PM ET!

Join us in the Rendering Revolution at:

Website: https://render.x.io
Twitter: https://twitter.com/rendernetwork
Knowledge Base: https://know.rendernetwork.com/
Discord: https://discord.gg/rendernetwork
Render Network Foundation: https://renderfoundation.com/
