Behind the Network: June 17th, 2022

The Render Network’s Jules Urbach Visited Mograph to Discuss the Future of the Open Metaverse and the Network’s Decentralized Governance Process


Earlier this week, the Render Network’s Jules Urbach sat down with the hosts of Mograph, the leading 3D graphics podcast, to discuss the future of 3D rendering, the open metaverse, and the Render Network’s new community governance process.

The discussion kicked off with an overview of the Render Network’s new rendering, streaming, and NFT-generating technologies, then explored the frontiers of artificial intelligence, holographic technology, information theory, and quantum computing. Finally, the show closed with an outline of the Render Network’s new community governance structure and why it is an important step toward realizing the potential of the open metaverse.

Next Generation Rendering and NFT Technology

Jules kicked off the podcast by discussing some of the technologies outlined in his NVIDIA GTC (GPU Technology Conference) keynote, and elaborated on the integrations between Octane, the Render Network, and Unreal Engine that bridge the gap between cinematic and real-time rendering. He showcased how Octane and the Render Network have been used by leading film studios on virtual productions, such as the UEFA Champions League opening ceremony, and discussed how real-time visualization is becoming increasingly standard in Hollywood productions.

The conversation then moved to Render Network NFTs, which focus on bringing full 3D scene graphs on-chain as ORBX files. Jules discussed how the Render Network is working with Metaplex to extend NFT file types to include ORBX, pairing this with OTOY’s X.IO streaming technology to enable artists to create interactive NFTs and tokenized real-time streaming experiences. David and Matt from the Mograph team discussed how interactivity in NFTs would be a game changer: ORBX files can be updated with OSL (Open Shading Language) code, creating a dynamic experience that could adapt to the time of day or other user-generated inputs. Jules pointed to Beeple’s Crossroads as a good example of a dynamic NFT, where an external input, the winner of the 2020 Presidential Election, changed the state of the artwork.
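As an illustration of the mechanic, here is a minimal sketch in Python (rather than OSL) of how a dynamic NFT’s render graph might map external inputs to scene state. Every name below is hypothetical and is not the Render Network’s or Metaplex’s actual API.

```python
from datetime import datetime, timezone

# Hypothetical illustration only: a dynamic NFT's render graph selects
# scene parameters from external inputs. None of these names come from
# the actual Render Network or Metaplex APIs.

def select_scene_state(oracle_inputs: dict) -> dict:
    """Map external inputs (an oracle feed, the viewer's clock) to
    parameters the scene graph exposes, e.g. through OSL nodes."""
    hour = datetime.now(timezone.utc).hour
    state = {
        # Day/night lighting, the "time of day" example from the episode.
        "sun_intensity": 1.0 if 6 <= hour < 18 else 0.05,
        "sky_preset": "day" if 6 <= hour < 18 else "night",
    }
    # Crossroads-style branch: an on-chain event flips the artwork.
    if oracle_inputs.get("election_winner") == "candidate_a":
        state["variant"] = "scene_a.orbx"
    else:
        state["variant"] = "scene_b.orbx"
    return state

print(select_scene_state({"election_winner": "candidate_a"}))
```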

Jules then outlined some R&D work pushing the boundaries of dynamic NFTs, including feeding hashed Wikidata into the render graph as a supernode in Octane, effectively an oracle supplying data that artists can leverage when making NFTs interactive. He also discussed how all the assets in a scene graph, and even the entire history of work produced on the Render Network, are hashed, giving a deeper level of on-chain provenance and semantic data to everything produced on the Network. This is important because it can enable a more vibrant marketplace on the Network, where not only completed works can be monetized as NFTs, but individual assets within a scene graph can also earn royalties or revenue shares.
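Here is a minimal sketch of what asset-level provenance hashing could look like, assuming a simple manifest-of-hashes scheme; the Network’s actual design isn’t published in this post, so the structure below is an assumption.

```python
import hashlib
import json

# Illustrative sketch of asset-level provenance hashing. The Render
# Network's actual scheme is not described here, so field names and
# structure are assumptions.

def hash_asset(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def hash_scene_graph(assets: dict[str, bytes]) -> str:
    """Hash each asset, then hash the sorted manifest of asset hashes,
    giving one root digest for the whole scene graph. Changing any
    mesh, texture, or shader changes the root."""
    manifest = {name: hash_asset(blob) for name, blob in sorted(assets.items())}
    return hash_asset(json.dumps(manifest, sort_keys=True).encode())

scene = {"mesh/hero.obj": b"...geometry...", "tex/skin.png": b"...pixels..."}
print(hash_scene_graph(scene))
```

A scheme like this is what would let individual assets, not just finished renders, carry their own verifiable provenance for royalties or revenue sharing.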

Artificial Intelligence, Information Theory, Holograms, and Quantum Computing

The next part of the talk focused on emerging trends in computing, ranging from AI and holograms to quantum informatics, and explored the ways the Render Network could integrate AI graphics and simulation into the network over the long term. Jules kicked it off by discussing efforts to develop Neural AI Objects: artificial intelligence-based 3D digital assets. These assets are created by feeding a 3D scene, model, or other asset into the Network and running it through AI training models alongside thousands of other assets, producing a 3D object that can be integrated natively as a 3D input to AI image generators like DALL-E or Midjourney. Jules then discussed how mesh-to-AI processing on the Render Network would be important for the future of procedural rendering and generative art, noting how such assets can be transformed in interesting ways using Octane’s Vectron fractal rendering pipeline.

While AI and neural rendering sound far off, Jules discussed how they are an important part of the holographic imaging stack that will define the media landscape of the future. With AI processing and neural objects it is possible to create a perfect spatial simulation of a 3D asset, where artificial intelligence drives the casting of rays and the bouncing of light. This “neural radiance field,” as Jules described it, is a compressed version of a lightfield that uses AI to build full spatial intelligence into a 3D object, which can then be simulated with full physical accuracy using OctaneRender and Brigade engine path-tracing.
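For the curious, the sketch below shows the standard volume-rendering quadrature behind neural radiance fields (Mildenhall et al., 2020). This illustrates the general technique, not OTOY’s pipeline; the density and color arrays stand in for what a trained network would predict.

```python
import numpy as np

# Standard NeRF volume-rendering quadrature, shown for intuition.
# A trained network would supply density and color per sample along a
# camera ray; here they are stand-in arrays.

def render_ray(densities, colors, deltas):
    """Composite color along one ray.
    densities: (N,) sigma at each sample
    colors:    (N, 3) RGB at each sample
    deltas:    (N,) distance between adjacent samples
    """
    alpha = 1.0 - np.exp(-densities * deltas)                       # opacity per segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha]))[:-1]   # transmittance so far
    weights = trans * alpha                                         # each sample's contribution
    return (weights[:, None] * colors).sum(axis=0)                  # final RGB

n = 64
rgb = render_ray(np.full(n, 0.5), np.random.rand(n, 3), np.full(n, 0.05))
print(rgb)
```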

While AI-assisted holographic rendering has enormous potential, fully rendered, ultra-realistic holograms will also be important for location-based entertainment, concerts, and billboards. Jules discussed the rendering requirements: images of up to 10 gigapixels, on the order of 100k x 100k pixels, are needed to create fully lifelike holograms. He expressed optimism that GPU doubling cycles and 1-nanometer chipsets will make the technology broadly viable in the mid-2020s, and with volumetric ORBX files enabling hybrid cloud pre-rendering alongside real-time on-premise rendering or high-bitrate streaming, he outlined to the Mograph team how the technology will be available in specialized applications even sooner.
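The arithmetic behind those figures is quick to check: a 100k x 100k image is 10^10 pixels, i.e. 10 gigapixels, and even assuming plain 8-bit RGB that is roughly 30 GB of raw data per frame.

```python
# Back-of-envelope arithmetic for the hologram figures quoted above.
width = height = 100_000            # 100k x 100k pixels
pixels = width * height             # 1e10 = 10 gigapixels
bytes_per_pixel = 3                 # assumption: 8-bit RGB, no HDR or depth
raw_frame_bytes = pixels * bytes_per_pixel

print(f"{pixels / 1e9:.0f} gigapixels per frame")
print(f"{raw_frame_bytes / 1e9:.0f} GB of raw RGB per frame")
# -> 10 gigapixels and ~30 GB per uncompressed frame, which is why hybrid
#    cloud pre-rendering plus high-bitrate streaming is needed.
```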

Jules concluded the future-of-computing discussion by describing how artificial intelligence will not replace human creativity but instead enhance it, providing new modalities for creating art, in the same way that photography did not replace painting, nor rendering photography. He also described how quantum computing would not replace rendering, because the technology is much better suited to large-scale simulations than to deterministic rendering, since multiple possible outcomes can be computed simultaneously rather than sequentially. Nevertheless, Jules described quantum information as an interesting lens through which to view the metaverse, one that makes us question and probe the human condition at a deeper level. For example, the metaverse may be seen as a quantum reality, or an information-state system that collapses in ways we don’t fully understand from a classical perspective, containing an indeterminate state of possibilities through phenomena like superposition and entanglement.

The Future of the Open Metaverse

At this point, the discussion pivoted to the future of the Open Metaverse and the new efforts to create community governance for the Render Network that will enable it to scale and become a base layer in the emergent communication and information landscape of the future. Jules described the metaverse as a new way of organizing information along spatial, 3D principles rather than hypertext, almost like a “universe browser” compared to a web browser. He described how the idea of multiverses, a connected universe of things, is an appropriate way to view the metaverse, where you can take a reality and observe it from many different angles.

As an example, Jules pointed to the Roddenberry Archive, which is recreating all the different Starship Enterprise designs and models, the entire narrative arc of how the ship was used in Star Trek, and the other realities it inspired or intersected with. While the Enterprise is being recreated as a 1:1, life-sized digital model using the enormous compute power of the Render Network, it will also have a fully annotated web of documentation about its history that can be explored and contributed to spatially, like a fully immersive Wikipedia page.

As the Web has shown the limitations of centralized entities controlling networks and organizing information, Jules discussed how the decentralized Render Network governance process works and why it will help the Network scale to become a fundamental component of the metaverse. He described the Mozilla Foundation, co-founded by Render Network advisor Brendan Eich, as a good analogy for providing an open and decentralized governance framework for the web. The Render Network has accordingly been contributing to open standards forums and open-source technologies like ITMF to provide the base computing and metadata infrastructure for the metaverse.

Jules then described how the Render Network is decentralizing its governance to grow beyond its initial utility as a peer-to-peer GPU rendering platform and provide an expanded set of services and applications built on top of the network. For the network to grow, it needs to expand its token design to incentivize third-party development and ownership, and it has been working with partners like Multicoin to make the network’s tokenomics more robust. The Network introduced Render Network Proposals (RNPs), through which community members can contribute their expertise and ideas or provide feedback to improve the network’s governance. An open, transparent, and decentralized system, as outlined in the RNPs, will more effectively pool knowledge and expertise across a peer-to-peer community, allowing the network to reach a much wider base of users and applications. To that end, the network is also releasing an open SDK for developers to build new products, experiences, and technologies on top of the network’s GPU computing power in a permissionless way. Work is already underway to bring in partners from across the 3D ecosystem, from other leaders in rendering like Autodesk and Maxon to standards groups like glTF, ITMF, and Khronos, to access the Render Network SDK and contribute to the network’s future.

Jules then described some of the near-term priorities for the Render Network, like automating payments, removing the cumbersome bridging from Layer 1 to Layer 2, and adding mechanisms for feature requests and for submitting future RNPs through the new community Discord. Jules also walked Matt and David through progress on the new ORBX exporter for Cinema 4D, developed in response to user feedback: the exporter now shows export progress and can export in the background while rendering continues. So, for example, if you render a low-sample version of an animation and the client approves it, the ORBX file will already be exported and ready to upload to the Render Network, where the sequence can be rendered at high samples and resolution on thousands of concurrent nodes.

Jules ended by describing additional ORBX functionality, like the video compositing system, as well as work on ORBX export from Unreal Engine, which would let real-time physics be exported. While the Render Network is working on native Cinema 4D integration, ORBX will always be a background toolset for archiving scene data and providing a continuous distillation path for rendering, ensuring that your renders don’t carry third-party dependencies that may be deprecated as companies and software change priorities.
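Purely as illustration, here is a hypothetical sketch of that preview-then-finalize workflow. The actual Render Network SDK and exporter interfaces are not shown in this post, so every function name below is invented.

```python
# Entirely hypothetical sketch of the preview-then-finalize workflow the
# new ORBX exporter enables; the real Render Network SDK may look nothing
# like this. All names are invented for illustration.

def local_render(scene_orbx: str, samples: int) -> str:
    # Stand-in for a fast, low-sample local draft pass.
    return f"preview of {scene_orbx} at {samples} samples"

def submit_to_network(scene_orbx: str, samples: int, resolution: tuple) -> dict:
    # Stand-in for uploading the already-exported ORBX to the network,
    # where the job fans out across many concurrent GPU nodes.
    return {"file": scene_orbx, "samples": samples, "resolution": resolution}

def preview_then_finalize(scene_orbx: str, client_approved: bool):
    """Draft locally at low samples; because the exporter runs in the
    background, the ORBX is ready to upload the moment the client
    approves the animation."""
    draft = local_render(scene_orbx, samples=32)
    if not client_approved:
        return draft
    return submit_to_network(scene_orbx, samples=4096, resolution=(3840, 2160))

print(preview_then_finalize("shot_010.orbx", client_approved=True))
```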

The move to a decentralized governance process, Jules concluded, would help the network achieve the ideals outlined in the Photon Driven Economy blog post released at the network’s inception in 2017: providing the foundation for a vibrant, democratic marketplace where artists can realize their creative visions in a post-industrial information economy, and where node operators can derive intrinsic value and productivity from each GPU computing cycle.

Upcoming Work

Be on the lookout for our next BTN, where we’ll be featuring a Render Network community member in our returning “Creator Spotlight” series!

Join us in the Rendering Revolution at:

Website: https://render.x.io
Twitter: https://twitter.com/rendernetwork
Knowledge Base: https://know.rendernetwork.com/
Discord: https://discord.gg/rendernetwork
Render Network Foundation: https://renderfoundation.com/
