Next Generation NFTs: Behind-the-Network — Breakpoint Part II Nov. 18, 2022
Following up on last week’s Behind-the-Network (BTN), and somewhat lost in a fast-moving news cycle, Breakpoint 2022 offered a window into the future of Web3. While last week’s breakdown focused on the cryptographic and governance changes coming to the Render Network, this week’s BTN dives deeper into the future of NFTs and the metaverse.
Like crypto more broadly, NFT development has ebbed and flowed over the last few years in cyclic, somewhat fractal patterns. Technologically, NFTs are at an inflection point: as they converge with new immersive and real-time information technologies, they can become a revolutionary medium for digital media, art, and commerce in the coming years. This post explores some of the technologies Render is developing to power the future of NFTs.
Immersive Streaming NFTs
Earlier this year at GTC 2022, Jules Urbach previewed the potential of “Streaming NFTs” on the Render Network. Previewed as part of the larger NFT creation pipeline shown during that presentation (covered in more depth in the May 27th BTN), these NFTs introduced a more experiential, interactive take on the format. At Breakpoint 2022, Jules expanded on the technology behind them, previewing the future of real-time, on-chain generative and remixable 3D art using Render.
As discussed in the GTC 2022 talk, the Render Network already functions well as an NFT creation pipeline: creators such as Pak have been able to take individual frames and other elements of their scenes and mint them as NFTs. This makes NFT creation easier while providing a deep level of provenance for these assets. Yet to date, NFTs have mostly been confined to static .jpeg and .mov formats, when they could become fully interactive 3D assets that leverage the entire Render scene graph. To enable this, Render is developing streaming NFT technologies that bring real-time interactivity to on-chain digital assets.
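The provenance idea above rests on content-addressing a scene description. As a minimal sketch (not the Render Network’s actual implementation, and the scene fields here are invented for illustration), a scene graph can be serialized to canonical JSON and fingerprinted with SHA-256, producing the kind of deterministic hash that could anchor an asset on-chain:

```python
import hashlib
import json

def hash_scene_graph(scene: dict) -> str:
    """Return a deterministic SHA-256 fingerprint of a scene description.

    Canonical JSON (sorted keys, fixed separators) ensures that two
    logically identical scenes always hash to the same digest.
    """
    canonical = json.dumps(scene, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical scene fields, purely illustrative.
scene = {
    "geometry": ["sphere_01", "plane_02"],
    "materials": {"sphere_01": "glass", "plane_02": "matte"},
    "camera": {"fov": 45, "position": [0, 1.5, 4]},
}
print(hash_scene_graph(scene))  # 64-character hex digest
```

Because the hash changes if any element of the scene changes, each remix or derived frame gets its own verifiable identity, which is what makes per-element provenance possible.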
In its current phase, these NFTs are streamed from a live Network node, accessed through the browser of the user’s choice. Like the rest of the Render Network, this keeps the client lightweight by handling streaming and rendering on the backend. As Jules noted, the same process could potentially be extended to let users stream entire applications in-browser through the Render Network. OS-locked applications, like Octane X for the iPad or other potential additions, could then be accessed through a registered Network node and operated from any device with access to the Render Network, including mobile devices. This provides both a thin-client authoring platform and a way to consume fully immersive, interactive media, and bringing it on-chain is an important part of the future of NFTs.
Real Time Remixable and Generative 3D NFTs
Beyond streaming, Jules dove into the details of real-time remixable NFTs on the Render Network using a Beeple demo scene. As previously covered, almost every element of a scene on the Render Network can be converted into an NFT using the Render Network NFT creation pipeline, because the entire Render scene graph is hashed on-chain. As a result, scenes can literally come alive as fully immersive, interactive digital assets, opening up a wide range of applications for NFTs.
Generative or remixable NFTs on the Render Network let users build on an existing scene, creating a collaborative art process between artists and collectors, with each unique work hashed and minted on-chain. As demoed, NFTs can be remixed through a built-in creation tool that adds items (randomly generated, purchased, or earned through various on-chain mechanisms) to the scene: rare objects, lighting, coloring, camera-angle adjustments, and more. Render demoed an experience where users build their own NFT from a Beeple Everyday, showing how NFT creation can become an interactive process. In the demo, the NFT provides a limited-time ticket inside a streaming app to produce a unique derivation of a Beeple Everyday, using a randomized palette of objects and a time limit. The resulting scene is hashed and minted on Render, yielding a 1/1 NFT with unique, granular metadata and rarity. Together these layers create a dynamic system for NFT creation in which value is baked into the NFTs from the very beginning.
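The remix flow described above can be sketched in a few lines. This is a hypothetical illustration, not Render’s implementation: the palette, rarity tiers, and weights are invented, and the “mint” step is reduced to computing a deterministic token hash over the derived scene:

```python
import hashlib
import json
import random

# Hypothetical object palette with rarity tiers (illustrative only).
PALETTE = {
    "common": ["cube", "sphere", "torus"],
    "rare": ["crystal", "hologram"],
    "legendary": ["monolith"],
}
WEIGHTS = {"common": 0.80, "rare": 0.18, "legendary": 0.02}

def remix(base_scene: dict, seed: int, n_objects: int = 3) -> dict:
    """Derive a unique remix of a base scene from a random seed."""
    rng = random.Random(seed)  # deterministic: same seed -> same remix
    tiers = rng.choices(list(WEIGHTS), weights=list(WEIGHTS.values()), k=n_objects)
    added = [{"tier": t, "object": rng.choice(PALETTE[t])} for t in tiers]
    derived = dict(base_scene, remix_objects=added, seed=seed)
    # Hash the derived scene (before attaching the hash itself) to give
    # the 1/1 result a verifiable identity with granular metadata.
    canonical = json.dumps(derived, sort_keys=True)
    derived["token_hash"] = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
    return derived

nft = remix({"base": "everyday_5000"}, seed=42)
```

Seeding the generator is the key design choice: it makes every derivation reproducible and auditable, while the rarity weights give each minted remix the granular metadata the demo described.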
Both of the above features are still in development, and Render will share more information as they are fleshed out. One exciting direction is bridging the gap between generative art and 3D art.
The Future of the Open Metaverse and the Roddenberry Archive
Finally, Jules concluded the talk with a look at the work with the Roddenberry Estate to develop a canonical on-chain archive of Gene Roddenberry’s lifetime of works. He described the multi-decade effort to build a complete Star Trek universe, as told through the Starship Enterprise, whose Holodeck, a fully immersive simulated reality, provides a Turing-complete world. With deep layers of authentication, provenance, and ontological data put on-chain, Jules discussed how it is possible to build the first living blockchain archive of one of the most important narrative universes of the 20th century. He concluded with previews of life-sized, fully immersive 1:1 Starship Enterprises and virtual production features, all being developed as immersive, tokenized blockchain experiences using the Render Network’s rendering, streaming, and tokenization toolset.
Jules closed with a call to build towards an open metaverse, echoing Gene Roddenberry’s optimistic vision for humanity and the remarkable creative tools that make it possible to realize awe-inspiring creative visions.
Best of The Metaverse
- Render has started sharing Tutorials and Guides for making it easier to use the Render Network — the first guide is now available on exporting scenes from C4D to ORBX — view it here.
- Octane artist Refik Anadol opens a show at The Museum of Modern Art (MoMA), becoming the first NFT artist exhibited there. The collection combines immersive 3D and StyleGAN AI technology, showing the future of GPU-rendered digital art — see NVIDIA’s post on the collection here.
- Ryan Zurrer, technologist and owner of Beeple’s iconic holographic NFT, Human One, wrote a fantastic Twitter thread on the future of NFTs, immersive media, and the metaverse — take a read here.
- USDC issuer Circle adds native support for Apple Pay, creating an important new rail between traditional fiat payments and the virtual economy of cryptoassets — read more here.
There will be no Behind-the-Network (BTN) next week due to the Thanksgiving holiday; however, stay tuned for exciting news about the Black Friday release of the new Octane subscription, which provides access to the Render Network.
Join us in the Rendering Revolution at:
Knowledge Base: https://know.rendertoken.com