David Ariew Creator Spotlight: Interdimensional Shift

Published in Render Network · 13 min read · Dec 15, 2023

A deep dive into the process behind a groundbreaking creative journey, part 1 of 2 in a series on exciting projects in new formats

Renowned for his artistry that blends skillful technical expertise with a bold and creative approach, David Ariew — affectionately known as ‘Octane Jesus’ — has made a significant impact in the world of 3D art since discovering his passion for Cinema4D and Octane Render in 2013. His impressive journey has led to collaborations with the likes of Beeple, Deadmau5, Zedd, Katy Perry, Keith Urban, Excision, and many more.

In a Render Network exclusive interview, David delves into his journey into immersive digital art, focusing on his latest work, “Interdimensional Shift,” recently featured at the Digital Art Fair Asia in Hong Kong in October.

David Ariew — aka Octane Jesus — the artist behind ‘Interdimensional Shift’ and ‘Quantum Transcendence’

What inspired you to create the “Interdimensional Shift” VR meditation, and how did you conceptualize the journey throughout the experience?

Over the last several years, David has immersed himself in a style of work described as ‘infinite mirror rooms,’ combining fractal geometric rendering, stunning lighting patterns, and distinctive soundscapes. The spark for ‘Quantum Transcendence’ came while working on a job for country music artist Keith Urban, which let David expand the ‘mirror room’ format and produce over 100 frames in a single day, something that had previously seemed out of reach for him. As he put it:

“After a decade of work as a 3D artist, this was the most free I’d ever felt, and it was also an extremely quick and fruitful process too.”

“Astral Radiance”

In his early NFT work in 2021, David built his style around pioneering ‘infinite mirror rooms,’ a type of digital art that combines the visual immersiveness of 3D with the procedural fractal rendering and geometry often seen in generative art. This distinctive form leverages digitally native media to create expansive virtual worlds that are impossible to replicate in physical media.

Over the course of 2022 and 2023, David entered a new creative phase, experimenting with new mediums beyond 2D video outputs. This shift led to the creation of his longest piece, an 11-minute meditative virtual reality artwork. Encouraged by his Discord community, the artist decided to venture into VR, starting with a 360° render and progressing to more complex, fully 3D VR pieces. The transition proved deeply moving for him: the high-resolution, 3D animated immersive scene he created felt like a breakthrough, opening up a new dimension in his art.

“The second scene I rendered felt so vast and intricate, I started crying. It was unlike anything I’d ever seen before and I was honestly shocked that it was my creation. I finally felt that I’d found a medium where my art could graduate beyond something people look at, say ‘cool!’ and swipe past, to something people could experience, and that could leave a profound impact on them.

I feel as if I’ve cracked a hole into another dimension, the mirror realm, and all I want to do is explore and reveal this place for as long as I live!”

This creative breakthrough was enabled by a tremendous amount of technical work, which David walks us through next, drawing on recent projects including ‘Quantum Stargate,’ auctioned at Phillips.

Quantum Stargate auctioned by Phillips in early December

What challenges did you face designing an immersive art experience in VR compared to those you have created in 2D?

“With 2D renders, I could push the look much further, and create much crazier distortion effects. A lot of the tricks in my bag like using insanely wide fisheye lenses or interesting bokeh and shallow DOF looks simply don’t translate to VR. With VR, you basically have one lens, and it’s a full 360 of the scene. I’ve also found that while shallow DOF is possible, it doesn’t look great because your eyes are used to doing the focusing for you. It just kind of looks fuzzy, blobby and weird in my opinion.

I also used to do a lot of tricks that would create extra distortions like putting a mirrored or glass torus close to the lens, but because VR reveals all the true dimensions of the scene, those objects close to the lens just immediately hurt your brain, so that’s a definite no-go. Similarly, the visual language I often used for concert visuals is too energetic, and the second you try to ‘do a barrel roll’ inside VR, you’re going to make people either topple over or puke, or both.

That being said, though there are constraints, there are absolute benefits to the medium too. You get to communicate in depth, and your art turns into something that wraps around the viewer completely. Just moving the camera forward will make the viewer feel like they’re flying, and your creations can be anything from a tiny world to an awe-inspiring palace. Nothing communicates scale better than VR, and the more you play with it the more you’ll realize that you’re now responsible for the sensations the viewer experiences, which is a godlike power.”

For those wanting to try experimenting in VR, could you walk us through the technical process of creating works like “Interdimensional Shift” and “Quantum Stargate”?

Select Geometry — The process begins with selecting some form of geometry, such as a cube or platonic object, though any shape could work.

“I start with some form of geometry whether it’s a simple cube, or a platonic object. . .”

Apply Materials — A perfectly mirrored material is applied to the geometry.

“I blend between a glossy material with index set to 1 (which is actually TOO reflective alone) and one with index of 8. I dial in the shader by sliding the blend more towards one or the other, to get more or less intense reflectiveness.”

Bevels and Lighting — Bevels are used in all mirror rooms to catch the light and help the viewer understand the geometry.

“I use bevels in all of my mirror rooms, because those are what catch the light and allow the viewer to understand the geometry of the space, even if it’s just a hint of light catching. . .”

Lighting — David uses objects like chandeliers or bulbs but, most often, atom arrays with emissive textures to create dynamic light patterns.

“I often use objects like chandeliers, or lots of little bulbs to create interesting patterns of light, but most often I go with simple atom arrays that have an emissive texture within them, so that the emission gets broken up. I animate the texture within the array so that the reflections dance around, creating a ton more complexity for very little work.”

Animate Spaces — The rooms or spaces are animated from a compressed state to expansion, creating a reveal effect.

“I squish the rooms down to practically nothing and animate them expanding, which creates a surprisingly beautiful reveal effect, and I fly cameras through the spaces, some of which are hallways with changing dimensions so that at times they feel more closed off and then at other times the space opens up.”
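
For readers who want to experiment with the same reveal, here is a minimal sketch of what that squash-and-expand curve might look like. The duration, starting scale, and cubic ease-out are illustrative assumptions rather than David’s actual keyframes, which he sets directly in Cinema 4D.

```python
# Illustrative "squish down, then expand" reveal curve. All numbers are placeholder
# assumptions; in David's workflow this is keyframed directly in Cinema 4D.
def room_scale(t: float, duration: float = 5.0, start: float = 0.01) -> float:
    """Scale factor at time t (seconds): eases out from nearly collapsed to full size."""
    u = min(max(t / duration, 0.0), 1.0)      # normalised time, clamped to [0, 1]
    eased = 1.0 - (1.0 - u) ** 3              # cubic ease-out: fast start, gentle settle
    return start + (1.0 - start) * eased

if __name__ == "__main__":
    for t in (0.0, 1.0, 2.5, 5.0):
        print(f"t={t:>4}s  scale={room_scale(t):.3f}")
```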

Adapting Camera for VR — The camera must be set to a spherical panorama in Octane with the stereo option ticked.

“From there, the interocular distance has to be dialed in, and I’ve found the default to be anywhere between 2x-10x too high, meaning that the scene looks way too small when brought into VR. Once the value is reduced though, the spaces start to feel massive!”
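
The effect David describes follows a simple rule of thumb: in stereo rendering, the world appears scaled by roughly the ratio of a human interocular distance to the camera’s. A minimal sketch with illustrative numbers, not values from his scenes:

```python
# Rule of thumb behind the effect described above: halving the camera's interocular
# distance roughly doubles the apparent size of the scene. Numbers are illustrative.
HUMAN_IPD_CM = 6.4  # average human interocular distance, roughly 6.3-6.5 cm

def apparent_scale(camera_ipd_cm: float, human_ipd_cm: float = HUMAN_IPD_CM) -> float:
    """How large the scene feels in the headset relative to its modelled size."""
    return human_ipd_cm / camera_ipd_cm

# A default left ~5x too high makes the space read at ~1/5 of its intended size.
print(apparent_scale(camera_ipd_cm=32.0))   # ~0.2 -> everything feels tiny
# Dialled down to human scale, the same space reads at full size and feels massive.
print(apparent_scale(camera_ipd_cm=6.4))    # 1.0
```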

Preview Animation — David uses the Quest 3 for previewing animations and aims for a maximum resolution of 4500x4500 to avoid frame drops.

“I’ve been using the Quest 3 lately, which is amazing and cheap, all things considered. I’ve found the highest I can crank the resolution is 4500x4500 before I start to drop frames, but that’s also because I need very little compression as my art has tons of high frequency detail and turns to mush with compression very quickly.

I just compress to h.264 at 180mbps or so. At first I was using VBR, but lately I’ve just been going with CBR. I’m also using 30fps, mostly because if I go to 60, it’s too much data and the experience won’t play back smoothly.

I can’t WAIT for the Apple Vision Pro, mostly because I think it’ll be able to handle 8000x8000 or higher, ProRes 60fps (fingers crossed)!”
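
For readers who want to try a similar preview pipeline, here is one possible encode built around the numbers David mentions (H.264 at roughly 180 Mbps, approximated CBR, 30 fps). It assumes ffmpeg with libx264 is installed; the frame-sequence and output paths are placeholders, not his actual settings.

```python
# One possible encode matching the numbers above: H.264, ~180 Mbps near-CBR, 30 fps.
# Paths are hypothetical; ffmpeg with libx264 is assumed to be available.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "30",
    "-i", "frames/shot_%04d.png",   # rendered frame sequence (placeholder path)
    "-c:v", "libx264",
    "-b:v", "180M",                 # target bitrate
    "-minrate", "180M",             # pin min/max to approximate constant bitrate
    "-maxrate", "180M",
    "-bufsize", "360M",
    "-pix_fmt", "yuv420p",
    "preview_4500x4500.mp4",
], check=True)
```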

Rendering and Playback — Once rendered, the file is transferred to Oculus via a USB-C cable. The final content is played using SkyboxVR (“the best player for 360 stereo videos”).

“At first, I rendered my experience as side-by-side but I discovered that this creates more vertical resolution than horizontal (think about two portrait images squished into a square side by side, vs two horizontal images, one on top, one on bottom) so I ended up re-rendering the whole animation as top/bottom. With SBS, the center point looked fuzzy, but looking down or up was incredibly sharp. With T/B, all the resolution felt more even across the experience, and so that’s what I went with.”
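
The arithmetic behind that choice is straightforward. Assuming each eye receives a full 360x180-degree equirectangular panorama, packing the two views into a 4500x4500 frame gives a lopsided pixel density side-by-side and an even one top/bottom:

```python
# Pixels per degree of the 360x180 equirectangular panorama each eye receives,
# for the two ways of packing stereo views into a 4500x4500 frame.
FRAME_W = FRAME_H = 4500

def px_per_degree(eye_w: int, eye_h: int) -> tuple[float, float]:
    """(horizontal, vertical) pixel density for one eye's panorama."""
    return eye_w / 360.0, eye_h / 180.0

# Side-by-side: each eye is a 2250x4500 portrait slice -> sharp looking up/down,
# fuzzy at the centre of the view.
print("SBS:", px_per_degree(FRAME_W // 2, FRAME_H))   # (6.25, 25.0)
# Top/bottom: each eye is a 4500x2250 landscape slice -> even density in both axes.
print("T/B:", px_per_degree(FRAME_W, FRAME_H // 2))   # (12.5, 12.5)
```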

Considering that “Interdimensional Shift” was designed as a meditative visual VR experience, how do you envision the convergence of art, emerging AI, and spatial media technologies playing a role in the fields of mental health and psychology?

“People would go in with the expectation of staying there for maybe a minute, but stay for 10. They’d come out with their faces changed, some elated, some relaxed, others thoughtful or even crying. People would bring their families back the next day for the experience. Someone thought they heard voices from their past and came out crying, another person felt that it relieved their existential anxiety and showed them that passing from this life to the next would be okay, and another told me that I’d discovered the Realm of the Gods, and that I needed to protect this technology and not let it get into the wrong hands!”

This outcome made the artist realize his work’s potential for healing and emotional impact, far beyond his initial career ambitions. His goal has shifted from creating visually “WOAAHHH” music videos and films to VR and immersive installations that “fill people with positive energy.”

“This was when it really dawned on me that I’ve created something that truly affects people, that could be used for healing purposes, whether that’s in meditation, in a hotel lobby, or as someone gets treatment at a hospital. Early in my career, I never would’ve thought that my work could be used for such a beautiful purpose.”

How David envisions the convergence of art, emerging AI, and spatial media technologies playing a role in the fields of mental health and psychology:

“Of course, this is all thanks to the amazing technology that’s at our disposal nowadays. VR combines so many of these inventions together into one device, and it really is like magic. AI is evolving so rapidly that it’s making our collective heads spin, and I can’t even speak to the infinite possibilities on the horizon here. It’s all very unpredictable, incredibly exciting, and terrifying all at once. But I do think all these technologies can be used for good, to help people’s mental health.

One final comment I’ll leave you with from the art fair is someone who said ‘wow, I never knew technology could be spiritual before.’ We can actually craft spiritual experiences here that uplift humans and bring them into new realms never before imagined.”

https://twitter.com/rendernetwork/status/1710448315727319191?s=20

How did the Render Network support your creative process and the final output of your VR experience?

“The first time I rendered my VR experience with my 8 x 3090s, it took over a month. . .”

All of that changed when David was given the opportunity to re-render the entire piece as top/bottom using the Render Network.

“It was incredible seeing 300 machines spool up at once, each taking a single frame. The process was literally orders of magnitude faster. Shots that would take my machines several days to finish would be done in a couple hours. That’s a future I can get behind!”
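
The scale of that speedup follows from simple parallelism. A back-of-the-envelope sketch, where the shot length and per-frame render time are made-up placeholders rather than measured figures:

```python
# Naive parallel-render arithmetic: total GPU work divided by available workers.
# Shot length and per-frame time are hypothetical placeholders, not measured values.
frames = 1800                     # assumed one-minute shot at 30 fps
minutes_per_frame = 20.0          # assumed render time per frame on one worker

local_workers = 8                 # David's 8x RTX 3090 workstation
network_workers = 300             # "300 machines spool up at once"

local_hours = frames * minutes_per_frame / local_workers / 60.0
network_hours = frames * minutes_per_frame / network_workers / 60.0

print(f"local: ~{local_hours:.0f} h, network: ~{network_hours:.1f} h, "
      f"speedup ~{local_hours / network_hours:.0f}x")
# With these assumed numbers: roughly three days locally versus a couple of hours on
# the network, the order-of-magnitude jump David describes.
```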

Despite the major advancements in technology, the artist mentioned that rendering continues to be a significant challenge in his creative workflow, especially for VR projects.

“Sadly real time render engines don’t meet the level of quality that I demand in my work, at least not yet. Rendering is still a limiting process, and VR makes it that much more time consuming.”

During the interview, David Ariew expressed deep appreciation for the power of the Render Network:

“I think what I’m saying is that I love Render Network and there aren’t enough tokens in the world to satiate my rendering desires!”

The technology of the Render Network enables artists like David to expand their horizons into more immersive creative installations.

“I fully believe that the Render Network will become an integral tool with immersive formats, simply because the resolution demands are insane. We’ll soon be at the point where we need to be able to quickly create 8K x 8K content, or even higher, whether that’s for VR, concert visuals, digital billboards, or massive installations. This is where we’re headed and I’m all for it! RNDR is very straightforward though, and personally I didn’t even need a tutorial to get up and running. It’s important to test a few frames to make sure everything has ported over to RNDR properly and there are no mistakes before hitting the final go button, but other than that there are very few gotchas!”

What’s one tip or piece of advice that you would give to artists creating in immersive formats and considering using Render Network? What should they know to best leverage the rendering service for such projects?

“As for a piece of advice, I would say do your best to pre-visualize whatever experience you’re designing. For my immersive experience, that just meant finding out the dimensions of the room it would be displayed in and recreating that in C4D, then bringing in the renders for all four walls as image sequences and putting them into the room as emissive textures. Then, I created two fully reflective figures to stand in the room and watch the experience, and “shot” it from a variety of camera angles, essentially to create the digital version of the experience.

The less people need to use their imaginations to understand what’s being created the better. It’s the same workflow for concert visuals, where as an artist you’re designing for the entire space, often on many screens, and syncing actions and moments between screens. Nothing beats being there in person, but if you can recreate the space in 3D, that’s the closest thing to actually being there.”
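
For those who want to try the same pre-visualization approach, here is a minimal sketch that runs in Cinema 4D’s Script Manager and simply blocks out a room at real-world scale. The dimensions are placeholders, and the Octane emissive materials and image-sequence textures David maps onto the walls are left out.

```python
# Minimal Cinema 4D Script Manager sketch: block out the exhibition room so renders
# can later be mapped onto its walls for pre-visualization. Dimensions are placeholders,
# and the Octane emissive materials / image sequences are intentionally omitted.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()

    # Assumed room size in centimetres (Cinema 4D's default unit), not the real venue's.
    width, height, depth = 800.0, 350.0, 600.0

    room = c4d.BaseObject(c4d.Ocube)
    room.SetName("Previz_Room")
    room[c4d.PRIM_CUBE_LEN] = c4d.Vector(width, height, depth)

    doc.InsertObject(room)
    c4d.EventAdd()  # refresh the viewport

if __name__ == "__main__":
    main()
```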

Where does David see the future of spatial, immersive experiences and web3 heading?

“While it’s extremely hard to predict the future, we’re already seeing the popularity of experiences like Meow Wolf, Artechouse, and TeamLab explode, as well as The Sphere in Vegas.

This is only the beginning, and once people get a taste of how technology and art can create something magical IRL, they’ll begin to crave more. Then there’s the Apple Vision Pro, which is unparalleled and will hopefully blow the doors open on mass adoption of VR, much like the iPhone did for handheld computing. It’s positioned as a high end device that will actually help with productivity, rather than just something to play games on, and I think that’s incredibly exciting.”

“Much like how VR had an initial burst of momentum and excitement and then plateaued for several years, web3 did the same, but it’s during those slower times that real innovation and invention happens, and once we’re all in VR in something resembling the metaverse, digital ownership will make a lot more sense for the average person. We’ll be able to easily wear and show off our art pieces, and tokenizing VR experiences won’t be an awkward thing anymore, because we’ll be navigating the internet in VR / AR! I think it’s human nature to want to own things, and we can already see that in games like Fortnite and Roblox. Kids are very used to digital currency and buying items, and once the points of friction with purchasing items on the blockchain are removed and it’s extremely intuitive, more people will jump in.

As an artist, I can already imagine combining immersive IRL experiences, VR, and web3. For instance, what if I created an immersive LED hallway, that opened up into a huge immersion room with mirrored floor and ceilings, and in the center of the room are pedestals that hold VR helmets covered in mirrors? The experience would be designed to take people on a journey down the rabbit hole, further and further into my art. And someday, I hope to design for Jules’ holodeck, where no headset is needed and a group can enter a space of mine and feel like they’ve entered a glass box that’s transporting them between dimensions, because they’ll be able to see out into infinity. I can’t wait for that day and I want to go there!”

Are there any projects that you are currently working on or planning that will leverage similar technologies or concepts — what’s next for you?

“Lately, I’ve continued designing concert visuals for big artists like Drake, Travis Scott, and Zedd, and I love doing client work in this realm because it’s often the most artful and I’m allowed the most creative control, and the visuals end up contributing to a larger-than-life experience. I’ve also just finished my first project for The Sphere for Coca-Cola, which was amazing, and now my biggest goal is to get my personal artwork up there, because I know how amazing it would look and how positively it would affect people visiting. I’ve also recently converted my immersive experience for a nightclub in Hong Kong, and I love the idea of people relaxing in a space wrapped in my art. Finally, I’m continuing to design VR experiences, and it’s absolutely my new favorite medium. The one I’m working on now has a slowly rising camera, which creates a sensation of floating, and I’m insanely excited about it!”

https://x.com/Ry_Hawthorne/status/1723025103586566160?s=20

Join us next week for Part 2 of our two-part series on exciting projects in emerging spaces, featuring another exclusive interview with a landmark creator, Brilly, discussing his work for the Sphere created on Render.

Join us in the Rendering Revolution at:

Website: https://render.x.io
Twitter: https://twitter.com/rendernetwork
Knowledge Base: https://know.rendernetwork.com/
Discord: https://discord.gg/rendernetwork
Telegram: https://t.me/rendernetwork
Render Network Foundation: https://renderfoundation.com/


Render Network is the first blockchain GPU rendering network & 3D marketplace: https://render.x.io