Multisensory Mixed Reality Experience WALD — a Research & Development Project
WALD 🌲 — German for forest — is a multisensory mixed reality experience using the HTC Vive and Leap Motion in combination with a real object.
First and foremost designed for our Research & Development endeavours, WALD took place in our pop-up Experience Room at Wilmersdorfer Arcaden as part of the project #Artcaden.
The main idea of WALD was for users to experience nature — right in the centre of urban life: a shopping mall. To make the experience as authentic as possible we didn’t spare any effort and moved a real tree right into our Experience Room, where visitors had the opportunity to touch it, their hands tracked by Leap Motion, and experience it growing and coming to life alongside all the spirits and animals of the forest.
Check out the video of the event for some real-life shots.
Want to know more about the development details, our process and all the obstacles we overcame? In a nutshell — here’s the story of our journey from a programming odyssey to an adventurous and exciting development project.
Disclaimer: Mainly nerd talk for VR and game developers.
Concept & Story
We believe telling a story in VR always needs some sort of justification. Therefore we put a lot of effort into developing a concept by means of intensive team sessions, where we talk about content and story in depth. After dismissing one idea after another, one of our team members remembered seeing the documentary “Unsere Wälder” (“Our Forests”) screened on ZDF.
The revelation: trees communicate! An incredible and invisible fact, which inspired us to come up with a VR experience that actually visualizes the invisible and energetic communication flows of trees.
We started out with a blockout in Unity and took the whole team on several field trips into the forest. On these trips we began to generate 3D models of trees, stones and other assets via photogrammetry using Reality Capture.
Then we organized a tiny tree trunk as a real object and set up a first prototype. To simulate the softness of forest ground, we put simple coconut mats on the floor, installed a red light bulb to emulate the warmth of the sun and waved freshly cut conifer branches — originally meant for Christmas wreaths — to make the air smell like a real forest.
We progressed step by step in cooperation with our developer friend Dominik Henn, making the scene more vivid and logical as well as integrating the Leap Motion interaction. Find his article on developing with Leap Motion here.
It turned out that our realistic approach resulted in a rather uncanny look, owing to the self-imposed time constraints of a mere R&D project, while also being extraordinarily expensive in terms of computation. Still, we had shown our first WALD demo at a Christmas dinner @VRBusinessClub and got amazing feedback for the strange but authentic feeling created by haptics and smell in Virtual Reality — so we were doing something right. Right?
The more vivid it got, though, the more doubts we started to have.
Spatial Sound and Organic Lines
After the Christmas break, we decided to rethink the whole R&D project.
For one of our commissioned projects, we had just started integrating Houdini into our workflow and were mind-blown by the seemingly magical power of this tool. (We can really recommend Entagma’s tutorials at this point: http://www.entagma.com/)
We re-briefed our greatly talented and patient spatial sound designer Nicolas Heese and asked the highly experienced and amazing procedural 3D artist Thomas Helzle to support us on the Houdini side. From there on, we knew we had to focus on developing a pipeline for Unity.
Inspired by the line style of ‘Dear Angelica’ we recycled our scanned forest assets as a base to generate organic lines in Houdini.
We modified the lines with the Oculus Quill node in Houdini, providing widths, vertex colors and timecodes. Then we read out the normals and tangents and exported the vertex coordinates in a JSON file. These we put together into a custom mesh and shader, which were then manipulated using the timecode. From that point, performance was not an issue anymore and some pieces seemed to fall into place. We began to be comfortable with the look and feel of the VR experience as a whole.
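To make the idea of timecoded line data more concrete, here is a minimal CPU-side sketch of the principle. The JSON field names and values below are purely illustrative assumptions, not our actual export format, and in the real project the per-vertex timecode comparison happens in the shader on the GPU:

```python
import json

# Hypothetical example of one exported line strip: positions, widths,
# vertex colors and per-vertex timecodes (all values are made up).
exported = json.dumps({
    "lines": [
        {
            "vertices": [[0, 0, 0], [0, 1, 0], [0, 2, 0]],
            "widths": [0.05, 0.03, 0.01],
            "colors": [[0.2, 0.8, 0.2], [0.2, 0.8, 0.2], [0.2, 0.8, 0.2]],
            "timecodes": [0.0, 0.5, 1.0],
        }
    ]
})


def visible_vertices(data, time):
    """Return, per line, only the vertices whose timecode has been reached.

    Advancing `time` makes each line appear to "grow" vertex by vertex,
    which is the effect the timecode-driven shader produces.
    """
    result = []
    for line in data["lines"]:
        verts = [v for v, t in zip(line["vertices"], line["timecodes"]) if t <= time]
        result.append(verts)
    return result


data = json.loads(exported)
# At time 0.6 only the first two vertices of the line have "grown" in.
print(visible_vertices(data, 0.6))
```

The same comparison, evaluated per vertex in a shader with the timecode stored as a vertex attribute, animates thousands of lines without any per-frame CPU work.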
Now let’s talk about our touchable object — our real tree.
Yes, a REAL tree
Being a blessing in disguise, a bad storm uprooted a lot of trees in a forest near Berlin and a friendly forest owner allowed us to take the most beautiful (and heavy) tree for our experience.
With six pairs of strong, helping arms, we managed to carry a solid trunk weighing roughly half a ton from the truck bed to our event location at Wilmersdorfer Arcaden.
There, we lifted the trunk into its final position with the help of a mobile crane. (Thanks so much, Conny, you most loveable technician of the Wilmersdorfer Arcaden shopping mall, for firstly not declaring us insane and secondly helping us make the tree safe for our visitors.)
The key game mechanic for visitors would be touching the tree with Leap Motion, energizing their hands with the power of the tree, in order to then push that energy into the forest. The forest would then start to grow, come alive and finally reveal the hidden communication flows of the trees — from the crown right down to the roots. In order to make this effect as immersive as possible, we made sure that the scanned tree lined up perfectly with its real counterpart.
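The touch-and-release mechanic described above boils down to a small charge/discharge state machine. The following is a toy sketch of that logic, with all names, rates and thresholds invented for illustration (the actual implementation lives in Unity and reacts to Leap Motion hand contact):

```python
from dataclasses import dataclass


@dataclass
class HandEnergy:
    """Toy model of the touch-and-release mechanic (values are illustrative)."""
    charge: float = 0.0        # 0.0 = empty hand, 1.0 = fully energized

    CHARGE_RATE = 0.5          # charge gained per second of tree contact
    RELEASE_THRESHOLD = 1.0    # charge needed before a push has any effect

    def update(self, touching_tree: bool, dt: float) -> None:
        """Accumulate energy while the hand is in contact with the tree."""
        if touching_tree:
            self.charge = min(self.RELEASE_THRESHOLD,
                              self.charge + self.CHARGE_RATE * dt)

    def try_push(self) -> bool:
        """A push gesture releases the energy into the forest if fully charged."""
        if self.charge >= self.RELEASE_THRESHOLD:
            self.charge = 0.0  # energy flows out of the hand
            return True        # trigger forest growth
        return False


hand = HandEnergy()
for _ in range(4):             # two seconds of contact in 0.5 s steps
    hand.update(True, 0.5)
assert hand.try_push()         # fully charged, so the forest starts to grow
```

Gating the growth effect behind a charge threshold like this is one way to make the cause-and-effect of touching the tree legible to VR beginners.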
Now to some of the technical challenges we encountered when setting up the experience. The long Vive cable was a challenge: since we didn’t want visitors to accidentally strap themselves to our “totem pole” during the experience, an XMG Walker backpack PC came into play. We also needed to solve the problem of tracking loss, as the 2.3 m tall tree in the middle of the room-scale volume created a lot of occlusion. In the end, we came up with a simple yet effective solution: by adding bushes to the scene we created obstacles in the areas that had tracking issues, so users naturally navigated around the spaces that were difficult to track.
At last — we were ready to test!
We invited all the artists from the #Artcaden project and creative friends for a beta session and collected valuable feedback.
Here are the funniest statements we got:
“I am a godlike mythical creature!”
“I got the power!”
“I want to go even deeper in the forest — deeper and deeper and deeper!”
“Is this trying to be a reenactment of the movie Avatar?”
We detected a lot of issues in the beta test — finding the right Leap Motion gestures and emphasizing the interactive impact turned out to be a tough nut to crack. Our team had spent so much time inside the experience and designing it that we simply could no longer imagine it being unintuitive — even though we had VR beginners in mind all along.
Also, perfectly lining up the real tree with the virtual one did not quite work out as we thought it would. In addition, the Leap Motion’s narrow FOV led to our visitors often “losing” their hands during the experience, which made them really uncomfortable and resulted in a considerable loss of immersion.
After identifying the conceptual and technical weaknesses of the prototype we are now going to talk about the strong suits we discovered:
Touching real objects with your “own” hands is indeed tremendously impressive and intensifies the feeling of immersion. Having “energized” hands emulates a crazy good and almost “real” sensation — the visitors reported that they somehow really felt having power in their hands. The huge tree as an anchor to the virtual world turned out to be a great and convincing concept. The smooth transition from the real world to the virtual world helped the visitors a lot to let themselves fall into the virtual environment without confusion or feeling sick. Judging from the feedback, all visitors were enchanted by the experience and were truly sad that it didn’t last longer.
Coming back to the office with all these fresh impressions and constructive feedback, we prioritized improving the gestures and redesigning the interaction model to be more natural.
Soon after, within the last days of the #Artcaden project, we were pleased to welcome curious visitors during our 5 day event.
We provided 65 individual slots of 15 minutes each. This included equipping the user with the backpack and goggles and giving them a brief introduction.
People new to VR as well as very experienced users left our experience thrilled. Judging from the feedback we got, our decision to use something as relatable as an actual tree was really a great way to open a portal between reality and the virtual realm.
Through the production of our R&D project WALD, we learned the following:
1| Failing fast really is a thing
2| We improved our photogrammetry workflow
3| We built a pipeline between Houdini and Unity
4| We evaluated different interactions possible with Leap Motion
And last but definitely not least:
5| That curiosity and motivation can move mountains (and apparently trees)
If you liked our findings or found them useful in one way or the other, we’d love to hear from you. Also, do not hesitate to contact us for more detailed insights:
We want to thank everybody who went on this journey with us.
Thomas Helzle — Houdini Artist
Ralph Tharayil — Writer
Denise Alheit — Artist
Adrian Azadvaten — 3D Artist
Nicolas Heese — Sound Design
Nicolas Schindler — Sound Design
Dominik Henn — Developer
Katja Hindenburg — UX Designer
The development team of Invisible Room: