Unveiling the Unity VR Development Blueprint of the Thoughtworks E4R Project

Rindish Krishna
Thoughtworks: e4r™ Tech Blogs
7 min read · Feb 6, 2024

Have you ever wondered how great it would be if you could teleport to another location? What if I say it’s actually kind of possible? Sounds unbelievable? Well, then this article is just perfect for you.

Yes, VR, or virtual reality, to be precise! That covert magical doorway to other places is VR. Cool, right? Let's dig in deeper.

As a kid, I was always very fond of VR games. Since then, every time I spot a VR game or experience zone in adventure parks and shopping malls, I remember compelling my parents to let me play. There was this urge, or an unquenchable thirst, to learn more about VR, but never once did I even think that I would actually get to develop a VR application with people experienced in the field. Well, I guess Thoughtworks read my mind!

It all started one day when I got a call from the Thoughtworks staffing team and was asked if I had experience in Unity or C#. Although I didn't have experience in either, I was curious to know more about the project. "It's a Unity project to build a VR experience zone as part of Thoughtworks Engineering for Research (E4R), and we are looking for someone who is experienced in Unity," I recall them saying.

The Thoughtworks Engineering for Research (E4R) practice is an initiative to apply computational methods to advance research in scientific disciplines such as astronomy, physiology, genomics, economics, and disaster response. I realised what an awesome opportunity this was: to be part of developing a VR application. I let them know I was interested and keen to learn Unity, after which they connected me with the project manager (PM) to get more information about the project. I caught up with the PM, Satyavati Kharde. She explained, "With this project, we aim to replicate the drug discovery process in an interactive virtual environment."

Learning Path

Unity is a cross-platform game engine that has gradually been extended to support VR, AR, and more. In Unity, the primary scripting language is C#. Coming from a web/app development background in the software industry, these concepts were completely new to me.

First of all, we need to understand the UI of the Unity Editor, since development in Unity is a combination of Editor workflows and scripting. The next step is to learn scripting, and for that, it is important to learn the basics of C#.
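To give a flavour of how scripting complements the Editor, here is a minimal sketch of a Unity script. `MonoBehaviour`, `Update`, `Time.deltaTime`, and `transform.Rotate` are standard Unity APIs; the class name and `spinSpeed` field are illustrative, not from our project:

```csharp
using UnityEngine;

// Attach this component to a GameObject in the Editor;
// Unity calls Start() once and Update() every frame.
public class Spinner : MonoBehaviour
{
    // Public fields show up in the Inspector, so they can be
    // tweaked in the Editor without touching code.
    public float spinSpeed = 90f; // degrees per second

    void Start()
    {
        Debug.Log("Spinner initialised");
    }

    void Update()
    {
        // Time.deltaTime keeps the rotation frame-rate independent.
        transform.Rotate(Vector3.up, spinSpeed * Time.deltaTime);
    }
}
```

This split is the essence of Unity development: the Editor owns the scene and the assets, while scripts like this one define behaviour.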

VR development came into Unity as an extension of its existing game engine features. Hence, most game development concepts are also needed for VR development.

roadmap

C#

So, C# is a language belonging to the C-like family of languages, and its syntax is similar to Java and C++. Basic OOP knowledge and experience in any other programming language make it easy to pick up C# quickly.

Since I have experience in full-stack web development and have worked with Java, C++, and the like, learning the basics of C# was not a big challenge for me. I did not have to spend extra time on it; following articles and documentation during development was enough to pick up the syntax.
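To show how familiar the syntax feels coming from Java, here is a small illustrative snippet (the `Player` class is a made-up example, not from our project):

```csharp
using System;

// A plain C# class: the structure mirrors what you would write
// in Java, with properties replacing explicit getters/setters.
public class Player
{
    public string Name { get; }              // read-only property
    public int Score { get; private set; }   // mutable only inside the class

    // Default parameter values are built into the language.
    public Player(string name, int startingScore = 0)
    {
        Name = name;
        Score = startingScore;
    }

    public void AddScore(int points)
    {
        if (points < 0)
            throw new ArgumentOutOfRangeException(nameof(points));
        Score += points;
    }
}
```

Apart from properties and default parameters, almost everything here reads like Java, which is why the transition is so gentle.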

Game Development in Unity

A game in Unity is essentially a collection of game objects and assets whose behaviour is driven by scripts. Unity's tutorial site offers learning pathways through some classic games, which give a fair idea of game development. It also provides dummy assets so you can start quickly and stay focused on the core concepts.

I managed to complete a small game tutorial from it, Roll-a-Ball. It onboards you into Unity game development step by step: project setup, moving the player, positioning the camera, setting up the game environment, handling game object collisions, building the game UI, applying game physics, and finally building the game for a specific OS or environment.

Having been a casual gamer since childhood, I thought of building a small 2D game in Unity as a pet project, both to learn Unity better and to carry those lessons into our VR project. Just like that, I started building a game from scratch, inspired by the classic Flappy Bird.

The first job was to create the UI in Unity. I did struggle a bit to build the UI components properly, to be honest, but I did it, thanks to this YouTube playlist. It really helped me quite a lot.

I also needed some 2D assets for the game, which I found on open-source websites like OpenGameArt. Although the initial steps took me a while, I was very excited to continue. It took me a week to finally complete the game based on my idea, which I then shared with a couple of friends, and guess what? They found it interesting, and some of them actually kept playing to beat the high score!
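The core mechanic of a Flappy-Bird-style game boils down to one script: apply an upward impulse to the bird on input and let gravity do the rest. Here is a hedged sketch of that idea (the class name, `flapForce` value, and collision handling are illustrative; `Rigidbody2D`, `Input.GetMouseButtonDown`, and `OnCollisionEnter2D` are standard Unity APIs):

```csharp
using UnityEngine;

// Attach to the bird sprite, which also needs a Rigidbody2D
// (for gravity) and a Collider2D (for hitting pipes).
public class BirdController : MonoBehaviour
{
    public float flapForce = 5f; // tuned in the Inspector

    Rigidbody2D body;

    void Start()
    {
        body = GetComponent<Rigidbody2D>();
    }

    void Update()
    {
        // On tap/click, cancel downward momentum and flap upward;
        // setting velocity directly gives the snappy Flappy Bird feel.
        if (Input.GetMouseButtonDown(0))
            body.velocity = Vector2.up * flapForce;
    }

    void OnCollisionEnter2D(Collision2D other)
    {
        // Hitting a pipe or the ground ends the run.
        Debug.Log("Game over");
    }
}
```

Everything else in the game (scrolling pipes, score counting, restart UI) hangs off this one loop.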

As the final step, I fine-tuned the game again and again to make it deploy-ready and compatible with all browsers and devices, and then I deployed it to itch.io.

Feel free to check out my game at Fluffy Bird Rino

Github link

VR development in Unity

If you are still reading, congratulations! You have the prerequisites covered. Now let’s dive into our niche, “VR development.”

It's always good to have a real VR device for development. The XR Device Simulator can be used as an alternative, but I found it painful for day-to-day development.

I received an Oculus Quest 2 by Meta from Thoughtworks for development. It comes with a VR headset and two hand controllers. Holding the headset in my hands was so exciting, like a dream come true. I tried a couple of VR games and apps, and I am not lying, it was so cool! 😜

I would say learning all of VR up front before starting our project wasn't a great idea because the topic is so vast. So, I decided to learn along the development journey of our drug discovery simulation project.

There are a couple of initial setup steps to configure a new Unity project for VR on the Oculus Quest 2. I followed a Udemy course for this.
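In broad strokes, the setup means installing the XR packages and enabling the Oculus/OpenXR plug-in under Project Settings → XR Plug-in Management. The package names below are the real Unity package identifiers, but the version numbers are placeholders; your project's `Packages/manifest.json` would contain something roughly like:

```json
{
  "dependencies": {
    "com.unity.xr.interaction.toolkit": "2.x.x",
    "com.unity.xr.oculus": "4.x.x"
  }
}
```

With those installed, the usual next step is to drop an XR Origin (camera rig plus controller objects) into the scene, switch the build target to Android, and deploy to the headset over USB.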

First and foremost, we needed a virtual 3D room for interaction. We purchased a 3D model of the room from Sketchfab and tuned it to our needs. The remaining 3D models we needed were a scientist and a bot. We found an appropriate model for the scientist on Sketchfab. However, we couldn't find a suitable model for the robot online, so we decided to build one in Blender. Thanks to Ashutosh Dusane, a senior UI/UX designer at Thoughtworks, who created the bot from scratch!

unity IDE

After putting all this in place, we have to build the scene, or, in other words, the virtual space.

The next objective was to make the scientist and the bot animate according to the storyboard script. For that, I learned animation in Unity. For basic animations, Mixamo provides a library of ready-made, rigged animations.
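Driving those animations from script typically comes down to the Animator component: the script sets parameters that the Animator Controller's state machine reacts to. A hedged sketch, assuming a controller with a trigger named "Wave" (the class and trigger names are illustrative; `Animator.SetTrigger` is the standard Unity API):

```csharp
using UnityEngine;

// Attach to the scientist model, which carries an Animator
// whose controller defines a "Wave" trigger parameter.
public class ScientistDirector : MonoBehaviour
{
    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Called at the right moment in the storyboard sequence;
    // the Animator transitions into the waving animation clip.
    public void PlayWave()
    {
        animator.SetTrigger("Wave");
    }
}
```

The actual clips (from Mixamo or elsewhere) live in the Animator Controller; the script only decides when each state fires.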

The next step was to create user interaction with two 3D models: a ligand and a protein. Users should be able to grab, rotate, or move them using the VR hand controllers and then position the ligand at the correct binding site of the protein. Ashutosh Dusane, the UI/UX designer, designed the 3D models of the protein and ligand in Blender.

protein ligand

Challenges

The challenging part was the user interaction with these models using the hand controllers. I went through the official documentation to implement hand controller actions, grabbing, camera angles, user locomotion, and so on.
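With the XR Interaction Toolkit, grabbing usually means adding an `XRGrabInteractable` component to the object rather than writing grab logic yourself; what remains to code is the domain logic, such as detecting when the ligand reaches the binding site. A hedged sketch of that detection, not our exact implementation (the class name and "BindingSite" tag are illustrative; `OnTriggerEnter` and `CompareTag` are standard Unity physics APIs):

```csharp
using UnityEngine;

// Attach to the ligand model, alongside an XRGrabInteractable
// and a Collider. The binding site is an invisible trigger
// collider on the protein, tagged "BindingSite".
public class LigandDocking : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("BindingSite"))
        {
            // Snap the ligand into place at the binding site.
            transform.position = other.transform.position;
            transform.rotation = other.transform.rotation;
            Debug.Log("Ligand docked at binding site");
        }
    }
}
```

Because the toolkit handles the grab, rotate, and move gestures, the VR-specific code stays small; the trigger collider approach above is one common way to give the user "you found it" feedback.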

I also faced other blockers while developing the user interaction with the 3D models, caused by configuration issues in the Unity Editor. I got help from Neelarghya Mandal, an experienced Unity XR developer.

Conclusion

VR technology has emerged as a bridge between imagination and reality, offering boundless possibilities across various industries. Its scope extends far beyond mere entertainment, delving into realms of education, healthcare, training, and beyond.
