Playing Footsie With VR

Jonathan Linowes
Things I Did and Learned Today
3 min read · Jan 31, 2015


The other day I had this idea: use a DDR dance mat as input to virtual reality.

I am very interested in building virtual reality experiences that respond to your body's location and movement. There are a number of ways to do this, including motion-sensing cameras, wearable sensors, and, lo and behold, floor mats!

I have toyed around with Microsoft Kinect and Leap Motion, both of which use depth-sensing cameras. Their software interprets the camera images to identify body parts (e.g., hands) and track them as they move. This data can then be used, more or less directly, in an application or game.

I’m also very excited to begin exploring the new generation of wearable sensors, which promise to more accurately track the position and movement of whatever object or body part they’re attached to. I have preordered a Sixense STEM System as well as a PrioVR body suit, and eagerly await their arrival in the coming months.

But I’ve started getting impatient. As I play with developing my own VR content, I really want to understand the issues of integrating my own body and sense of self into the virtual scene. If only I could tell my game where I’m standing and when I move my feet. If only… I had a pressure-sensitive floor mat!

Having (happily) missed the Dance Dance Revolution craze of the 2000s, I didn’t own a mat, so I went on Amazon and picked up a cheap ($25) USB dance mat to play with. When it arrived I snatched it from my teenage daughter’s clutches and plugged it into my iMac.

From the game’s perspective, the mat is just another controller with ten buttons. How should I map these buttons to inputs that the Unity3D game engine can interpret? In my haste I couldn’t find the patience to figure out how to manually configure the Unity InputManager for this non-standard device and get it working from my Mac. Instead I downloaded a copy of Joystick Mapper ($5), recommended by various posters in my Google search, and used it to map the mat’s buttons to keyboard inputs: the numbers “1” through “9”, plus “esc” and “space”. (Notably, the middle square, “5”, is not a button on the mat.)
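In Unity terms, that mapping becomes a simple lookup table. The sketch below is just my own illustration of the idea, not anything Joystick Mapper generates; the KeyCode names, the (row, column) layout, and where I’ve placed the two extra buttons are all my assumptions:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative lookup from the keyboard keys Joystick Mapper emits to
// (row, column) cells on the mat's 3x3 grid. The actual assignments depend
// on how the mapper was configured; the center square ("5") has no physical
// button, so it is omitted.
public static class MatLayout
{
    public static readonly Dictionary<KeyCode, Vector2> KeyToCell =
        new Dictionary<KeyCode, Vector2>
    {
        { KeyCode.Alpha1, new Vector2(0, 0) },
        { KeyCode.Alpha2, new Vector2(0, 1) },
        { KeyCode.Alpha3, new Vector2(0, 2) },
        { KeyCode.Alpha4, new Vector2(1, 0) },
        // no Alpha5: the middle square is not a button
        { KeyCode.Alpha6, new Vector2(1, 2) },
        { KeyCode.Alpha7, new Vector2(2, 0) },
        { KeyCode.Alpha8, new Vector2(2, 1) },
        { KeyCode.Alpha9, new Vector2(2, 2) },
        // Assuming "esc" and "space" are the two extra buttons above the grid.
        { KeyCode.Escape, new Vector2(-1, 0) },
        { KeyCode.Space,  new Vector2(-1, 2) },
    };
}
```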

A limitation of the mat is that it can only register one button press at a time. If I’m stepping on two spots (e.g., one with each foot, or one foot spanning two squares), the last one pressed is the one the game reads. That’s OK for this experiment, but it is a limitation. And obviously the mat’s “resolution” is very low: it measures about 3 feet by 3 feet, with a tic-tac-toe-like grid (plus two extra buttons at the top).

Inside Unity, I created a simple scene with an array of cubes arranged much like the squares on the dance mat. I added the OVR (Oculus) first-person camera to the scene. Then I wrote a C# script that reads the mat input and “lights up” the cube corresponding to the button being pressed.
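The script only needs a few lines. Here’s a minimal sketch of the idea (the class name, the cube array, and the colors are mine, not the original script; it assumes the nine cubes are dragged into the Inspector in the same order as the number keys):

```csharp
using UnityEngine;

// Minimal sketch: highlight the cube under whichever mat square is pressed.
// Assumes nine Renderers assigned in the Inspector in the same order as the
// keyboard keys Joystick Mapper emits ("1" through "9"); the center slot can
// be left empty since that square has no button.
public class MatCubeHighlighter : MonoBehaviour
{
    public Renderer[] cubes = new Renderer[9];
    public Color pressedColor = Color.yellow;
    public Color idleColor = Color.white;

    void Update()
    {
        for (int i = 0; i < 9; i++)
        {
            if (cubes[i] == null) continue;          // no button for the center square

            KeyCode key = KeyCode.Alpha1 + i;        // number keys "1".."9"
            bool pressed = Input.GetKey(key);
            cubes[i].material.color = pressed ? pressedColor : idleColor;
        }
    }
}
```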

I put on my Oculus Rift, started the “game”, and… Yaay! It worked!

Then I called my daughter and asked her to video me as I jumped around.

It’s a simple experiment. It works. It’s not ideal.

Without peeking out from under my goggles, and without any kind of visual pass-through (the ability to see some of the real world with the goggles on), I would get a little disoriented: stepping off the mat by mistake, or inadvertently turning a bit and facing the wrong direction.

I needed to adjust the height of the camera so I don’t feel too tall or too short in the Rift.

More tinkering is needed so that when my foot touches a spot on the mat, it correlates better with the corresponding position in the virtual scene.

Perhaps a next step is to build some in-game calibration widgets so I can adjust these settings in-game rather than in the Unity editor or script.
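A rough sketch of what that calibration might look like; the key bindings and step sizes are arbitrary choices of mine, just to show the idea of tweaking values at runtime instead of in the editor:

```csharp
using UnityEngine;

// Sketch of simple in-game calibration: nudge the camera rig height and the
// virtual grid's scale at runtime rather than editing values in the editor.
// Key bindings and step sizes here are illustrative, not from the project.
public class MatCalibration : MonoBehaviour
{
    public Transform cameraRig;       // e.g. the parent of the OVR camera
    public Transform matGrid;         // parent of the nine cubes
    public float heightStep = 0.05f;  // meters per key press
    public float scaleStep = 0.05f;

    void Update()
    {
        // Raise or lower the eye height until standing feels right in the Rift.
        if (Input.GetKeyDown(KeyCode.UpArrow))
            cameraRig.position += Vector3.up * heightStep;
        if (Input.GetKeyDown(KeyCode.DownArrow))
            cameraRig.position += Vector3.down * heightStep;

        // Grow or shrink the virtual grid so it lines up with the physical mat.
        if (Input.GetKeyDown(KeyCode.RightArrow))
            matGrid.localScale += Vector3.one * scaleStep;
        if (Input.GetKeyDown(KeyCode.LeftArrow))
            matGrid.localScale -= Vector3.one * scaleStep;
    }
}
```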
