Haptic channels in VRController for WebVR.

Stewart Smith
5 min read · Nov 26, 2017


Easily add multiple channels of haptic feedback to your WebVR experience with VRController.

Are you making custom virtual reality experiences for the Web? (You should probably get outside more.) VRController makes it easy to support 3DOF and 6DOF hand controllers for Oculus, Vive, Windows Mixed Reality, Daydream, GearVR, and more. You can download the source code from GitHub, and if you’re reading this on a system with a VR headset, hand controller(s), and a WebVR-capable browser, you can test-drive it right now: https://stewdio.github.io/THREE.VRController/

VRController’s been available for a while, but here’s what’s brand new: Haptic channels! Some hand controllers, like the current models for Oculus Touch or HTC Vive, have haptic actuators—vibrating motors—that can pulse or buzz to provide haptic feedback for the user. For example, is the user backhanding a tennis ball across the net at someone? Well then, that moment the virtual tennis racket connects with the virtual ball would be a good time to employ some haptic feedback: to make the controller rumble for a fraction of a second to let the user “feel” the contact.

In some respects VRController is just a fancy wrapper for the Web Gamepad API, and Gamepad itself already supports haptic actuators. Once you have your Gamepad instance handy, vibrating that actuator is as easy as:

if( gamepad.hapticActuators && gamepad.hapticActuators[ 0 ]){
    gamepad.hapticActuators[ 0 ].pulse( intensity, duration )
}

For reference, intensity is a floating-point value between 0 and 1 inclusive, and duration is a number of milliseconds. hapticActuators is an Array, but I’ve yet to see a hand controller that has more than one actuator. For our tennis example, the above functionality covers everything we need: The user hits something in the virtual world and we buzz their hand controller in response. Perfect.
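
If you want to experiment with the Gamepad API directly, outside of VRController, one way to get hold of a Gamepad instance is the standard gamepadconnected event plus navigator.getGamepads(). Here’s a minimal sketch of that approach; the test pulse values are just placeholders, not anything prescribed by the API:

window.addEventListener( 'gamepadconnected', function( event ){
    //  Re-read the gamepad list to get a live Gamepad object.
    const gamepad = navigator.getGamepads()[ event.gamepad.index ]
    if( gamepad && gamepad.hapticActuators && gamepad.hapticActuators[ 0 ]){
        //  Placeholder test pulse: half intensity for 100 milliseconds.
        gamepad.hapticActuators[ 0 ].pulse( 0.5, 100 )
    }
})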

Set and Wait

Ok, but what happens when you’re revving your plasma engines while firing off photon bolts in deep space? Slinging that bolt from the rotating cannon head is going to hit you with a sudden recoil while the low rumbling hum of the engine continues underneath. And what about the wind-down of that rotating cannon head as friction slowly brings it back to a standstill?

First, let’s look at the underlying engine hum. When the user squeezes their controller grips we’ll create a haptic channel on our controller instance called “engine-rumble” and set its intensity to 20% like so:

controller.setVibe( 'engine-rumble' ).set( 0.2 )

The act of selecting a haptic channel with setVibe() automatically creates that channel if it does not already exist. The name of that channel is whatever string you pass to setVibe(). Notice how duration is not being specified. VRController will rumble at that intensity forever—or until you issue a new intensity command. That’s as easy as selecting the same haptic channel again by name:

controller.setVibe( 'engine-rumble' ).set( 0 )
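
For context, here’s roughly how those two calls might be wired up. The 'vr controller connected' event and its event.detail payload are how VRController hands you a controller instance, but the grip event names below are assumptions on my part; check the README for the exact strings your controller emits:

window.addEventListener( 'vr controller connected', function( event ){
    const controller = event.detail
    //  Assumed event names; verify against the VRController README.
    controller.addEventListener( 'grip press began', function(){
        controller.setVibe( 'engine-rumble' ).set( 0.2 )
    })
    controller.addEventListener( 'grip press ended', function(){
        controller.setVibe( 'engine-rumble' ).set( 0 )
    })
})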

We can also create a queue of haptic channel commands. Let’s say when you first engage the engine there’s a moment of intense shuddering before it settles down into its normal hum. Perhaps that initial shudder lasts a second and a half. Here’s how we might describe that using haptic channels:

controller.setVibe( 'engine-rumble' )
.set( 0.8 )
.wait( 1500 )
.set( 0.2 )

And just for fun, perhaps it takes an eye-blink of a moment after releasing the controller grips for the engine to disengage:

controller.setVibe( 'engine-rumble' ).wait( 250 ).set( 0 )

Under the hood, VRController is keeping track of time via window.performance.now() to know when in the future each change in vibration intensity ought to take place. Important note: Selecting the haptic channel will automatically erase its queue of future events. Why might this be desirable? Imagine the haptic behavior we described above. Now imagine a user engages the engine by squeezing their controller grips, but after one second decides to release the grips. If the event queue was not automatically scrubbed what they might experience is 1.25 seconds of 80% vibration intensity, followed by a quarter second of haptic silence, then followed by 20% vibration intensity that lasts into perpetuity. That wouldn’t feel like an engine kicking on, then shutting down. It would feel like a mistake.
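
To make that bookkeeping concrete, here’s a rough sketch of the idea. This is an illustration, not VRController’s actual source, and reset() stands in for what selecting a channel with setVibe() does:

//  Illustration only; not VRController's real implementation.
class VibeChannel {
    constructor(){
        this.cursor = window.performance.now()  //  When the next queued command should fire.
        this.queue  = []                        //  Pending { time, intensity } entries.
        this.intensity = 0                      //  Intensity currently in effect.
    }
    reset(){  //  Re-selecting the channel scrubs any future events.
        this.cursor = window.performance.now()
        this.queue  = []
        return this
    }
    set( intensity ){  //  Schedule a new intensity at the current cursor time.
        this.queue.push({ time: this.cursor, intensity: intensity })
        return this
    }
    wait( ms ){  //  Push the cursor further into the future.
        this.cursor += ms
        return this
    }
    update(){  //  Call once per frame; returns the intensity currently in effect.
        const now = window.performance.now()
        while( this.queue.length > 0 && this.queue[ 0 ].time <= now ){
            this.intensity = this.queue.shift().intensity
        }
        return this.intensity
    }
}

With a structure like that, the engine scenario above behaves sensibly: releasing the grips re-selects the channel, the stale set( 0.2 ) entry is thrown away, and only the fresh wait( 250 ).set( 0 ) sequence remains.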

Multi-channel

The above pattern of selecting a haptic channel and then applying set or wait is convenient, but using multiple channels is where VRController’s approach to haptics really shines.

Let’s say you’ve got that engine rumbling and now the user is pulling the trigger on their controller to fire a photon bolt. How do you plan to keep track of where the engine’s at in terms of its vibration intensity queue, and also apply the recoil of the cannon? And what about the rotation of the cannon head—which we want to spin at full intensity immediately, then wind down over time? With VRController you don’t have to worry about it.

Your engine’s already rumbling. Here’s what to add to your trigger-press routine:

controller.setVibe( 'cannon-recoil' )
.set( 0.8 )
.wait( 100 )
.set( 0 )
controller.setVibe( 'cannon-rotation' )
.set( 0.2 )

And for the wind-down you might add something like this to your trigger-release routine:

controller.setVibe( 'cannon-rotation' )
.wait( 500 ).set( 0.10 )
.wait( 500 ).set( 0.05 )
.wait( 500 ).set( 0.00 )

You could of course make a much more granular wind-down, but I’ve found even the above coarse degree of detail gets me close enough to the haptic expression I’m looking for.
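
If you did want a smoother ramp, the chained API makes it easy to generate one programmatically. A small sketch, assuming the object returned by setVibe() can be stored in a variable and chained repeatedly, which is how the examples above behave:

//  Hypothetical finer-grained wind-down: ten steps over roughly 2.5 seconds.
const vibe = controller.setVibe( 'cannon-rotation' )
for( let step = 9; step >= 0; step -- ){
    //  Each iteration appends a wait and a slightly lower intensity, ending at zero.
    vibe.wait( 250 ).set( step / 10 * 0.2 )
}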

So if your engine’s humming along at 20% intensity, your cannon recoil hits at 80% intensity, and the cannon’s rotation adds 20% to that… Well that’s 120% intensity—and that doesn’t make sense. Thankfully, VRController sums the intended aggregate intensities at each moment and automatically caps the total at 100%, then sends that pulse command to the Gamepad instance. All the gory details are handled for you so you can focus on what really matters to you: making your VR experience feel just right.
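
In other words, the per-frame logic amounts to something like the sketch below; again this is an illustration of the idea rather than the library’s exact code, reusing the VibeChannel sketch from earlier:

//  Sum every channel's current intensity, clamp the total to 1, and pulse the actuator.
function applyHaptics( gamepad, channels ){
    const total = Math.min( 1, channels.reduce(
        function( sum, channel ){ return sum + channel.update() }, 0
    ))
    if( gamepad.hapticActuators && gamepad.hapticActuators[ 0 ]){
        //  A short pulse each frame keeps the buzz continuous at the clamped level.
        gamepad.hapticActuators[ 0 ].pulse( total, 16 )
    }
}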

I added haptic channels to VRController while working on a new WebVR demo game that I’ll be releasing in the near future through my company, Moar Technologies Corp. When the demo’s ready for release I’ll also open-source the code—so it will serve as a fully fleshed out example of how to use haptic channels in a real context.

Are you interested in furthering VRController? Jump in on GitHub. There are always bugs to fix and features to add. 😉


Stewart Smith

I’m Stewart Smith—a creative polymath in Brooklyn NY. I’m excited about spatial computing, quantum computing, machine learning, & more. https://stewartsmith.io