VR Text Input: Split Keyboard
I built a magnified split keyboard that maps to the HTC Vive touchpad. It works pretty decently. Here are some notes and learnings from building and testing it.
This split keyboard input was largely inspired by the touch-typing muscle memory we develop from typing on our phones. The first version I built was mapped, unmagnified, onto the HTC Vive touchpad model. It was pretty inaccurate, but I think a robust autocorrect (one that perhaps takes key distance into account) could make this version more usable.
Muscle memory isn’t enough on touchpads. We rely on our vision to track our fingers to the appropriate key when there’s no tactile indicator. (A quick way to test this is to type a full sentence on your touchscreen phone with your eyes closed; it’s tough, but autocorrect helps.) So without the ability to “pre-see” where your finger will touch the touchpad, the benefit of your muscle memory is severely diminished.
I decided to blow it up. A 1:1-sized input mapped onto the actual touchpad was frustrating to aim at. Blowing up the size also increased comfort significantly and seemingly improved accuracy (anecdotally), which is interesting because you’re still using the input in exactly the same way. The larger it is, the less abrupt and erratic changes feel.
Eventually I ended up with a system that uses your muscle memory to get you roughly in the right place and lets you slide to course-correct. It’s reasonably accurate, and not as slow as the “laser-pointer” style input. It also gives haptic feedback on successful key entry (touchpad unpress).
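The slide-and-release flow above can be sketched roughly like this. This is a minimal illustration, not code from the actual project: the half-keyboard layout, the normalized `[-1, 1]` touchpad coordinates, and names like `KEY_CENTERS` and `nearest_key` are all assumptions for the sake of the example.

```python
# Hypothetical key centers for part of a split-keyboard half, in
# normalized touchpad coordinates (x, y in [-1, 1]).
KEY_CENTERS = {
    "q": (-0.8, 0.8), "w": (-0.4, 0.8), "e": (0.0, 0.8),
    "a": (-0.8, 0.0), "s": (-0.4, 0.0), "d": (0.0, 0.0),
}

def nearest_key(x, y):
    """Return the key whose center is closest to the touch point."""
    return min(KEY_CENTERS, key=lambda k: (KEY_CENTERS[k][0] - x) ** 2
                                        + (KEY_CENTERS[k][1] - y) ** 2)

def on_touch_move(x, y, state):
    # While the finger slides, keep updating the highlighted key so the
    # user can course-correct before committing.
    state["hover"] = nearest_key(x, y)

def on_touchpad_release(state):
    # Committing on release (touchpad unpress) is the moment the real
    # system fires its haptic pulse; here we just return the chosen key.
    return state.pop("hover", None)
```

A tap that lands near a key’s center but drifts during the slide still commits whatever key is highlighted at the moment of release, which is what makes the course-correction possible.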
It’s worth noting the Vive has a similar secondary text input style that uses both touchpads and maps to the big keyboard. It takes a bit of practice, but it also feels really inaccurate to me. It might benefit from the removal of some of the non-alphabet keys in that mode.
Some ideas for improving accuracy:

- Autocorrect that takes nearby keys into account.
- Take the statistical odds that a given letter follows the entered letter and increase that key’s hit radius (silently) to encourage taps in that direction. (For example, U has a very high probability of following Q.)
- Swype-style input across the touchpad. You’re generally sliding across the pad anyways. This could actually be a winner and could be worth testing out at a later time.
- Back to muscle memory, something that could be interesting is training a deep learning model to recognize the general tap patterns of, say, the top 10,000 words. If you mapped out each word’s key-position sequence in a coordinate space, I assume the majority of words would form fairly unique patterns across the space. This combined with a robust spellchecker could be doubly promising.
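The bigram-radius idea in the list above might look something like this. The probabilities and the `BASE_RADIUS` value are made up for illustration; a real version would use letter-pair frequencies from a corpus.

```python
# Hypothetical bigram probabilities: P(next letter | previous letter).
BIGRAM_PROB = {("q", "u"): 0.98, ("t", "h"): 0.35, ("t", "z"): 0.001}

BASE_RADIUS = 0.25  # nominal hit radius in normalized touchpad units

def effective_radius(prev_letter, candidate):
    # Silently grow a key's hit radius in proportion to how likely it is
    # to follow the previously entered letter, so near-misses in a
    # probable direction still register.
    p = BIGRAM_PROB.get((prev_letter, candidate), 0.05)
    return BASE_RADIUS * (1.0 + p)
```

After typing Q, the U key’s effective radius nearly doubles, while an unlikely key like Z barely changes.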
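Even before reaching for deep learning, the tap-pattern idea can be sketched as a nearest-pattern lookup: store each word as the sequence of its key coordinates, then score a tap sequence against every stored pattern. The key grid and the two-word vocabulary here are toy assumptions.

```python
# Hypothetical key positions on a coordinate grid (col, row).
KEY_POS = {"t": (4, 0), "h": (5, 1), "e": (2, 0), "a": (0, 1),
           "n": (5, 2), "d": (2, 1)}

def pattern(word):
    """A word's ideal tap pattern: the sequence of its key centers."""
    return [KEY_POS[c] for c in word]

VOCAB = {w: pattern(w) for w in ["the", "and"]}  # toy top-N word list

def score(taps, pat):
    # Sum of squared distances between each tap and the expected key.
    return sum((tx - px) ** 2 + (ty - py) ** 2
               for (tx, ty), (px, py) in zip(taps, pat))

def best_word(taps):
    # Only compare against words with the same number of taps.
    candidates = {w: p for w, p in VOCAB.items() if len(p) == len(taps)}
    return min(candidates, key=lambda w: score(taps, candidates[w]))
```

A sloppy three-tap sequence that lands roughly on T, H, E should resolve to “the” even if no individual tap hit its key exactly, which is the same intuition behind the deep-learning version.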
Most of the options out there are still pretty clunky, and nobody seems to have nailed it yet. Lots of interesting ideas are springing up, though.
Google’s Drum Keyboard apparently works surprisingly well. I’d imagine part of it is being able to track your own position pre-hit (something that’s pretty problematic with a pure touchpad-based version, since you don’t know exactly where you’re going to hit until you do). It probably also benefits from touch-typing muscle memory.