Apple knows why you lose your mouse pointer. Desktop Neo concept has another idea.
OK, so first, this is pretty cool: an Apple patent that was just approved helps me feel a little better about my inability to keep track of my mouse pointer. The problem is biological, and Apple has patented a solution using eye-tracking. The difficulty we sometimes have finding the cursor on the screen, Apple asserts, can be due to Troxler's fading: the phenomenon you notice happening to the purple blobs if you stare at the crosshair below long enough. You've seen this illusion before:
We can become blind to something like a mouse cursor thanks to the neural adaptation behind the Troxler effect, our blind spots, and the very narrow region of our vision that is actually high-fidelity.
Check out the patent for Apple's system that mitigates these effects before the user begins hopelessly scanning the screen or flinging the mouse around to get a bead on their cursor. AppleInsider has a decent article you should check out.
I saw this reported just hours after coming across another proposed application of eye-tracking to help with point-and-clicking.
Lennart Ziburski’s Desktop Neo concept has been making the rounds this week. There’s a lot there, but one bit of the concept describes an integration of what Ziburski prefers to call gaze-tracking together with multitouch input to afford easy, fast clicking without sacrificing pixel-perfect accuracy.
The author acknowledges that eye-tracking in practice, even with dedicated hardware, is not precise enough to have us throw away our trackpads, so we don't. Instead, gaze guides the pointing and the trackpad finishes it.
Gaze at a spot to effectively highlight a circle of some size on the screen (the size depending on attainable accuracy), then use your finger on a capacitive touchpad to zero in on the exact pixel you want to click.
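The coarse-plus-fine idea above can be sketched in a few lines. This is a minimal illustration, not anything from Ziburski's concept: the `GAZE_RADIUS` value and the clamping behavior are my own assumptions about how a gaze circle and touchpad offset might combine.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

# Hypothetical accuracy radius of the gaze circle, in pixels.
GAZE_RADIUS = 60.0

def refine_cursor(gaze: Point, pad_dx: float, pad_dy: float) -> Point:
    """Combine a coarse gaze fix with a fine touchpad offset.

    The touchpad offset is clamped so the cursor never leaves the
    circle the gaze highlighted.
    """
    dist = (pad_dx ** 2 + pad_dy ** 2) ** 0.5
    if dist > GAZE_RADIUS:
        scale = GAZE_RADIUS / dist
        pad_dx, pad_dy = pad_dx * scale, pad_dy * scale
    return Point(gaze.x + pad_dx, gaze.y + pad_dy)
```

So a glance at roughly (500, 300) plus a small rightward swipe lands the cursor a few pixels right of the gaze point, while a wild swipe only carries it to the edge of the gaze circle.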
There is no lost cursor to locate: where you gaze, the cursor is. As the author points out, gazing becomes the equivalent of focusing. Will we still need Apple's solution once we've got eye-tracking on board?
Once eye-tracking becomes available in consumer devices, this mind-meld seems likely to improve with every modest increase in accuracy.
I wonder what is attainable in software today with the 1280×720 camera in my 15" MacBook Pro looking at me. UX firms run studies with this same hardware and supposedly gather usable eye-tracking data, albeit under interrogation-room lighting and with instructions for the tester to act as if their head were in a vise.