Hacking together eye tracking for your usability lab for less than $150
I was lucky enough to catch a talk by Ben Babcock of jet.com last year where he walked us through his lab setup for interviewing users and usability testing. Their inspiring and cozy space, made to look like an apartment and not a lab, also featured something I had never worked with before: an eye tracking setup for their test computer.
Since we were in the middle of moving our own office’s usability testing space, I thought this would be the perfect addition! Imagine being able to track where a user’s eyes land on the page: what catches their attention, and what gets missed as they move through your prototype experiences. Obviously, eye tracking wouldn’t be that insightful as a standalone data-gathering method, but combined with a well-designed test, it could definitely enhance understanding.
Great! I was convinced. Time to browse for a device and get it sent to the office. I quickly found Tobii, which seems to be one of the main companies in the space, and tried to find pricing on their website. I couldn’t, other than a form to contact their sales team. Hm. Usually that is not a good sign. After looking at a few different review sites, I didn’t manage to track down the exact price of the Tobii X2-30, the suggested model for usability labs, but I got the picture that it is likely priced in the $5,000 to $10,000 range… much more than I would be able to get approval for.
It was at this point, dejected and nearly defeated, that I happened to stumble across Tobii’s gaming sister-site. Not being a gamer myself, I had to piece this together: apparently people record or stream themselves playing video games and include an overlay of their gaze over the gameplay… and somehow I doubted they were dropping $5k+ to record themselves playing Fortnite. It turns out I was right, because sure enough, I came across this little gem: the Tobii Eye Tracker 4C, available for the shockingly low price of $149. I decided to give it a shot; how bad could it be?
While I waited for it to arrive, I did some digging into what it could and couldn’t do. It doesn’t have the precision of some of the professional models, but that’s not a big deal for our purposes; we’re not tracking down to the pixel on most of the prototypes we build. It also doesn’t build heatmaps of gaze data for you; for that, you’ll need one of the pro devices and different software. But again, since a lot of our testing involves multi-screen flows, and not just a single landing page, this was not a showstopper for us.
As for software, you can download “Streaming Gaze Overlay,” a tool built for gamers to customize how their gaze information displays on screen while playing. Thankfully, one of its features makes the gaze data available to video recording software while hiding the tracker ball from the monitor display itself: essential when you are conducting usability testing and don’t want to distract the user with a blue glowing orb that follows their eyeballs. By following the steps on the download page for the Streaming Gaze Overlay software, and also installing a really great (and free!) video recording program called OBS Studio, I was able to get my first recording, complete with eye tracking data, up and running in less than half an hour. One small catch: the eye-tracking software only runs on a PC, so as a Mac-only office, we had to do some digging with our IT team to find a computer compatible with the tracker.
Now, a couple dozen usability tests later, we’ve developed a process that works well. When a user first arrives, as we explain the session and how we’ll be testing, we mention that alongside recording the screen and a webcam stream of their face, we also capture eye tracking data to see what kinds of things catch their eye. So far not a single user has declined, but if one did, we would simply leave the tracker off for that session.

While we set up the recording and introduce them to the desktop we use for usability testing, we run a quick calibration using a guest profile in the Tobii software. Calibration takes less than 30 seconds and is always a small bright spot for the user, as it is pretty fun making balls explode with your eyes (you’ll see what I mean if you set one up in your lab!).

It’s rare that the eye tracking reveals a game-changing insight, but it almost always adds depth and richness to the story as a user works with your prototype. Revealing where a user expects a button or filter to be, showing whether a user even registers a notification, or highlighting an element that proves distracting: these are the kinds of insights you might regularly uncover.

To protect the privacy of the users we worked with, I won’t be sharing any of those sessions here, but I have included a sample video I took while testing out the tracker. It was amazing to see what my eyes were doing without my knowledge, even though I knew I was recording! At one point you can see me look for the lightbox’s ‘X’ on the wrong side, and on one listing you will notice that I am apparently quite taken with a lamp…
I hope this helps you set up a new tool in your own usability lab, and please click on the clapping hands if you found this useful!
Note: These days it can be hard to take reviews of specific products seriously, thanks to affiliate incentives. I have linked to two Tobii products in this story to help readers find the exact products I am talking about, but I have no affiliate relationship with them whatsoever.