This year at CES I had the opportunity to meet with Tobii's Vice President of Media and Influencers. For those not familiar with Tobii, they are the world leader in eye tracking technology and already have several eye tracking devices on the market. During my time with them, the Tobii team asked me to wear a modified HTC Vive VR headset that incorporates specialized hardware (and software) to enable their eye tracking technology. They then walked me through a series of demos illustrating how eye tracking can vastly improve the VR experience.
The first demo had me picking up a stone and trying to knock over bottles scattered across the field of view. With eye tracking off, the stone had no apparent weight and my attempts to knock down the bottles were inadequate at best. With eye tracking enabled, it was simply a matter of looking at my target and tossing the stone. The eye tracking let me focus on my target and, in turn, helped the software identify what I was aiming for. While the process of picking up and throwing the stone hadn't physically changed, the software's understanding of what I was looking at and what I was trying to accomplish gave the stone a certain weight and made knocking over the bottles a breeze.
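Tobii didn't share the exact logic behind this demo, but the core idea of gaze-assisted aiming is straightforward: find the object nearest to where the player is looking, then nudge the throw toward it. The sketch below is a minimal, hypothetical illustration of that idea; the function names, the 5-degree gaze cone, and the assist strength are my own assumptions, not Tobii's implementation.

```python
import math

def normalize(v):
    """Return the unit vector pointing in the same direction as v."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def pick_gaze_target(eye_origin, gaze_dir, targets, max_angle_deg=5.0):
    """Return the target closest to the gaze ray, or None if nothing lies
    within max_angle_deg of where the player is looking."""
    gaze_dir = normalize(gaze_dir)
    best, best_angle = None, max_angle_deg
    for t in targets:
        to_target = normalize(tuple(t[i] - eye_origin[i] for i in range(3)))
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(gaze_dir, to_target))))
        angle = math.degrees(math.acos(dot))
        if angle < best_angle:
            best, best_angle = t, angle
    return best

def assisted_throw(hand_velocity, hand_pos, target, assist=0.6):
    """Blend the raw throw direction toward the gazed-at target.
    'assist' controls how strongly the aim is corrected (0 = none, 1 = full)."""
    if target is None:
        return hand_velocity
    speed = math.sqrt(sum(c * c for c in hand_velocity))
    ideal_dir = normalize(tuple(target[i] - hand_pos[i] for i in range(3)))
    raw_dir = normalize(hand_velocity)
    blended = normalize(tuple((1 - assist) * raw_dir[i] + assist * ideal_dir[i]
                              for i in range(3)))
    return tuple(speed * c for c in blended)
```

A real implementation would also have to account for gravity and the arc of the throw, but even this simple blend shows how knowing the player's gaze lets the software "understand" the intended target.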
The next demo showed how interacting with an on-screen avatar can feel more lifelike. In most VR and other digital representations of a person, there is an almost hollow expression. When you interact with someone in real life, you look into their eyes, not off into space. In the demo I was able to glance back and forth between two avatars, and each would recognize that I was looking at them (not just in their general direction) without my having to move my head. Instantly everything felt more lifelike, something VR regularly tries to achieve but still falls short of.
Another demo had me sitting in a virtual theater with a selection of movies to pick from. For anyone familiar with Netflix or similar entertainment software, choosing something to watch means dragging a pointer to the title you want and then clicking. The Tobii team described this to me as a three-step process: your eyes find what you want to watch, you move the cursor to it, and then you click. In VR, your forehead typically becomes the pointing device; you aim your head at whatever you want to interact with. Pointing with your forehead breaks immersion, forcing a rather unnatural response to something you can see directly on the screen. With eye tracking built in, you simply look at what you want to watch and click, much like a smartphone, where you look at an icon and press it without the artificial step of using a mouse (or similar input device) to select it. Eye tracking reduces the interaction to just two motions: look and click.
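To make the "look and click" idea concrete, here is a minimal sketch of gaze-based menu selection, assuming the headset can report a 2D gaze point in screen coordinates. The tile layout and function names are hypothetical and are not part of Tobii's SDK.

```python
def tile_under_gaze(gaze_point, tiles):
    """Return the menu tile containing the 2D gaze point, if any.
    Each tile is (name, (x, y, width, height)) in screen coordinates."""
    gx, gy = gaze_point
    for name, (x, y, w, h) in tiles:
        if x <= gx <= x + w and y <= gy <= y + h:
            return name
    return None

def on_click(gaze_point, tiles):
    """'Look and click': the click simply confirms whatever the eyes selected."""
    choice = tile_under_gaze(gaze_point, tiles)
    if choice is not None:
        print(f"Playing {choice}")

# Example: two movie posters in the virtual theater
tiles = [("Movie A", (100, 100, 200, 300)), ("Movie B", (350, 100, 200, 300))]
on_click((180, 250), tiles)   # prints "Playing Movie A"
```

The point is that no cursor ever has to be steered: the gaze point does the pointing, and the button press only confirms the choice.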
The eye tracking worked flawlessly, following my eyes effortlessly and quickly identifying what I was looking at. In fact, it was one of the most natural innovations to VR I encountered at the show. Of course, there are far more uses for eye tracking than games alone; it can also serve research or act as an assistive technology for people with disabilities.
It's difficult to convey in words what a game changer eye tracking is, or how many diverse uses it will have in the future. The dream for most VR users is an experience that is more lifelike and natural, one that takes them out of the real world and its limitations and places them in a virtual world with almost endless possibilities. Eye tracking can deliver that experience and is one of the next major leaps in VR technology.