Eye Tracking and Video Games
As we’ve discussed in previous blogs, eye tracking has a wide variety of potential applications. One such application that is quickly gaining traction is the use and integration of eye tracking in video games. There are many opportunities to use the hardware, software, and data provided by companies like Gazepoint to improve human-computer interactions, benefiting the creators and players of video games alike. Read on to learn more.
How Eye Tracking Tricks Our Brains
Eye tracking taps into our natural eye movements and the shortcuts our brain uses in visual processing. One of the key features that eye tracking in video games can take advantage of is saccades: the rapid darting motions our eyes make from one object to another. Because we can only focus on a small area at a time, we employ saccades when reading or looking around a room to quickly build a picture of what we're seeing. Monitoring saccades allows eye-tracking software to understand what we're focused on and specifically aware of at a given time, which means designers can concentrate their attention on those areas rather than spend time and effort on extraneous details.
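To make the idea concrete, here is a minimal sketch of how software might separate saccades from fixations using a simple velocity threshold (a common approach known as I-VT). The sample rate, threshold value, and gaze format are illustrative assumptions, not details of any particular eye-tracking product.

```python
import math

SAMPLE_RATE_HZ = 60          # assumed eye-tracker sampling rate
SACCADE_THRESHOLD = 300.0    # deg/s; an assumed velocity cutoff for saccades

def classify_samples(gaze_points):
    """Label each gaze sample as part of a fixation or a saccade.

    gaze_points: list of (x, y) gaze angles in degrees.
    The first sample defaults to "fixation" since it has no predecessor.
    """
    labels = ["fixation"]
    dt = 1.0 / SAMPLE_RATE_HZ
    for (x0, y0), (x1, y1) in zip(gaze_points, gaze_points[1:]):
        # Angular velocity between consecutive samples, in degrees/second
        velocity = math.hypot(x1 - x0, y1 - y0) / dt
        labels.append("saccade" if velocity > SACCADE_THRESHOLD else "fixation")
    return labels
```

Runs of consecutive "fixation" labels can then be grouped into fixations, which is the signal designers care about when deciding which details deserve polish.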
Aside from the physical function of our eyes, there are also a few natural tendencies we have as humans that eye tracking software can take advantage of. One example is movement and the way it automatically draws our attention. Movement can be used to subtly guide a user without their conscious notice. Another example is “joint attention,” or our tendency to look at what someone else is looking at as a social cue. Joint attention is a behavior often manipulated by magicians to direct our eyes away from the “magic.”
In both cases, the movements we make in pursuit of these tendencies are largely unconscious and involuntary. That means those tendencies – and movements – can be used and manipulated by video game designers without disrupting the user’s game experience.
What Eye Tracking Can Do
One of the increasingly popular applications of eye tracking in video games is foveated rendering. This process takes advantage of the fact that we can only truly focus on small areas at a time by fully rendering only the area of the virtual reality space the user is looking at directly.
To the user, the environment looks high resolution and exceptionally detailed, but it requires less computing power because the areas outside the user's focus are left at low resolution. Foveated rendering gives virtual reality (VR) and augmented reality (AR) games the opportunity to expand their capabilities while lowering the cost and energy requirements involved.
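The core of foveated rendering can be sketched in a few lines: pick a shading resolution for each screen region based on how far it sits from the gaze point. The tile sizes, eccentricity bands, and scale factors below are assumed values for illustration; real engines tune these carefully against the headset's optics.

```python
import math

def shading_scale(tile_center, gaze, px_per_degree=40.0):
    """Return the fraction of full resolution to render a screen tile at.

    tile_center, gaze: (x, y) screen positions in pixels.
    The eccentricity bands roughly mirror how visual acuity falls off
    outside the fovea (only the central ~5 degrees see full detail).
    """
    dist_px = math.hypot(tile_center[0] - gaze[0], tile_center[1] - gaze[1])
    eccentricity_deg = dist_px / px_per_degree
    if eccentricity_deg < 5.0:      # foveal region: full detail
        return 1.0
    elif eccentricity_deg < 15.0:   # parafoveal: half resolution
        return 0.5
    else:                           # periphery: quarter resolution
        return 0.25
```

Because peripheral tiles are shaded at a quarter of full resolution, most of the frame costs far less to render, which is where the energy and performance savings come from.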
The opportunities for visual manipulation, which we mentioned earlier, come into play with the use of visual cues. Movie and TV directors already use a variety of visual cues to direct the audience to pay attention to specific items or situations, and video game designers have a similar opportunity, especially with the use of eye tracking. Not only can you understand what a user is paying attention to, but you can also direct them to look or move elsewhere.
As an example, there are several VR game designers working with subtly flashing visual cues in a user’s peripheral vision to encourage them to turn their attention – and their body – in a specific direction. This way, the VR headset can prevent the user from running into walls or other obstacles within a room while allowing them to walk freely – or seem to walk freely – through the VR environment.
Eye tracking has a broad range of potential applications when it comes to usability and an improved user experience within video games. Through eye tracking and programming, a video game can utilize a player’s input to predict what is relevant to them or what action they would like to take.
One way eye tracking can contribute to a better human-computer interaction is through simplified, predictive menus. Many video games today offer a wealth of options, all buried in a multitude of menus that a player has to sort through to find the one they want, whether it's a new game, a new viewpoint, or a new weapon.
Through eye tracking, video game designers can simplify these menu designs. The user’s gaze can be utilized to predict what they want or need, opening a smaller, more relevant menu of options.
Integrated, Improved Gameplay
Another way eye tracking can be utilized is within gameplay itself. By tracking a player’s gaze, designers have input indicating where the player is looking, what they are interested in, and what they intend to do — a more direct option than depending on keystrokes or the movement of a joystick.
Integrating this capability into gameplay can mean giving the player the ability to move, aim, shoot, and more based solely on where they are looking and how long they are fixating on a certain area. Not only could this make gameplay better and more intuitive for an existing audience, but it also makes video games more accessible to potential players with disabilities who are unable to use joysticks or other handheld devices.
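One common building block for gaze-only input is dwell-time selection: an action fires once the player's gaze has rested on a target long enough to signal intent. The sketch below is a hypothetical illustration; the target layout, dwell threshold, and gaze feed are all assumptions rather than any vendor's API.

```python
DWELL_THRESHOLD_S = 0.8  # assumed: how long a fixation counts as "intent"

class DwellSelector:
    def __init__(self, targets, threshold=DWELL_THRESHOLD_S):
        # targets: dict mapping an action name -> (x, y, width, height) rect
        self.targets = targets
        self.threshold = threshold
        self.current = None   # target currently under the gaze
        self.dwell = 0.0      # time accumulated on that target

    def update(self, gaze_x, gaze_y, dt):
        """Feed one gaze sample plus elapsed time; return an action name
        once dwell exceeds the threshold, otherwise None."""
        hit = None
        for name, (x, y, w, h) in self.targets.items():
            if x <= gaze_x < x + w and y <= gaze_y < y + h:
                hit = name
                break
        if hit != self.current:           # gaze moved to a different target
            self.current, self.dwell = hit, 0.0
            return None
        if hit is None:
            return None
        self.dwell += dt
        if self.dwell >= self.threshold:
            self.dwell = 0.0              # reset so the action fires once
            return hit
        return None
```

A game loop would call `update()` with each new gaze sample; for accessibility, the same pattern lets a player aim, shoot, or navigate menus without touching a controller.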
Invest in Smart Design With Gazepoint
If you are interested in exploring the possibilities of video games and eye tracking, start with Gazepoint. We offer research-grade eye-tracking technology at a consumer price point so you can start experimenting with eye tracking and implementing it in your projects effectively and at a lower cost. Place your order today!