The initial wave of virtual reality headsets generally requires users to manually indicate how they want to interact with their new artificial worlds.
In the near future, however, VR and augmented reality systems will likely use lasers to automatically track wearers' eye movements and adjust accordingly.
USA Today reports that Silicon Valley firm Eyefluence pioneered a system that enables VR and AR users to select a virtual object with a glance and activate it with another.
Jim Marggraff, Eyefluence's CEO, equated his company's development to the evolution of computers from the keyboard to the mouse.
“The idea here is that anything you do with your finger on a smartphone you can do with your eyes in VR or AR,” Marggraff told the paper.
Eyefluence is one of several companies working on eye-tracking systems, along with Eyematic, Fove, Percept and SMI. Eyefluence's program is likely to debut next year, but the rest could still be years away.
Experts told USA Today that developers face a series of challenges in tracking users' eyes, from adjusting to varying pupil sizes to controlling VR systems' weight and power consumption.
Once those hurdles are overcome, however, the programs could effectively eliminate the disconnect wearers feel between the virtual world and the real world.
They could also allow systems to deliver targeted content or provide instant feedback to users — including first responders in life-or-death situations or manufacturers hoping to test out product designs before building them.
“We want to again change the way we interface with data,” Marggraff said.