Is now the time to buy a VR Headset with built-in eye-tracking?
July 25, 2019
Whether or not you're already using virtual reality eye-tracking technology in your work, it's definitely worth considering an investment in the newest breed of hardware. The cost/benefit ratio has never been better. We can highly recommend two paths, either of which will cost about $2,000 and provide you with robust, easy-to-use VR headset-based eye tracking. We discuss these further below.
Broadly speaking, eye tracking and virtual reality are a promising combination. While eye-tracking technologies in various forms have been available for decades, including VR headset integrations, we're at a point where eye-tracking data is becoming no more difficult to collect and use than the head and hand tracking data that has already become ubiquitous with VR gear.
Broad benefits of VR eye-tracking:
Foveated rendering: Your fovea is the center of your vision, and this buzzword refers to the ability to measure eye direction quickly, concentrate rendering detail near that center spot, and reduce detail elsewhere. This tradeoff means more detail (more GPU time spent) where it matters and less where your retina essentially can't resolve detail in the first place. Perhaps the most important takeaway, however, is that a VR eye tracker capable of supporting foveated rendering must necessarily have high accuracy and low latency, meaning it's going to be a strong offering for other purposes such as research.
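To make the tradeoff concrete, here is a minimal sketch of the core idea: pick a shading rate for a screen region based on its angular distance (eccentricity) from the tracked gaze point. The thresholds and the degrees-per-pixel factor are illustrative assumptions, not values from any particular headset or renderer.

```python
import math

def shading_rate(region_center, gaze_point, degrees_per_pixel=0.02):
    """Return the fraction of full shading detail to spend on a screen region.

    region_center, gaze_point: (x, y) pixel coordinates.
    degrees_per_pixel: illustrative conversion from pixels to visual angle.
    """
    dx = region_center[0] - gaze_point[0]
    dy = region_center[1] - gaze_point[1]
    eccentricity_deg = math.hypot(dx, dy) * degrees_per_pixel
    if eccentricity_deg < 5.0:      # foveal region: full detail
        return 1.0
    elif eccentricity_deg < 20.0:   # parafoveal region: half detail
        return 0.5
    else:                           # periphery: quarter detail
        return 0.25

print(shading_rate((960, 540), (960, 540)))  # region at gaze center → 1.0
```

Note how this depends directly on accurate, low-latency gaze data: if the reported gaze point lags or drifts, the full-detail region no longer covers the fovea and the savings become visible artifacts.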
Enhanced interactions: For multi-user situations, we're nearing the point where we can expect all users to share their eye direction in addition to head direction, which stands to strengthen the social connection and interactions between users. For individual users, active development in user-interface methods will improve the ability to select objects at a distance by combining gaze direction with hand- or speech-based controls.
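A simple way to sketch gaze-based selection is to pick the object whose direction from the eye falls within a small angular threshold of the gaze ray; a hand trigger or voice command would then confirm the pick. This is an illustrative sketch, not a Vizard API: the object dictionary and the 3-degree threshold are assumptions for the example.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def pick_gazed_object(eye_pos, gaze_dir, objects, threshold_deg=3.0):
    """Return the name of the object closest to the gaze ray, or None.

    objects: dict of name -> (x, y, z) world position.
    """
    best, best_angle = None, threshold_deg
    for name, pos in objects.items():
        to_obj = tuple(p - e for p, e in zip(pos, eye_pos))
        ang = angle_between(gaze_dir, to_obj)
        if ang < best_angle:
            best, best_angle = name, ang
    return best

objects = {"sphere": (0.1, 0.0, 5.0), "cube": (3.0, 0.0, 5.0)}
print(pick_gazed_object((0, 0, 0), (0, 0, 1), objects))  # → sphere
```

An angular threshold (rather than a pixel radius) keeps selection behavior consistent regardless of how far away the target object is.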
Better comfort: As part of the per-user calibration process, VR eye-tracking technologies estimate an important parameter, inter-pupillary distance (IPD), that tells the rendering software how to correctly render a scene for each user's individual differences. Getting this parameter right can improve the accuracy of size and distance cues in the simulated scene as well as reduce eye strain.
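The estimate itself is straightforward once a tracker reports 3D eye positions during calibration; a minimal sketch, assuming illustrative eye positions in meters (not values from any specific device):

```python
import math

def estimate_ipd(left_eye_pos, right_eye_pos):
    """Return inter-pupillary distance in millimeters.

    left_eye_pos, right_eye_pos: (x, y, z) eye positions in meters,
    as a calibration routine might report them.
    """
    return 1000.0 * math.dist(left_eye_pos, right_eye_pos)

left = (-0.032, 0.0, 0.0)   # illustrative calibration read-outs
right = (0.032, 0.0, 0.0)
print(round(estimate_ipd(left, right), 1))  # → 64.0
```

The resulting value would then set the separation of the two virtual cameras so that stereo disparity matches the user's actual eyes.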
Data & analytics: At WorldViz, we're most excited by the fact that more users can get access to virtual reality eye-tracking technology and leverage real-time or recorded eye movement data. We believe that the data quality provided by today's commercial devices meets the needs of many researchers. Our goal is to make this data available in a standard manner and with as much low-level control exposed as possible across manufacturers.
Commercially available VR eye tracking options:
To our knowledge as of this writing, there are just three commercially available choices for virtual reality eye tracking in PC/VR headset-based VR that can be purchased today and used immediately: the HTC VIVE Pro Eye (USD 1,599, standalone VR headset), Pupil Labs (EUR 1,400, add-on to the HTC VIVE), and the Varjo VR-1 (EUR 5,995, standalone VR headset). We are aware of no solutions for Oculus-based VR headsets. Currently, WorldViz products including Vizard support both the HTC VIVE Pro Eye (as well as older Tobii-integrated VR headsets) and Pupil Labs systems.
Strengths of using Vizard for VR eye-tracking:
If you're about to take the plunge and explore the possibilities enabled by VR eye-tracking data, we invite you to try our VR toolkit Vizard. Vizard is a general-purpose development environment for scientific VR, allowing researchers and innovators to build precise and complex simulations that connect to VR headsets, CAVEs & Powerwalls, head/hand/eye trackers, and motion capture systems. We also support specialty devices such as tactile and biophysiological sensors, including EEG, EKG, GSR, and more. With an embedded Python interface, development is straightforward and open, which means you don't have to be a computer science expert to build, and you can tap a huge Python user community for numerous libraries and utilities.
Specific to VR eye-tracking, Vizard VR toolkit provides:
Hardware agnostic interface to several hardware vendors
Tight control over "motion to photon" latency
Recording and playback of eye-tracking behavior for "after action review," including 3D path review
Extensive data analytics
Feedback loops triggered by user performance, using eye or physiological sensor data
Precise experimental timing control and device synchronization
Extended support for multi-user environments
360 videos and 3D files from a wide array of sources as customizable stimuli
Recording of gaze direction data, pupil size, fixation timings, and other low level parameters
Try and see for yourself now:
You're invited to download and try a mini VR eye-tracking experiment, and we hope you'll appreciate how accessible we make building projects like this with a few lines of Python code. In fact, here's the code, which 1) randomly selects a 360 scene without replacement, 2) exposes the viewer to the scene for 30 seconds while recording head and eye movements, 3) saves the data in CSV (or optionally XLSX) format, and 4) replays the 360 scene with eye motions superimposed.
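As an illustration of those four steps, here is a plain-Python sketch of the experiment's structure. This is not the actual Vizard code: the scene filenames and the sample_tracker helper are hypothetical placeholders, and tracker samples are simulated so the script can run without a headset.

```python
import csv
import random
import time

SCENES = ["beach.jpg", "forest.jpg", "plaza.jpg"]  # hypothetical 360 images
RECORD_SECONDS = 30

def sample_tracker():
    # Placeholder for a real head/eye tracker read-out.
    return {
        "head_yaw": random.uniform(-180, 180),
        "head_pitch": random.uniform(-90, 90),
        "gaze_x": random.uniform(-1, 1),
        "gaze_y": random.uniform(-1, 1),
        "pupil_mm": random.uniform(2, 8),
    }

def run_trial(scene, out_path, duration=RECORD_SECONDS, hz=60, simulate=True):
    """Steps 2 and 3: record head/eye samples while the scene is shown, save CSV."""
    fields = ["t", "scene", "head_yaw", "head_pitch",
              "gaze_x", "gaze_y", "pupil_mm"]
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for i in range(duration * hz):
            writer.writerow({"t": i / hz, "scene": scene, **sample_tracker()})
            if not simulate:
                time.sleep(1 / hz)  # pace at the tracker rate in a real run

# Step 1: choose the scene order at random, without replacement.
order = random.sample(SCENES, k=len(SCENES))
for trial, scene in enumerate(order):
    run_trial(scene, f"trial{trial}_gaze.csv")

# Step 4 (replay with gaze superimposed) would re-read each CSV and redraw
# the recorded gaze path over its 360 scene; it is omitted in this sketch.
```

In the real experiment, Vizard's device layer replaces the simulated sampler and the replay step renders the recorded gaze path over the 360 scene.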