WorldViz Vizard and Meta Quest Pro for Academic Research
July 27, 2023
Collecting eye, face, and hand tracking data in VR and AR is now more powerful than ever with Vizard and the Meta Quest Pro, providing unprecedented opportunities for academic researchers.
Unlocking Meta Quest’s Data Collection Power with Vizard
Meta Quest Pro, when coupled with WorldViz Vizard’s Python-powered toolkit, represents a monumental step in data collection capabilities within a VR and AR capable headset. Whether it's tracking the subtle movements of a hand or analyzing eye movement for behavioral studies, Meta Quest Pro is a force to be reckoned with in the world of virtual and augmented reality.
Meta Quest Pro puts a world of possibilities at your fingertips. It is an astoundingly capable headset that merges virtual reality (VR) with video pass-through augmented reality (AR) in an affordable unit. Equipped with best-in-class inside-out head tracking technology, it tracks position and orientation accurately and with minimal latency. Even more impressive is its ability to accurately track hands, fingers, and facial features, as well as (somewhat less accurately) eye movements.
Empowering Research with Python
What makes Meta Quest Pro even more powerful is its integration with the WorldViz Vizard VR Toolkit, which is Python-powered. This opens up a world of possibilities for researchers, who can now harness Python’s rapid development paradigm to create experiments, collect data, and analyze it efficiently.
Python’s ease of learning, readability, and a massive scientific community make it the top choice for scientific research. With a plethora of libraries such as numpy and matplotlib, data analysis and visualization are at the fingertips of even novice programmers. WorldViz Vizard combines Python's strengths with Meta Quest Pro’s data collection capabilities, offering a sophisticated 3D render engine through a friendly Python interface.
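As a small illustration of that workflow, a few lines of numpy are enough to summarize a stream of gaze samples. The timestamps and region labels below are invented for the sketch; in practice they would come from the headset's recorded data, and matplotlib could plot the result.

```python
import numpy as np

# Hypothetical 50 Hz gaze samples: timestamp (seconds) and the
# region-of-interest each sample landed on (0 = background, 1 = target).
timestamps = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])
regions    = np.array([0,    1,    1,    1,    0,    0])

# Dwell time per region: sum the sample intervals spent in each region.
# The final sample is assumed to last one more 20 ms interval.
dt = np.diff(timestamps, append=timestamps[-1] + 0.02)
dwell_target = dt[regions == 1].sum()
print(f"Dwell time on target: {dwell_target:.2f} s")
```

The same few lines scale to minutes of 90 Hz data without modification, which is exactly the kind of quick, readable analysis that draws researchers to Python.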
Simple Experiment Creation with SightLab VR Pro
The addition of SightLab VR Pro further empowers users by simplifying the process of experiment creation, data recording, playback, and visualization. It offers an intuitive interface for designing experiments, and its multi-user capabilities open up new possibilities for collaboration and data collection, significantly simplifying the technical challenges of running complex VR and AR experiments.
Vizard further extends the Meta Quest Pro’s capabilities with support for over a hundred different VR peripherals. This comprehensive compatibility ranges from physiological measurement devices and functional near-infrared spectroscopy (fNIRS) to a variety of input devices, tracking devices, and displays.
Meta Quest Pro does not rely on the traditional chained kinematic model used by gloves; instead, it employs absolute coordinates for tracking. This means the wrist and all finger tracking points are tracked with exceptional accuracy, right down to the fingertips. With a data sample rate of up to 90 Hz, the level of detail captured is remarkable.
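Because the joints arrive as absolute world coordinates rather than as a chain of joint angles, simple geometric tests become trivial. The sketch below (plain numpy; the fingertip positions and the 2 cm threshold are assumptions for illustration, not Vizard's actual API) detects a pinch gesture by thresholding the thumb-tip-to-index-tip distance on each 90 Hz frame:

```python
import numpy as np

PINCH_THRESHOLD = 0.02  # meters; assumed cutoff, tune per study

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Return True when the two fingertip positions (absolute world
    coordinates, in meters) are closer together than the threshold."""
    thumb_tip = np.asarray(thumb_tip, dtype=float)
    index_tip = np.asarray(index_tip, dtype=float)
    return bool(np.linalg.norm(thumb_tip - index_tip) < threshold)

# Hypothetical fingertip samples from two frames:
print(is_pinching([0.10, 1.20, 0.30], [0.11, 1.21, 0.30]))  # ~1.4 cm apart
print(is_pinching([0.10, 1.20, 0.30], [0.15, 1.25, 0.30]))  # ~7 cm apart
```

With a glove's chained kinematic model, the same test would require forward kinematics through every joint; with absolute coordinates it is a single distance computation.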
The face-tracking abilities of Meta Quest Pro are extraordinary. With the ability to record over 40 facial features at a data sample rate of up to 90 Hz, it captures nuanced expressions that can be used immediately for research.
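Facial features of this kind are typically delivered as per-frame expression weights. The sketch below is a toy event detector over such a stream; the feature names, the 0-1 weight scale, and the 0.5 cutoff are all assumptions for illustration rather than the headset's actual output format.

```python
# Hypothetical per-frame facial expression weights (0-1 scale).
# The real stream would carry 40+ such features per 90 Hz frame.
frames = [
    {"lip_corner_puller_L": 0.10, "lip_corner_puller_R": 0.12},
    {"lip_corner_puller_L": 0.75, "lip_corner_puller_R": 0.80},
    {"lip_corner_puller_L": 0.70, "lip_corner_puller_R": 0.68},
]

SMILE_THRESHOLD = 0.5  # assumed cutoff on the 0-1 weight scale

def is_smiling(frame):
    """A frame counts as a smile when both lip-corner weights pass the cutoff."""
    return (frame["lip_corner_puller_L"] > SMILE_THRESHOLD
            and frame["lip_corner_puller_R"] > SMILE_THRESHOLD)

smile_frames = [i for i, f in enumerate(frames) if is_smiling(f)]
print(smile_frames)
```

This is the "immediately usable" quality of the data: expression events can be flagged online during an experiment or offline in a few lines of analysis code.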
Meta Quest Pro’s built-in eye-tracking sensors track the gaze vector of each eye. Combined with head tracking, the headset gives instant access to the gaze vector in world-space coordinates, or optionally to the 3D regions-of-interest it intersects in the virtual or augmented scene. Although Meta’s published specifications on tracking frame rate are scarce, our estimates suggest it is sufficient for many behavioral-science studies of attention.
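The combination works because the eye tracker reports gaze in head-local coordinates, and the head tracker supplies the transform into world space. The numpy sketch below shows the underlying math under assumed conventions (head-local +Z as "forward", a 3x3 rotation matrix for head orientation); Vizard handles this transform for you, so this is only a conceptual illustration.

```python
import numpy as np

def gaze_to_world(eye_dir_local, head_rotation, head_position):
    """Rotate a head-local gaze direction into world space.

    eye_dir_local : unit gaze vector from the eye tracker, in head
                    coordinates (assumed convention: +Z is forward).
    head_rotation : 3x3 world-space rotation matrix of the head.
    head_position : world-space head position (the gaze ray's origin).
    Returns (ray origin, ray direction) in world coordinates.
    """
    direction = head_rotation @ np.asarray(eye_dir_local, dtype=float)
    return np.asarray(head_position, dtype=float), direction

# Head yawed 90 degrees about the vertical axis, eyes looking
# "straight ahead" locally, head at standing eye height (1.7 m).
yaw90 = np.array([[0.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0],
                  [-1.0, 0.0, 0.0]])
origin, direction = gaze_to_world([0, 0, 1], yaw90, [0, 1.7, 0])
print(np.round(direction, 3))
```

Intersecting the resulting world-space ray with scene geometry is what turns raw gaze data into region-of-interest hits.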
Step into Augmented Reality
Meta Quest Pro’s video pass-through AR, while not its most technologically advanced feature, is a gateway to blending 3D objects into a stereoscopic, real-time video experience. With a frame rate of 30 Hz, it's not trying to compete with other dedicated AR displays (e.g., Varjo), but it provides a rich playground for learning about this field.