How to Connect the Vive Focus Vision Face Tracker to SightLab for Vizard and Use It in Your VR Research

April 13, 2026

The HTC Vive Focus Vision is a powerful standalone OpenXR headset that supports eye tracking, hand tracking, stereo passthrough — and with the Face Tracker add-on, full lower-face expression tracking. When integrated with SightLab, you can record and analyze facial expressions alongside gaze, head position, hand tracking, and experimental events.

For more information, see https://help.worldviz.com/sightlab/face-tracking/

This guide walks you through:

  1. Hardware setup
  2. Streaming & OpenXR configuration
  3. Enabling face tracking in SightLab
  4. Logging and saving facial expression data
  5. Practical research use cases



1️⃣ Hardware Setup – Attaching the Face Tracker


Attach the Face Tracker Module

  1. Remove the front compartment cover of the headset.
  2. Attach the VIVE Facial Tracker module.
  3. Plug the USB-C cable into the internal USB port.
  4. Secure the cable and replace the cover.
  5. Power on the headset.
  6. In headset settings, enable facial tracking.

Important Notes

  • Without the face tracker add-on:
    You get upper-face tracking only (eyes, brows).
  • With the face tracker add-on installed:
    You get lower-face tracking (mouth, lips, cheeks, jaw, tongue).



2️⃣ Connect Vive Focus Vision to Your PC (OpenXR Required)

SightLab uses OpenXR for face tracking access.

According to the Vive Focus Vision setup documentation, you must:

Install Required Software

  1. Install Steam
  2. Install SteamVR
  3. Set SteamVR as the default OpenXR runtime
  4. Install Vive Business Streaming
  5. In Vive Business Streaming:
    • Register OpenXR
    • Enable eye + face tracking
    • Adjust bitrate (lower for WiFi)

Wired vs Wireless

  • Wired (USB-C 3.0) – Recommended for research stability
  • ⚠ Wireless – requires a strong Wi-Fi 6/6E connection
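If you want to sanity-check that SteamVR really is the active OpenXR runtime, note that the OpenXR loader identifies the active runtime through a small JSON manifest (on Windows, its path is recorded in the registry under the Khronos\OpenXR key). Here is a minimal sketch that only assumes the standard manifest shape ("runtime" → "library_path"); the sample path is illustrative and will differ on your machine:

```python
import json

def runtime_is_steamvr(manifest_text):
    """Return True if an OpenXR runtime manifest appears to point at SteamVR.

    OpenXR runtime manifests are small JSON files whose "runtime" object
    holds a "library_path"; SteamVR's runtime library lives under its
    install directory (exact file name varies by platform and version).
    """
    manifest = json.loads(manifest_text)
    library_path = manifest.get("runtime", {}).get("library_path", "")
    return "steamvr" in library_path.lower() or "vrclient" in library_path.lower()

# Example manifest shaped like SteamVR's (the path is illustrative):
sample = ('{"file_format_version": "1.0.0", "runtime": {"library_path": '
          '"C:/Steam/steamapps/common/SteamVR/bin/win64/vrclient_x64.dll"}}')
print(runtime_is_steamvr(sample))  # True
```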



3️⃣ Select Vive Focus Vision in SightLab

In SightLab GUI:

  • Open SightLab VR
  • Select “Vive Focus Vision” from headset dropdown

You can see in the configuration file (settings.py) that this maps to:

'Vive Focus Vision': 'vizconnect_config_focus_vision_openxr.py'

This ensures:

  • OpenXR pipeline is active
  • Eye + face tracking extensions are enabled



4️⃣ Adding Face Tracking to a SightLab Script

SightLab provides two modules:

For Meta Headsets:

from sightlab_utils import face_tracker_data

For HTC Headsets (Vive Focus Vision):

from sightlab_utils import face_tracker_data_htc

The HTC module is required for Vive Focus Vision.
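If a script needs to support both headset families, the choice of module can be derived from the headset selection. This hypothetical helper is purely illustrative — only the two module names come from SightLab; the selector itself is not part of the library:

```python
# Hypothetical helper mapping a SightLab headset selection to the
# face-tracking module it needs. The module names come from the
# SightLab documentation; this function is illustrative only.
def face_module_for(headset_name):
    if headset_name == "Vive Focus Vision":
        return "face_tracker_data_htc"  # HTC module
    return "face_tracker_data"          # Meta headsets

print(face_module_for("Vive Focus Vision"))  # face_tracker_data_htc
```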




Minimal Working Example (HTC)

import viz
import vizact
import viztask
import sightlab_utils.sightlab as sl
from sightlab_utils.settings import *
from sightlab_utils import face_tracker_data_htc

sightlab = sl.SightLab()

# Initialize face tracking
face_tracker_data_htc.setup()

# Continuously update facial tracking. Register the timer once here;
# placing it inside the trial loop would add a duplicate timer per trial.
vizact.ontimer(0, face_tracker_data_htc.UpdateAvatarFace)

def sightLabExperiment():
    yield viztask.waitEvent(EXPERIMENT_START)
    while True:
        yield sightlab.startTrial(
            startTrialText="Make facial expressions.\n\nPress Trigger to Start"
        )
        yield viztask.waitKeyDown(' ')
        yield sightlab.endTrial()

viztask.schedule(sightlab.runExperiment)
viztask.schedule(sightLabExperiment)
viz.callback(viz.getEventID('ResetPosition'), sightlab.resetViewPoint)

What This Does

  • Activates face tracking
  • Streams real-time expression values
  • Saves data automatically to the /data folder
  • Logs values per trial



5️⃣ What Facial Data Is Recorded?

The HTC Vive Face Tracker provides two groups of parameters:




👁 Eye Expressions (Upper Face)

  • EYE_LEFT_BLINK
  • EYE_RIGHT_BLINK
  • EYE_LEFT_WIDE
  • EYE_RIGHT_WIDE
  • EYE_LEFT_SQUEEZE
  • EYE_RIGHT_SQUEEZE
  • ...




👄 Lower Face Expressions (With Add-On Installed)

  • LIP_JAW_OPEN
  • LIP_JAW_FORWARD
  • LIP_MOUTH_SMILE_LEFT
  • LIP_MOUTH_SMILE_RIGHT
  • LIP_CHEEK_PUFF_LEFT
  • LIP_CHEEK_PUFF_RIGHT
  • LIP_TONGUE_UP
  • LIP_TONGUE_DOWN
  • ...

Each value is typically normalized between 0 and 1.

These values are:

  • Logged per frame
  • Timestamped
  • Saved to trial_data.csv
  • Usable for replay visualization



6️⃣ Built-In Face Tracking Example Templates

From the ExampleScripts folder:

📌 Mirror Demo

Maps your facial expressions to an avatar.

📌 FaceTracker_Sliders

Expression values drive real-time GUI sliders.

📌 Facial_Expressions_Over_Time

Plots expression values using matplotlib.

📌 Face_Tracking_Saving_HTC

Saves facial tracking values per trial.




7️⃣ How Face Tracking Is Used in Research

Here’s how researchers commonly use Vive Focus Vision face tracking in SightLab:

🧠 Emotional Response Studies

Measure:

  • Smile intensity
  • Brow furrowing
  • Jaw tension
  • Cheek puff

During:

  • Stimulus exposure
  • VR therapy
  • Marketing simulations

🎓 Educational VR

Track:

  • Engagement via eye widening
  • Confusion via brow lowering
  • Reaction to feedback

Combine with:

  • Gaze tracking
  • Dwell time
  • Performance metrics

🤖 AI Agent Interaction

Facial data can enhance AI avatar systems (see the AI Agent docs):

  • Mirror participant emotion
  • Adapt dialogue tone
  • Trigger contextual responses

🧪 Cognitive Load Experiments

Combine:

  • Eye openness
  • Blink rate
  • Jaw clench
  • Fixation metrics

For multimodal stress analysis.
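A blink rate, for instance, can be derived from the per-frame EYE_LEFT_BLINK weights by counting upward threshold crossings. The 0.5 threshold below is an illustrative choice, not a SightLab default:

```python
def blink_count(blink_values, threshold=0.5):
    """Count blinks in a per-frame sequence of blink weights (0-1).

    A blink is counted on each upward crossing of the threshold.
    Divide the result by trial duration to get a blink rate.
    """
    blinks = 0
    above = False
    for v in blink_values:
        if v >= threshold and not above:
            blinks += 1
            above = True
        elif v < threshold:
            above = False
    return blinks

# Two blinks in a short synthetic trace of EYE_LEFT_BLINK samples:
trace = [0.0, 0.1, 0.8, 0.9, 0.2, 0.0, 0.7, 0.6, 0.1]
print(blink_count(trace))  # 2
```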




8️⃣ Data Storage & Replay

Face tracking values are:

  • Saved in /data
  • Included in trial replay
  • Synchronizable with:
    • Eye tracking
    • Head position
    • Hand tracking
    • Grab events
    • ROI viewing

You can replay sessions using the built-in replay tools.
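Because each stream is timestamped, face samples can be aligned with other modalities (e.g., a gaze event) by nearest timestamp. This is a generic nearest-neighbour lookup sketch, not a SightLab API:

```python
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t.

    Assumes timestamps are sorted ascending. Generic helper for
    aligning two timestamped streams; not part of SightLab.
    """
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

# Align a gaze event at t = 0.029 s with the closest face sample:
face_t = [0.00, 0.02, 0.04, 0.06]
smile = [0.10, 0.20, 0.60, 0.40]
print(nearest_sample(face_t, smile, 0.029))  # 0.2
```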




9️⃣ Best Practices for Reliable Data

✔ Use Wired Connection

Reduces dropped frames.

✔ Confirm OpenXR is Registered

Must be enabled in Vive Business Streaming.

✔ Use Test Script First

Validate tracking before running full experiment.

✔ Use GUI + Code Hybrid Workflow

Configure trials in GUI, extend with Python.




🔟 Summary

To use the Vive Focus Vision Face Tracker with SightLab:

  1. Physically attach face tracker module
  2. Enable face tracking in headset
  3. Install SteamVR + Vive Business Streaming
  4. Register OpenXR
  5. Select Vive Focus Vision in SightLab
  6. Import face_tracker_data_htc
  7. Call UpdateAvatarFace() on timer
  8. Run experiment — data saves automatically

You now have full multimodal behavioral data collection:

  • 👁 Eye tracking
  • 👄 Facial expression tracking
  • 🖐 Hand tracking
  • 📍 Head position
  • 🎯 Gaze object interaction

All synchronized inside SightLab.

To see how you can use face tracking in your research with SightLab and Vizard, contact sales@worldviz.com

To request a demo of SightLab, visit https://help.worldviz.com/sightlab/
