How to Connect the HTC Face Tracker for Vive Focus Vision and Focus 3 to SightLab VR Pro and Vizard

May 17, 2025

Face tracking opens up powerful new experimental paradigms in VR—especially in studies of emotion, affective computing, and social presence. This guide explains how to integrate the HTC Face Tracker with Vizard and SightLab VR Pro using the Vive Focus Vision and Focus 3 headsets.

For details on how to connect the Vive Focus Vision or Vive Focus 3, see these additional tutorials:

How to Connect Vive Focus Vision to SightLab VR Pro and Vizard

How to Connect Vive Focus 3 to SightLab VR Pro and Vizard


Step 1: Hardware Requirements

You will need:

  • HTC Vive Focus 3 or Focus Vision headset

  • HTC Face Tracker (compatible add-on)

  • PC with Vizard + SightLab VR Pro installed

  • Latest Vive Business Streaming (VBS) software

  • OpenXR runtime properly configured

💡 Ensure the headset firmware is up to date and the Face Tracker is securely attached.

Step 2: Setup in SteamVR/OpenXR

  1. Launch Vive Business Streaming and ensure the headset is recognized.

  2. Set SteamVR as the active OpenXR runtime (a quick verification sketch follows this list).

  3. Connect your headset via USB or Wi-Fi streaming.
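
If you want to confirm item 2 from your Python environment, here is a minimal sketch (assuming Windows and the standard OpenXR loader registry key; the 'steamvr' substring test is only a heuristic) that prints the currently active runtime:

# Optional sanity check (Windows only): print the active OpenXR runtime.
# The OpenXR loader stores the active runtime path under this registry key;
# it should point at SteamVR's runtime JSON once Step 2 is completed.
import winreg

key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Khronos\OpenXR\1")
runtime_path, _ = winreg.QueryValueEx(key, "ActiveRuntime")
winreg.CloseKey(key)

print("Active OpenXR runtime:", runtime_path)
if "steamvr" not in runtime_path.lower():
    print("Warning: SteamVR does not appear to be the active OpenXR runtime.")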


Step 3: Use the Correct Vizconnect Profile in SightLab VR Pro

In your Python script (or in the GUI's hardware dropdown), select:

'Vive Focus Vision' 

This profile maps to the correct vizconnect_config_focus_vision_openxr.py file and ensures compatibility with the HTC Face Tracker.
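
SightLab loads this configuration for you when the profile is selected, so no extra code is needed there. If you are working in a plain Vizard script instead, a minimal sketch using Vizard's vizconnect module might look like this (the file path is an example and may differ in your installation):

# Minimal plain-Vizard sketch: launch the Focus Vision OpenXR vizconnect
# configuration directly. With SightLab, selecting the 'Vive Focus Vision'
# profile handles this step automatically.
import vizconnect

# Example path -- point this at the configuration file in your installation.
vizconnect.go('vizconnect_config_focus_vision_openxr.py')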


Step 4: Enable HTC Face Tracking in Your Script

Import the dedicated HTC face tracking module from SightLab:

from sightlab_utils import face_tracker_data_htc

Set up tracking and begin logging:

# Initialize the HTC face tracker
face_tracker_data_htc.setup()
# Update the avatar's facial blend shapes every frame (a rate of 0 fires each frame)
vizact.ontimer(0, face_tracker_data_htc.UpdateAvatarFace)

Step 5: Sample Experiment Code

More information on scripting SightLab experiments can be found in the SightLab VR Pro documentation.

import sightlab_utils.sightlab as sl
from sightlab_utils.settings import *
from sightlab_utils import face_tracker_data_htc
import viztask, vizact

sightlab = sl.SightLab()
face_tracker_data_htc.setup()

def sightLabExperiment():
    yield viztask.waitEvent(EXPERIMENT_START)
    for i in range(sightlab.getTrialCount()):
        # Begin facial tracking updates (runs every frame)
        vizact.ontimer(0, face_tracker_data_htc.UpdateAvatarFace)
        yield sightlab.startTrial(startTrialText="Face tracking active. Press trigger to begin.")
        # Wait for the spacebar press that ends the trial
        yield viztask.waitKeyDown(" ")
        yield sightlab.endTrial()

viztask.schedule(sightlab.runExperiment)
viztask.schedule(sightLabExperiment)
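
Note that in this sketch the vizact.ontimer call runs inside the trial loop, so a new every-frame timer is registered on each trial. If you simply want the avatar's face updated continuously throughout the session, the call can instead be placed once, right after face_tracker_data_htc.setup(), as shown in Step 4.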


Step 6: Where the Data Goes

All facial expression and tracking data will be saved automatically to:

/data/[timestamp]/face_tracking_data.csv

This includes:

  • Blend shape coefficients

  • Expression data over time

  • Trial timestamps
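
For a quick post-hoc look at a session's file, it can be loaded with standard Python tooling. The sketch below assumes hypothetical column names ('timestamp', 'Jaw_Open'); check the header row of your own CSV for the actual names.

# Post-hoc analysis sketch: load one session's face tracking CSV and plot a
# single blend shape over time. 'timestamp' and 'Jaw_Open' are placeholder
# column names -- substitute the real headers from your file.
import pandas as pd
import matplotlib.pyplot as plt

csv_path = 'data/2025-05-17_12-00-00/face_tracking_data.csv'  # replace with your session folder
df = pd.read_csv(csv_path)

plt.plot(df['timestamp'], df['Jaw_Open'])
plt.xlabel('Time')
plt.ylabel('Blend shape coefficient')
plt.title('Jaw_Open across the session')
plt.show()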


SightLab VR Pro Templates and Examples

Try the face tracking examples in the ExampleScripts folder of your SightLab VR Pro installation.

✅ Troubleshooting Tips

  • If face tracking isn't active:


    • Double-check the OpenXR runtime (should be SteamVR).

    • Make sure Vive Business Streaming (VBS) and SteamVR recognize the Face Tracker.

    • Confirm your script imports the HTC-specific module (face_tracker_data_htc) rather than the generic face_tracker_data; a quick check is sketched below.
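
That last check can also be done programmatically. The sketch below relies only on the module name used earlier in this guide:

# Sanity check: verify that the HTC-specific face tracking module is the one
# your script imported (rather than the generic face_tracker_data module).
import sys

if 'sightlab_utils.face_tracker_data_htc' in sys.modules:
    print('HTC face tracking module is loaded.')
else:
    print('Warning: face_tracker_data_htc has not been imported; '
          'the generic face tracker module may be in use instead.')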

You’re now set up to collect high-fidelity facial expression data in VR using the HTC Face Tracker and SightLab VR Pro. Combine this with gaze tracking, Biopac events, and behavioral markers for even deeper experimental insights.

For more information on generating VR experiments with face tracking see here:

https://www.worldviz.com/virtual-reality-experiment-generator-for-research  

For information on any WorldViz products, contact sales@worldviz.com.
