How to Use the External Application Data Recorder in SightLab VR Pro
April 23, 2025
Watch this video to see the External Application Data Recorder in action.
Unlock advanced data recording for any VR application, including SteamVR games, Oculus apps, and Unity and Unreal VR experiences (with limited support for standalone devices), even when they aren't built with SightLab VR Pro.
The External Application Data Recorder bridges the gap, enabling synchronized gaze, head movement, physiological data, and more, all while using external VR content.
It also works with screen-based eye trackers (Tobii and EyeLogic).
For more information see the official documentation page here.
What is the External Application Data Recorder?
The External Application Data Recorder in SightLab VR Pro lets you capture detailed data while running external VR applications. This includes:
SteamVR games
Meta/Oculus apps
Web-based VR experiences
Android standalone apps (without eye tracking)
Unity and Unreal Applications
Standard Video Games/Applications
You can synchronize sensor data and even replay the session with a visual gaze overlay, making it ideal for researchers who need data from applications outside the SightLab environment.
Supported Hardware
The External Recorder supports a wide range of hardware:
Vive Focus Vision
Meta Quest Pro
Vive Focus 3
Varjo XR-3 / XR-4
HP Omnicept
Vive Pro Eye
Meta Quest 3 / 3S / 2 (head tracking only, no eye tracking)
EyeLogic One
Tobii Fusion
Additionally, it works in generic SteamVR and OpenXR headset modes. (Performance may vary depending on device.)
What Data Can You Record?
Here’s a glimpse of the data collected:
Eye Gaze Position & Rotation (combined and per eye)
Fixations, Saccades & Dwell Time
View Counts (Experimental AI Analysis)
Head Orientation
Pupil Diameter (Varjo, Omnicept, Vive Pro Eye, Pupil Labs)
Eye Openness Values
Heart Rate & Cognitive Load (Omnicept)
Facial Expressions (Meta Quest Pro)
Custom Event Triggers
Biopac AcqKnowledge Physiological Data
Video Recording
Timestamps & Synchronization Data
How to Set It Up
1. Install Required Software and Python Libraries
If you want to record video alongside the data, make sure these libraries are installed:
pyautogui
pywin32
pygetwindow
opencv-python
mss
numpy (already included with SightLab)
tkinter (install via Vizard)
You can install them via the Vizard Package Manager.
Additionally, you will need the K-Lite Codec Pack installed (you can get it from here).
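Before launching the recorder, you can sanity-check that the libraries above are importable. This is a small helper sketch, not part of SightLab; note that some packages import under a different name than they install under (pywin32 imports as win32api, opencv-python as cv2):

```python
def check_imports(module_names):
    """Return the modules from module_names that fail to import."""
    missing = []
    for name in module_names:
        try:
            __import__(name)
        except ImportError:
            missing.append(name)
    return missing

# Import names for the packages listed above (pywin32 -> win32api,
# opencv-python -> cv2).
required = ["pyautogui", "win32api", "pygetwindow", "cv2", "mss", "numpy", "tkinter"]
print(check_imports(required))  # an empty list means everything is installed
```

Run this inside Vizard's Python environment so it checks the same interpreter the recorder will use.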
2. Verify Eye Tracking Connection
In the External Application Data Recorder folder, run eyeTrackerTest.py to verify the eye tracking driver connection. It only needs to run for a few seconds, but it confirms that the driver (for example, the SRAnipal driver for the Vive Focus) is connected.
3. Launch Your VR Application
Open the external VR app you want to record data from. This could be a SteamVR game, web browser VR experience, or standalone app.
4. Configure the Recorder
Edit the Data_Recorder_Config.py file to enable or disable:
Biopac Integration
Face Tracking
Network events
And more
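As a rough illustration, the config file exposes boolean-style toggles along these lines. The flag names below are assumptions for the sketch, not SightLab's actual identifiers; check your copy of Data_Recorder_Config.py for the real ones:

```python
# Hypothetical sketch of the toggles in Data_Recorder_Config.py.
# Flag names here are illustrative only.
USE_BIOPAC = False          # stream physiological data from Biopac AcqKnowledge
USE_FACE_TRACKING = True    # capture facial expressions (e.g., Meta Quest Pro)
USE_NETWORK_EVENTS = False  # log incoming network triggers as session events
```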
5. Start the Recorder
Run the Data_Recorder_External_Application.py script. Follow these steps:
Select your VR application from the dropdown (all open windows will appear).
Choose recording duration.
Select your hardware.
Click inside the Data Recorder window and press SPACEBAR to begin.
Note: For screen-based eye trackers, use the “Screen_Based” folder.
Recording stops automatically when the timer ends (you will hear a beep when it ends), or you can quit early by pressing SPACEBAR again.
6. Replay and Analyze the Data
Run DataRecorderReplay.py to:
Watch the replay with gaze overlay.
Press 1 to start recording the replay with gaze overlay and 2 to stop.
The video will be saved in the replay_recordings folder.
You can also import this video into Biopac AcqKnowledge to align physiological data with gaze behavior.
You can view the raw data in the data folder.
It is good practice to run a quick test: look at a few spots in a scene in your application, then run the replay to verify that the gaze point is accurate. If it seems off, you can fine-tune the gaze point position in Data_Recorder_Config.py (REPLAY_SCREEN_CALIBRATION moves the gaze point in X, Y, Z, while the VIEWPOINT_POSITION and EULER settings move the virtual screen's position within the window).
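For example, the calibration entries might look like the sketch below. Only the setting names come from the documentation above; the list formats and zero defaults are assumptions. Adjust values in small increments and re-run the replay after each change:

```python
# Gaze-point and virtual-screen calibration in Data_Recorder_Config.py
# (value formats are illustrative assumptions).
REPLAY_SCREEN_CALIBRATION = [0.0, 0.0, 0.0]  # shifts the gaze point in X, Y, Z
VIEWPOINT_POSITION = [0.0, 0.0, 0.0]         # moves the virtual screen within the window
VIEWPOINT_EULER = [0.0, 0.0, 0.0]            # rotates the virtual screen
```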
7. AI-Based Auto View Detection (Optional)
If you'd like to experiment with automated object view counts:
After saving your session, replay and record the gaze point video.
Run convert video to images.py to extract individual frames.
Use Auto_AI_View_Detection.py to analyze objects viewed in each frame using OpenAI’s API.
Dwell time is accumulated when an object appears across consecutive frames.
Note: This requires an OpenAI API key, and token usage may add to costs. It's experimental but powerful.
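The consecutive-frame dwell rule can be sketched in plain Python. This is an illustrative reimplementation of the idea, not the code in Auto_AI_View_Detection.py; frame_labels is a per-frame list of detected object names, and frame_interval is the assumed time between extracted frames:

```python
def dwell_counts(frame_labels, frame_interval=1.0):
    """Accumulate dwell time (seconds) for each object that appears
    in consecutive frames of the gaze-overlay video."""
    dwell = {}
    previous = set()
    for labels in frame_labels:
        current = set(labels)
        # An object "dwelled" if it was seen in this frame AND the last one.
        for obj in current & previous:
            dwell[obj] = dwell.get(obj, 0.0) + frame_interval
        previous = current
    return dwell

# Three frames, 0.5 s apart: "cup" spans frames 1-2, "lamp" spans frames 2-3.
print(dwell_counts([["cup"], ["cup", "lamp"], ["lamp"]], frame_interval=0.5))
```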
Additional Extended Features
Rating/Likert Scales & Surveys Easily collect participant feedback. Customize scale labels and capture responses programmatically and in data exports. Ratings must be collected before or after the external session.
Inputs/Demographics Gather participant data (e.g., age, ID, gender) before starting the external session.
Adding a Label/Condition Tag sessions with experimental conditions for sorting and analysis.
Flags, Network Events, Button Clicks Enable logging of custom triggers (e.g., spacebar presses, network signals) during the session for synchronized event tracking.
Speech Recognition (optional) Record microphone input.
Transcriptions Combine mic recordings with post-session transcription tools to create searchable dialogue data.
Instructions Show instructions or display guidance on the mirrored desktop before launching the external app.
Plotly for Additional Data Analysis Replay session data with built-in Plotly tools to visualize data.
Face Tracking and Expression Analysis Automatically capture facial expressions with supported headsets (e.g., Meta Quest Pro) if enabled in the config.
Screen-Based Eye Trackers Use Tobii or EyeLogic screen-based trackers for gaze logging during desktop-based external applications.
Average Physiological Data Biopac integration allows tracking and averaging of heart rate, skin conductance, and cognitive load throughout the session.
Baseline Record a short “resting” or neutral task before launching the external app to establish baseline physiological readings.
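Under the hood, the flag/network-event logging described above amounts to a list of (timestamp, label) pairs relative to session start, which is what makes post-hoc synchronization with the other data streams possible. A minimal sketch of that idea (illustrative only, not SightLab's actual API):

```python
import time

class EventLog:
    """Timestamped custom flags, in the spirit of the recorder's
    flag/network-event logging (illustrative, not SightLab's API)."""

    def __init__(self):
        self.start = time.perf_counter()
        self.events = []

    def flag(self, label):
        """Record a labeled event at the current session time (seconds)."""
        entry = (time.perf_counter() - self.start, label)
        self.events.append(entry)
        return entry

log = EventLog()
log.flag("baseline_start")
log.flag("stimulus_shown")
```

Each entry can later be matched against the recorder's own timestamps to align events with gaze or physiological data.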
Pro Tips & Considerations
Window not Switching: If the recorder fails to bring the other application's window into focus, you may need to press Alt+Tab to switch to it manually. Alternatively, run the application through SteamVR, open “Display VR View” to mirror it, and select that mirror window as the recording target.
Standalone Apps: For casting-based VR sessions (e.g. Meta Quest), you’ll only get head orientation, not eye gaze.
Facial Expression Analysis: You can visualize this data with facial_expressions_over_time.py.
Replay Calibration: If gaze alignment is off, adjust REPLAY_SCREEN_CALIBRATION values in the config.
SteamVR Users: Minimize the SteamVR window so it doesn't appear over your video.
Replay Controls:
Press B/N or C/V to skip through replay.
Dragging the slider does not scrub the video; use the keyboard controls instead.
For the Vive Focus Vision and Focus 3, use the SRAnipal driver (not OpenXR).
View selected data with the provided Plotly visualization script.
Final Thoughts
The External Application Data Recorder makes it possible to conduct robust, synchronized behavioral studies in any VR environment, not just those built in SightLab. Whether you're running a cognitive study in a game, or analyzing gaze patterns in a web-based VR app, this recorder lets you bridge the gap between third-party VR apps and scientific-grade data collection.