How to Screencast Your Desktop into VR with SightLab VR Pro

July 19, 2025

Run Any Desktop Application in Immersive VR with Eye Tracking



SightLab VR Pro enables you to capture and display real-time content from any application or browser window directly inside your VR/AR environment. This is a powerful feature for research, teaching, and mixed reality presentations, especially if you want to run standard desktop experiments (such as PsychoPy) in a virtual setting while still leveraging VR and eye tracking.

Watch this example video on screencasting in SightLab VR Pro.


What Can You Do With Virtual Screencasting?

  • Display live apps (PsychoPy, E-Prime, OpenSesame, PowerPoint, browsers) inside your VR/AR scene
  • Run standard visual, cognitive, or survey tasks in VR—no rewriting required
  • Record sessions for replay and analysis (with heatmaps, fixations, gaze overlays)
  • Capture and interact with any window or the whole desktop
  • Scale, reposition, and rotate the virtual screen in VR
  • Enable pass-through for mixed reality setups
  • Record video automatically for each trial (AVI/MP4)
  • Project smartphone screens into VR for mobile-related experiments
  • Project multiple screens from your desktop into an immersive space
  • Share collaborative spaces and view content together in Multi-User

Quick Start: Setup Requirements

Python Libraries (install via Vizard IDE > Tools > Package Manager):

pip install opencv-python pillow pywin32
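
The pip package names differ from the import names; a quick sanity check from the Vizard Python prompt (just a convenience, not part of SightLab) confirms the installs worked:

import cv2       # installed as opencv-python
import PIL       # installed as pillow
import win32gui  # installed as pywin32
print(cv2.__version__, PIL.__version__)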

If casting from a browser:
Disable hardware acceleration for best compatibility (especially with Chrome):

  • Open chrome://settings
  • Scroll to “System”
  • Toggle off “Use hardware acceleration when available”
  • Restart Chrome


How To: Virtual Screencasting in SightLab

1. Select the Window to Capture

You can select the window to capture interactively, choosing from a dropdown of all open applications, or specify it directly by title:

screen_capture = WindowCapture(ask_user=True, flip_code=1, record_screen=True)

Or to select a specific window:

screen_capture = WindowCapture(window_title="PsychoPy", flip_code=1, record_screen=True)
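
If you are unsure of the exact title to pass to window_title, a small pywin32 helper (a convenience sketch, not part of the SightLab API) can print the titles of all visible top-level windows:

import win32gui

def list_window_titles():
    # Collect the title of every visible top-level window
    titles = []
    def handler(hwnd, results):
        if win32gui.IsWindowVisible(hwnd):
            title = win32gui.GetWindowText(hwnd)
            if title:
                results.append(title)
        return True  # continue enumeration
    win32gui.EnumWindows(handler, titles)
    return titles

for title in list_window_titles():
    print(title)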

2. Add a Virtual Screen & Link the Texture 

The screen object is already included in the example environment:

screen = viz.add('video_screen_object/3D_Video_Screen.osgb')

screen.texture(screen_capture.get_texture())

You can also use a custom 3D object (e.g., monitor, phone) for display.
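
Putting steps 1 and 2 together, a minimal script might look like the sketch below. Note that the import for WindowCapture is a placeholder; use the actual import statement shown in the Virtual_Screencast example script in your install.

import viz
# Placeholder import – copy the actual one from the Virtual_Screencast example
from window_capture import WindowCapture

viz.go()

# Capture a specific window and stream it to a live texture
screen_capture = WindowCapture(window_title="PsychoPy", flip_code=1, record_screen=True)

# Add the virtual screen model from the example environment and apply the texture
screen = viz.add('video_screen_object/3D_Video_Screen.osgb')
screen.texture(screen_capture.get_texture())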

3. (Optional) Enable Pass-through for XR

passthrough.setEnabled(True)

viz.clearcolor(viz.BLACK, 0.0)

4. Control the Screen’s Size and Position

Best method: Open the screen object in the 3D model Inspector and scale as desired, or use keyboard controls in the script:

vizact.onkeydown('m', scaleVideoUp)

vizact.onkeydown('n', scaleVideoDown)
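
The scaleVideoUp and scaleVideoDown handlers are not defined in the snippet above; one simple way to implement them (a sketch, assuming the screen node created in step 2) is to multiply the node's scale on each key press:

import vizact

SCALE_STEP = 1.1  # grow/shrink by 10% per key press

def scaleVideoUp():
    # Enlarge the virtual screen uniformly
    sx, sy, sz = screen.getScale()
    screen.setScale([sx * SCALE_STEP, sy * SCALE_STEP, sz * SCALE_STEP])

def scaleVideoDown():
    # Shrink the virtual screen uniformly
    sx, sy, sz = screen.getScale()
    screen.setScale([sx / SCALE_STEP, sy / SCALE_STEP, sz / SCALE_STEP])

vizact.onkeydown('m', scaleVideoUp)
vizact.onkeydown('n', scaleVideoDown)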

5. Recording Per Trial

If record_screen=True, SightLab automatically records video for each trial:

screen_capture.start_recording()

screen_capture.stop_recording()

  • Files are saved as <timestamp>_<participantID>_experiment_data_trial_<trialNumber>.mp4
  • This is useful for replay: heatmaps and fixations can be overlaid on the content that was viewed
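
If you prefer to control the recorder yourself rather than rely on the automatic per-trial recording, a minimal sketch using Vizard's viztask module (the 30-second wait is only a placeholder for your actual trial logic) looks like this:

import viztask

def run_trial(trial_number):
    # Start capturing the casted window for this trial
    screen_capture.start_recording()

    # Present the task in the casted application; a fixed wait stands in for real trial logic
    yield viztask.waitTime(30)

    # Stop recording and save the per-trial video file
    screen_capture.stop_recording()
    print('Finished trial', trial_number)

viztask.schedule(run_trial(1))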


Example: Casting a Mobile Device Screen


Want to include mobile phone tasks? Mirror your phone screen and capture it into VR:

Android: Use scrcpy (enable USB debugging on the phone, then run scrcpy on the PC)
screen_capture = WindowCapture(window_title="scrcpy", flip_code=1, record_screen=True)

iPhone: Use LonelyScreen for AirPlay mirroring
screen_capture = WindowCapture(window_title="LonelyScreen", flip_code=1, record_screen=True)

Attach the video texture to a phone-shaped model for immersion:
phone_model = vizfx.addChild('Resources/Virtual_Screencast/smartphone.osgb')

phone_model.texture(screen_capture.get_texture())

phone_model.setPosition([0, 1.5, 1.2])
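
If the mirrored image looks stretched or faces away from the viewer, standard Vizard node transforms can be used to adjust the phone model; the values below are illustrative, not taken from the example script:

# Turn the phone model to face the user and roughly match a portrait aspect ratio
phone_model.setEuler([180, 0, 0])
phone_model.setScale([0.9, 1.0, 1.0])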


Replay and Analysis

All screencast sessions are automatically recorded and can be reviewed with SightLab’s replay tools:

  • Heatmaps
  • Scan paths
  • Gaze point overlays
  • Fixation spheres

This makes it easy to visualize and analyze how participants interact with content—whether it’s a video, web page, or experiment interface.


Use Cases

  • Run PsychoPy, E-Prime, OpenSesame, or other legacy experiments in VR—collect gaze/behavioral data without changing your code
  • Present slides, videos, or live content to users in an immersive environment
  • Conduct VR-based surveys or focus groups with any existing survey software
  • Simulate real-world device use (phones, tablets) in VR


Tips & Best Practices

  • Adjust and position the screen object for optimal ergonomics in VR
  • For multi-user sessions, use a shared screen via apps like Zoom or Google Meet
  • For mixed reality, enable pass-through for real-world context
  • All recordings are auto-saved and accessible via SightLab’s replay system
  • Casting the full desktop can slow the frame rate if your desktop resolution is high
  • For better performance, cast a specific window that is resized to smaller than your desktop


Example Script Location

  • Look for ExampleScripts/Virtual_Screencast/ and the Multi User ExampleScripts/Virtual_Screencast/ folder for ready-to-use code templates.
    • Examples include single screen, multiple screens, phone casting, and more


With SightLab’s screencasting, you can bridge traditional behavioral experiments with the power of VR, enabling advanced gaze analytics, realistic immersion, and seamless integration of your existing tools.

For more information about incorporating WorldViz VR software tools such as SightLab into your research, or about setting up full VR labs and consultation, contact sales@worldviz.com

To request a demo of SightLab, click here.
