Over 100 Ready-to-Use Experiments and Templates for VR Research

September 3, 2025

The SightLab Example Scripts Library is a collection of ready-to-use templates and demonstrations bundled with SightLab VR Pro. It’s designed to help researchers, developers, and educators rapidly prototype, test, and extend VR experiments without starting from scratch.

👉 The full library overview is available online here: SightLab Example Scripts Library.

See also this video of some of the Example Scripts in action.

This library covers a wide range of use cases, from basic trial setup to advanced AI-driven agents, making it a valuable resource whether you’re building standard psychology experiments, immersive training modules, or interactive educational sessions.

Why Use the Example Scripts?

  • Jumpstart Experiment Design – Avoid reinventing the wheel by leveraging working templates.
  • Learn by Example – Scripts illustrate best practices for event handling, trial control, data logging, and visualization.
  • Extend Easily – Each script can be customized with your own 3D models, 360° media, sensors, or integrations.
  • Cover Advanced Use Cases – Beyond simple trials, scripts demonstrate adaptive learning, real-time AI agents, multi-user setups, physiological biofeedback, adding eye tracking to external apps, and more.

Categories of Example Scripts

The library spans dozens of categories. Here are some highlights:

Core Interaction

  • Gaze Based Interactions – Objects that react to user attention.
  • Grab Events – Track and log when objects are picked up or released.
  • 3D Tablet / VR Menu – Customizable 3D interface for scene interaction.
  • Hand Tracking & Full Body Avatars – Physics-based hand interactions, finger data, or tracked avatars.
  • Drawing in 3D – Freehand 3D drawing tools.
  • Proximity Sensors – Create location-based interaction zones.
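Proximity sensors in a VR scene boil down to distance checks that fire enter/exit events when a tracked position crosses a radius. SightLab's own implementation is not shown here; the following is a minimal, library-agnostic sketch in plain Python (the `ProximityZone` class and its callback names are illustrative assumptions, not SightLab API):

```python
import math

class ProximityZone:
    """Fires enter/exit callbacks when a tracked position crosses a radius.

    Illustrative sketch only -- not the SightLab API.
    """

    def __init__(self, center, radius, on_enter=None, on_exit=None):
        self.center = center
        self.radius = radius
        self.on_enter = on_enter
        self.on_exit = on_exit
        self.inside = False

    def update(self, position):
        # Euclidean distance between the tracked point and the zone center.
        dist = math.dist(position, self.center)
        if dist <= self.radius and not self.inside:
            self.inside = True
            if self.on_enter:
                self.on_enter(dist)
        elif dist > self.radius and self.inside:
            self.inside = False
            if self.on_exit:
                self.on_exit(dist)
        return self.inside

# Simulate a user walking through a 1 m zone centered at (0, 0, 2).
events = []
zone = ProximityZone(center=(0.0, 0.0, 2.0), radius=1.0,
                     on_enter=lambda d: events.append("enter"),
                     on_exit=lambda d: events.append("exit"))
for pos in [(0, 0, 5.0), (0, 0, 2.5), (0, 0, 1.8), (0, 0, 4.0)]:
    zone.update(pos)
# events is now ["enter", "exit"]
```

In practice, `update` would be called each frame with the headset or controller position, and the callbacks would trigger scene changes or log timestamped events.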

Data Collection & Visualization

  • External Application Data Recorder – Add and record eye tracking + physiology to external VR apps such as Unity, Unreal, or PCVR games.
  • Heat Maps (Aggregated Data) – Aggregate gaze and fixation maps.
  • Barchart & Image Viewer – Generate charts, boxplots, and automate image sequences.
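Conceptually, aggregated heat maps like those above come from binning many users' gaze samples into a 2D grid of counts. As a rough, library-agnostic illustration (not SightLab's actual pipeline; the function name and normalized-coordinate convention are assumptions), the core aggregation step looks like this:

```python
def gaze_heatmap(samples, width, height, bins_x, bins_y):
    """Bin (x, y) gaze samples into a grid of fixation counts.

    Illustrative sketch only -- not the SightLab implementation.
    """
    grid = [[0] * bins_x for _ in range(bins_y)]
    for x, y in samples:
        # Clamp so samples on the far edge land in the last bin.
        col = min(int(x / width * bins_x), bins_x - 1)
        row = min(int(y / height * bins_y), bins_y - 1)
        grid[row][col] += 1
    return grid

# Four normalized gaze samples binned into a 2x2 grid.
samples = [(0.1, 0.1), (0.15, 0.12), (0.9, 0.9), (1.0, 1.0)]
grid = gaze_heatmap(samples, width=1.0, height=1.0, bins_x=2, bins_y=2)
# grid[0][0] == 2 (two samples top-left), grid[1][1] == 2 (bottom-right)
```

A real pipeline would collect thousands of samples per participant, sum the grids across participants, and render the result as a color overlay on the stimulus.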

Environment & Content

  • Virtual 2D Screens & Screencasting – Embed video, slideshows, or desktop feeds inside VR.
  • Mixed Reality / Augmented Reality – Blend real and virtual environments.
  • Moving ROIs in 360° Media – Track dynamic regions of interest in immersive video.

Adaptive & AI-Powered

  • AI Agents – Add conversational NPCs powered by GPT, Claude, Gemini, or offline LLMs.
  • Adaptive Learning / Staircase – Implement adaptive psychophysics or training paradigms.
  • AI Model & Image Spawners – Generate 3D models or images in real-time inside VR.
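The staircase method mentioned above is a standard adaptive psychophysics rule: stimulus difficulty rises after consecutive correct responses and falls after an error. As a plain-Python illustration (independent of SightLab's actual API; the `Staircase` class and parameter names are assumptions), here is a minimal one-up, two-down staircase, which converges near the 70.7%-correct threshold:

```python
class Staircase:
    """One-up, two-down adaptive staircase (converges near 70.7% correct).

    Illustrative sketch only -- not the SightLab implementation.
    """

    def __init__(self, start, step, floor=0.0):
        self.level = start            # current stimulus intensity
        self.step = step
        self.floor = floor
        self.correct_streak = 0
        self.reversals = []           # levels where direction flipped
        self._last_direction = 0      # -1 = getting harder, +1 = easier

    def record(self, correct):
        """Update the level from one trial outcome; return the new level."""
        if correct:
            self.correct_streak += 1
            if self.correct_streak >= 2:   # two in a row -> harder
                self.correct_streak = 0
                self._move(-1)
        else:                              # any miss -> easier
            self.correct_streak = 0
            self._move(+1)
        return self.level

    def _move(self, direction):
        if self._last_direction and direction != self._last_direction:
            self.reversals.append(self.level)
        self._last_direction = direction
        self.level = max(self.floor, self.level + direction * self.step)

# Simulated run: four correct, one miss, two correct.
stair = Staircase(start=10.0, step=1.0)
for outcome in [True, True, True, True, False, True, True]:
    stair.record(outcome)
# stair.level == 8.0, stair.reversals == [8.0, 9.0]
```

The threshold estimate is typically the mean of the last several reversal levels; a real experiment would also stop after a fixed reversal count.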

Sensor Integration

  • Biofeedback – Integrate heart rate, cognitive load, or pupil diameter.
  • Eye Tracker Tests – Benchmark hardware performance.
  • Lab Streaming Layer – Synchronize with EEG, physiology, and neuro devices.
  • Face, Hand, and Foot Tracking – Capture and analyze detailed user input streams.
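To give a concrete sense of the biofeedback metrics above: heart rate and heart-rate variability are both simple functions of the inter-beat intervals a sensor streams in. This sketch (plain Python, not SightLab or Biopac code; function names are assumptions) computes mean BPM and the common RMSSD variability metric:

```python
def mean_heart_rate(ibi_ms):
    """Mean heart rate in BPM from inter-beat intervals in milliseconds."""
    if not ibi_ms:
        raise ValueError("need at least one inter-beat interval")
    return 60000.0 / (sum(ibi_ms) / len(ibi_ms))

def rmssd(ibi_ms):
    """RMSSD: root mean square of successive inter-beat differences."""
    diffs = [(b - a) ** 2 for a, b in zip(ibi_ms, ibi_ms[1:])]
    return (sum(diffs) / len(diffs)) ** 0.5

# Four beats roughly 0.8 s apart -> mean HR just under 75 BPM.
beats = [800, 820, 790, 810]
bpm = mean_heart_rate(beats)
```

In a biofeedback scene, values like these would be recomputed over a sliding window and mapped onto scene parameters (e.g., dimming a stressor when variability rises).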

Experimental Control

  • Adding Instructions – Display instructions at trial start/end.
  • Adding a Rating Scale / Surveys – Collect subjective ratings or demographics.
  • Driving Simulations – Templates for seated or full-driving experiments with gaze and physiology tracking.

Experiment Templates (Turnkey Experiments)

SightLab also includes a set of ready-to-run experiment templates, designed to replicate common paradigms in psychology, HCI, and training research:

  • Choice Comparison – Compare object-based decisions with multiple input methods.
  • Distance Perception – Assess depth perception accuracy.
  • Memory Task – Spatial memory example.
  • Phobia Simulations – Elevator, Public Speaking, Airplane, Arachnophobia (with Biopac integration).
  • Pigeon Hunt – Sound localization task.
  • Reaction Time – Baseline response time measurement.
  • Reaction Time (Shoot/Don’t Shoot) – Accuracy in threat identification.
  • Scripted Avatar Agent – Avatars that read scripts and collect data.
  • Spatial Scanning – Scanning and recognition task.
  • Visual Search Variants and Templates – Object Size, Shopping, conjunction vs. feature search, and more.
  • Walk the Plank – Height exposure with Biopac and fixation capture.

Additional Demos

Beyond academic paradigms, SightLab ships with demo scenarios for training, simulation, and visualization:

  • Driving 360 – Driving demo with 360° video and 3D overlays.
  • Box Stacker – Physics-based stacking demo.
  • Virtual Cockpit – Aviation cockpit training.
  • Apartment/Building Review – Walkthrough for real estate/architecture.
  • Digital Twin / Data Visualization – City-scale mapping with real-time data.
  • Fire Safety Training – Emergency response demo.
  • Head Anatomy – Interactive anatomy model.
  • Helicopter Rotor Disassembly – Mechanical disassembly demo.
  • Additional Tutorials – Links into the Vizard documentation.

SightLab VR Presentation Tool

A standout template, the SightLab VR Presentation Tool is designed for VR- and AR-based presentations, education, and training. It provides:

  • Drag-and-Drop GUI Interface for quick session setup.
  • Multi-User Functionality for collaborative classrooms.
  • AI Agent Support for adaptive learning and tutoring.
  • Instructor-Led Sessions with live guidance tools.
  • Content Management for presentations and structured training modules.
  • Data Collection and Visualization – Save data from user interactions, eye tracking, physiology, and more.

Customization Workflow

Each example script can be extended and customized:

  1. Select a Template – Copy it into your project folder.
  2. Add Your Assets – Replace environments or models with your own.
  3. Add Data Collection – Enable gaze, grab, proximity, or survey events.
  4. Extend with Python Libraries – Leverage external Python modules for added analytics or integrations.
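To make the workflow above concrete, the skeleton of most templates is a trial loop that presents a condition, captures a response, and logs a timestamped row per trial. This is a generic plain-Python sketch, not SightLab's actual code (the `run_trials` function, the `respond` callback, and the column names are all illustrative assumptions):

```python
import csv
import io
import time

def run_trials(conditions, respond, writer):
    """Minimal trial loop: present each condition, log response and RT.

    Illustrative sketch only -- in SightLab the response would come from
    gaze, grab, or survey events rather than a plain callback.
    """
    writer.writerow(["trial", "condition", "response", "rt_ms"])
    for i, condition in enumerate(conditions, start=1):
        t0 = time.perf_counter()
        response = respond(condition)      # stand-in for the VR interaction
        rt_ms = (time.perf_counter() - t0) * 1000.0
        writer.writerow([i, condition, response, round(rt_ms, 1)])

# Write two trials to an in-memory CSV buffer (a real study writes a file).
buffer = io.StringIO()
run_trials(["left", "right"],
           respond=lambda c: c.upper(),
           writer=csv.writer(buffer))
```

Step 4 of the workflow is then just a matter of importing extra Python modules inside the same loop, e.g. to stream each row to an analytics pipeline as it is written.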

The SightLab Example Scripts Library is more than a set of templates: it’s a launchpad for experimentation. Whether you’re running simple gaze-based studies or complex AI-driven adaptive learning simulations, the library gives you tested, modular building blocks to accelerate your workflow. By combining these with SightLab’s GUI editor or Python scripting interface, you can quickly design, run, and extend cutting-edge VR studies.
