Why Academic Researchers Choose Vizard over Unity and Unreal

June 27, 2023

While game engines like Unity and Unreal, with their general-purpose architecture, may look appealing at first glance, VR research depends on capabilities such as precise timing control, built-in data analytics, and a low barrier to entry that only a specialized software platform can offer. Read on for a deeper look at what makes the Vizard VR software development platform unique for academic research.

With over 20 years of experience providing VR solutions to the academic research community, we at WorldViz are deeply committed to supporting and fostering the growth of this vibrant community. Our VR software development platform, Vizard, was designed specifically for academic researchers, by researchers. Its architecture is optimized for precise timing control and broad accessibility, accommodating students with widely varying skill sets, including those with no background in programming.

Recently, feedback from our customers inspired us to revisit what makes Vizard unique. Many of them initially experimented with game engines but came to value the specific features Vizard offers for VR research. Let’s go over these features in this article.

Specialized Tool versus General Purpose

The Vizard VR toolkit and game engines such as Unity and Unreal are both formidable development platforms, but they are designed for different applications and target audiences. Unity and Unreal are versatile game engines known for their adaptability and wide-ranging applications, from indie games to AAA titles, as well as VR/AR experiences. Conversely, Vizard is a specialized VR development platform, acclaimed for its comprehensive set of features catering to researchers in psychology, neuroscience, movement sciences, kinesiology, psychophysics, engineering, construction, and other fields.

A primary distinction between Vizard and Unity lies in their objectives. Unity is multipurpose, intended for crafting a diverse array of 3D and 2D games and simulations. It offers potent tools for constructing intricate environments, developing game mechanics, animating characters, and much more. Although Unity can be utilized for creating psychology experiments, it isn't specifically designed for this purpose, so researchers may find themselves having to develop numerous features from scratch or modify them from existing plugins.

In contrast, Vizard is fine-tuned specifically for creating VR experiences for research, particularly in the realms of psychology, movement sciences, and psychophysics. Its features and tools are devised to ensure precise control, measurement, and reproduction of experimental conditions, which are indispensable in these domains.

Precision Timing Control

Precision timing is essential for a wide range of experiments, particularly in the fields of psychology, movement sciences, and psychophysics. Variations in timing can introduce undesired fluctuations or bias in the results. Vizard, crafted specifically for scientific research, provides precise, millisecond-level timing for all events within an immersive virtual environment. This precision is critical, especially in experiments that involve the rapid presentation of stimuli or require accurate measurement of response times. Our customers frequently tell us how easily and confidently they can maintain such exacting timing control in Vizard, thanks in part to Vizard's transparent and well-structured methods, which contribute to consistent outcomes. They find this high level of precision in timing control, and the accessibility of the relevant data parameters, invaluable compared to alternative solutions where achieving such control is often more cumbersome or uncertain.
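The kind of event timestamping such experiments depend on can be sketched in plain Python with the standard library's `time.perf_counter`. This is illustrative only and does not reproduce Vizard's own timing API, which provides frame-accurate facilities of its own:

```python
import time

class EventLogger:
    """Log experiment events with millisecond-resolution timestamps.

    A plain-Python sketch, not Vizard's API: it only illustrates the
    kind of timestamped event record a timing-sensitive study needs.
    """

    def __init__(self):
        self._t0 = time.perf_counter()   # monotonic, high-resolution clock
        self.events = []

    def log(self, label):
        # Elapsed time since session start, in milliseconds.
        elapsed_ms = (time.perf_counter() - self._t0) * 1000.0
        self.events.append((label, elapsed_ms))
        return elapsed_ms

logger = EventLogger()
t_onset = logger.log("stimulus_onset")
time.sleep(0.05)                          # stand-in for a participant response
t_response = logger.log("response")
reaction_time_ms = t_response - t_onset   # roughly 50 ms plus OS jitter
```

Using a monotonic clock such as `perf_counter`, rather than wall-clock time, is what keeps reaction-time measurements immune to system clock adjustments mid-session.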

Professor Mark Carpenter, School of Kinesiology, University of British Columbia:

My VR research in balance and neural control of human movement requires that we have precise, millisecond-level control of event timing that is constant, both within and across sessions, to ensure we obtain reliable data from the VR stimuli we present. Vizard is the only VR software I know that provides this level of reliable timing control and precision, allowing us to relate VR events to our simultaneous neurophysiological, biomechanical, and perceptual recordings of human performance within the real environment. On top of that, Vizard provides a low entry point for scripting skills to my undergraduate and graduate students, enabling them to quickly design experiments from the many sample scenes that Vizard comes with, and then go through rapid experiment iterations with different hardware configurations without the help of a programmer.

Ease of Use

Vizard has a proven low barrier to entry for creating experiments and collecting data. Our customers have found that even undergraduate students with no scripting background can put a Vizard experiment together in a very short time, leveraging many of Vizard’s and SightLab’s out-of-the-box tools and features, and then collect meaningful data leading to publications. As a Python-based scripting tool, Vizard is instantly familiar to many researchers and also uniquely compatible with AI-assisted programming tools such as ChatGPT. Watch our video "How to Set Up an Experiment in 15 Minutes with Vizard and Python" to get an idea of Vizard’s ease of use and rapid application development features.

Assistant Professor Omer Daglar Tanrikulu, Psychology, University of New Hampshire:

"I purchased Vizard and SightLab a year ago, and gave it to my undergraduate students for their research projects. Not only were they able to quickly create and run VR projects, but they produced meaningful research results."

Prebuilt Modular Example Scripts

Vizard comes with a huge library of example scenes and sample scripts ideal as building blocks for experiments. SightLab VR, a simple yet powerful extension to Vizard, offers a complete experiment structure with integrated data collection for eye tracking and a drag-and-drop menu for configuring VR experiments while retaining access to the scripting level. Take a look at this introduction to SightLab VR and our libraries of dozens of detailed example scripts for Vizard and SightLab.

Experiment Design Tools

Vizard includes a set of tools and libraries specifically designed for creating and managing experimental protocols. This includes features for managing trial sequences, randomizing or counterbalancing conditions, recording participant responses, managing data files, and more. These features can save a significant amount of time compared to building these capabilities from scratch in Unity or Unreal, and they can help ensure your experiment runs smoothly and accurately.
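As a rough, plain-Python sketch of the trial-sequencing and counterbalancing patterns described above (illustrative only; Vizard ships its own experiment-design tools, which are not reproduced here):

```python
import itertools
import random

def balanced_trial_sequence(conditions, repetitions, seed=None):
    """Build a randomized trial list in which every condition appears
    equally often. A seeded RNG makes the sequence reproducible."""
    rng = random.Random(seed)
    trials = list(itertools.chain.from_iterable(
        [conditions] * repetitions))     # each condition x repetitions
    rng.shuffle(trials)                  # randomize presentation order
    return trials

def latin_square(conditions):
    """Simple Latin-square rotation for counterbalancing condition
    order across participants: row i starts at condition i."""
    n = len(conditions)
    return [[conditions[(i + j) % n] for j in range(n)] for i in range(n)]

trials = balanced_trial_sequence(["near", "mid", "far"], repetitions=4, seed=42)
orders = latin_square(["A", "B", "C"])
```

Seeding the shuffle is the detail worth noting: it lets you regenerate the exact same trial order for a given participant, which matters for debugging and for reproducing published results.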

Native Python Scripting

Python reigns as the premier programming language in scientific and academic research, owing largely to its elegance, readability, and the abundant array of scientific libraries at one’s disposal. Vizard astutely employs Python as its scripting language, which, in many instances, streamlines the process for researchers embarking on programming their VR experiments by leveraging a language they are already acquainted with. This familiarity is an invaluable asset, as it allows researchers to dive into their work with greater ease and confidence.

Conversely, Unity employs C# as its principal language. Though C# is indisputably powerful and versatile, it can pose a more daunting learning curve for individuals who lack a background in programming. This distinction is critical to consider, as it means that researchers might face additional challenges in mastering C# compared to the more accessible Python used in Vizard. For those eager to dive into VR development without the hurdle of a steep learning curve, Vizard's choice of Python could be a decisive advantage.

AI Enabled Development

As a platform based on Python, Vizard is well-positioned for AI-enabled development. Language-based AI tools such as ChatGPT can generate Python code from verbal descriptions, and this code can be directly incorporated into Vizard. This compatibility makes it straightforward to integrate AI functionalities into VR projects, efficiently bridging the gap between artificial intelligence and virtual reality development in a practical manner and enabling an individual researcher to become more productive. 

With partially GUI-based platforms like Unity, AI tools can suggest what the user should do in the GUI, but at this point they cannot operate the GUI on their own. In Vizard, by contrast, you can use AI language models to generate your actual code and experiment without further intervention, since Vizard features a completely Python-based IDE. Furthermore, you can now use AI to generate ideas for VR experiments, or to find and create image, video, and 3D object resources from text prompts, making development even quicker. Read more about this topic in our blog post “How to leverage AI for creating VR experiments”.

Leveraging Python Libraries

Vizard includes a built-in package manager that allows you to quickly and easily extend Vizard with Python libraries. This allows you to add popular and powerful libraries such as NumPy, Pandas, Matplotlib, SciPy, TensorFlow, and more. With various Python libraries installed, you can build advanced data analytics directly into your application, construct more sophisticated GUI layers for end users, read and interpret conditions and settings out of a CSV or Excel file, or utilize machine learning models, to name just a few.
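For instance, reading per-participant conditions out of a CSV file takes only a few lines of standard-library Python. The file format and field names below are hypothetical, chosen purely for illustration:

```python
import csv
import io

# Hypothetical conditions file; in practice this would be a CSV or
# Excel sheet prepared by the experimenter, as described above.
conditions_csv = """participant,condition,stimulus_duration_ms
1,control,250
2,treatment,250
3,control,500
"""

def load_conditions(fileobj):
    """Read per-participant settings from a CSV into dicts,
    casting the numeric fields so they are ready to use in a script."""
    rows = []
    for row in csv.DictReader(fileobj):
        row["participant"] = int(row["participant"])
        row["stimulus_duration_ms"] = int(row["stimulus_duration_ms"])
        rows.append(row)
    return rows

conditions = load_conditions(io.StringIO(conditions_csv))
```

With Pandas installed via the package manager, the same load collapses to a one-liner (`pandas.read_csv`), with Excel files handled just as easily.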

Hardware and Device Connectivity

Vizard stands out by offering native support for an ever-expanding array of VR hardware, including headsets, VR projection systems, motion capture systems, eye trackers, biofeedback, haptic devices, and more. This extensive hardware support, particularly for devices that are integral to research environments, empowers you to create a vast array of VR experiments with ease. The convenience of not having to scour the internet for specific plugins is a significant time-saver, allowing researchers to focus their energy on innovation and exploration within their VR experiments. Take a look at our list of natively supported devices here.

Out of the box, Vizard includes a GUI-based tool for configuring different hardware setups called VizConnect. If you are using a common VR headset such as a Vive or Oculus, you simply select it from the built-in hardware menu and you are good to go. As your research progresses, you can reconfigure VR scenes for different and more complex hardware setups using the prebuilt configurator, bringing together motion capture, biofeedback, and eye tracking data in one VR scenario quickly and easily. It also means you won’t have to rebuild your experiment two years from now when you adopt the latest hardware: the same VR application can run on different displays and in different configurations with minimal adaptation from one setup to the next. Unity also supports many VR devices, but Vizard’s research focus means it is more likely to support the niche, older, and specialized devices common in academic labs. Unity can be integrated with many devices through plugins and custom development, but it may take some work to pull all the pieces together.

Vizard natively supports a broad variety of VR hardware

Eye Tracking Support

Headset-enabled eye tracking is gaining traction as an accessible technique in psychological research and human factors, aiding in the analysis of attention, cognitive processes, and user engagement. Vizard excels by natively supporting a diverse array of VR headsets with integrated eye-tracking devices. Integrating eye tracking into experiments using Vizard is easy and requires just a few lines of script. Adding to Vizard’s robust native eye tracking capabilities, our SightLab VR Pro software serves as an intuitive yet powerful extension. It enables you to design and deploy virtual reality eye tracking experiments using a GUI configurator, with the added benefit of access to Vizard's underlying source code for effortless customization and tweaking. In contrast, while Unity does offer integration possibilities with eye trackers, it typically necessitates additional plugins or bespoke development, which can be more time-consuming and less streamlined compared to Vizard's solution.

Biometric Device Integration

Vizard provides support for integrating a wide range of biometric devices, such as EEG (electroencephalography) machines for measuring brain activity, GSR (galvanic skin response) devices for measuring emotional arousal, and heart rate monitors for tracking physiological arousal. This allows you to incorporate physiological measurements into your research, providing another layer of data to help understand user responses. You can even set flags and triggers that start events in a VR scene based on physiological conditions, such as lowering a raised platform when a user’s heart rate exceeds a certain threshold, or setting a virtual ball to expand and contract in real time with a user’s respiration. By default, Unity does not provide native support for these devices, requiring custom integrations.
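A physiological trigger of the kind described above can be sketched in plain Python. The callback shape and names here are our own, for illustration only, not Vizard's event API:

```python
def make_threshold_trigger(threshold, action, fire_once=True):
    """Return a callback that fires `action` when a streamed biometric
    sample crosses `threshold` -- e.g. lowering a platform when heart
    rate exceeds a limit."""
    state = {"fired": False}

    def on_sample(value):
        # Fire on the first sample above threshold; optionally only once.
        if value > threshold and not (fire_once and state["fired"]):
            state["fired"] = True
            action(value)
    return on_sample

events = []
trigger = make_threshold_trigger(
    100, lambda bpm: events.append(f"lower_platform@{bpm}"))
for bpm in [72, 85, 101, 110, 98]:   # simulated heart-rate stream
    trigger(bpm)
```

The `fire_once` latch matters in practice: a noisy biometric signal hovering near the threshold would otherwise re-trigger the scene event on every sample.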

A Vizard application in action at Mass General and Martinos Center, incorporating non-invasive neuroimaging inside a WorldViz VR projection system for medical research.

Data Analytics

Vizard is optimized for data collection, giving you built-in access to raw data outputs from all your devices and sensors, as well as from events and interactions within the immersive virtual environment. Collecting a subject’s physiological, motion tracking, and eye tracking data, and recording the raw data of interactions within the environment, is easy and flexible in Vizard.

Some common data collection scenarios include:

  • Head position and rotation
  • Objects and areas of interest
  • Hand / controller position and rotation
  • Events within a VR environment (e.g. picking up an object etc.)
  • Coordinate data for eye position and pupil dilation (if using an eye tracker)
  • Interpersonal distance between two users, or between a user and an avatar
  • Much, much more!
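A minimal sketch of what a per-frame data record might look like, in plain Python with the standard `csv` module. The column names and the 90 Hz sample rate are illustrative assumptions, not Vizard's output schema:

```python
import csv
import io

def record_frame(writer, t_ms, head_pos, head_yaw, gaze_target):
    """Write one per-frame sample row: timestamp, head position and
    rotation, and the current gaze target -- mirroring the data
    collection scenarios listed above."""
    writer.writerow([f"{t_ms:.1f}",
                     *(f"{v:.3f}" for v in head_pos),
                     f"{head_yaw:.1f}",
                     gaze_target])

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["t_ms", "x", "y", "z", "yaw_deg", "gaze_target"])

# Simulate a few frames of a participant turning toward an object.
for frame in range(3):
    t = frame * (1000 / 90)               # assumed 90 Hz headset refresh
    yaw = 10.0 * frame
    record_frame(writer, t, (0.0, 1.7, 0.0), yaw, "red_cube")
```

One row per frame, keyed by a millisecond timestamp, is what makes it straightforward to later align VR events with separately recorded physiological or eye tracking streams.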

Professional Support and Custom Development

Last but not least: Vizard is backed by a dedicated support staff with decades of experience helping leading academic institutions achieve their VR research goals. Vizard comes with multiple support options, giving users direct email and phone access to the WorldViz team of experts, including on-staff scientists, programmers, and 3D artists. Our team is standing by to help troubleshoot complex problems and provide guidance informed by 20+ years of experience, so you can build and run studies quickly, efficiently, and confidently.

And if you want to skip the “building” part, we also offer custom development services: our in-house studio can build any experiment you can imagine. We’ve delivered research applications for the NIH, Stanford, the University of California, and the University of Houston, to name just a few.


In conclusion, the choice between Vizard and a game engine should be guided by the specific needs of your project. For research-focused VR applications that require precise control, extensive data collection capabilities, and integrated support for scientific hardware, Vizard has clear advantages. However, there are reasons a researcher might choose Unity or Unreal instead, such as access to programmers already familiar with game development, or budget restrictions that call for free or low-cost software.

Interested in Vizard and SightLab VR? Contact us at sales@worldviz.com for quotes or trial licenses.

Resources and Additional Reading

Vizard Software

SightLab VR Software

SightLab is an extension to Vizard that gives you a GUI-based starting point for eye-tracking-enabled VR experiments and modular Vizard scripts.

Additional Reference Material & Publications

Hardware & Software Guides
