Tutorials

Saturday, 19th March 2016

8:30 am – 5:00 pm, Magnolia
Towards Reusable VR Frameworks: an introduction using Simulator X

Dennis Wiebusch, Marc Erich Latoschik
Abstract: In VR research, creating disposable software is common practice. After an application has been used in a study or to demonstrate research results, it is often archived, and entirely new software is created in subsequent research projects. VR toolkits ease this process, but manually implemented features of the old software often cannot be reused easily and largely have to be reimplemented. This tutorial aims to reveal the reasons for this situation and to propose ways to facilitate reuse.


8:30 am – 12:15 pm, Gardenia
Human-Centered Design for Immersive Interactions

Jason Jerald
Abstract: VR has the potential to provide experiences and deliver results that cannot be achieved otherwise. However, interacting with immersive applications is not always straightforward: it is not just about giving users an interface with which to reach their goals, but also about letting them work intuitively, in a way that is pleasurable and free of frustration. Although VR systems and applications are incredibly complex, it is up to designers to take on the challenge of having the VR application effectively communicate to users how the virtual world and its tools work, so that users can achieve their goals in an elegant manner.


1:45 pm – 5:00 pm, Gardenia
Structural Equation Modeling for Human-Subject Experiments in Virtual and Augmented Reality

Bart P. Knijnenburg
Abstract: User experiments are an essential tool to evaluate the user experience of AR/VR systems. This tutorial teaches state-of-the-art methods to statistically evaluate the outcomes of such experiments using psychometric measurement scales, factor analysis, and structural equation modeling. The R package “lavaan” will be used to conduct example analyses. The tutorial is designed for participants who have attended the Swan & Gabbard tutorial in previous years, or who have taken an introductory methods course at their home institution.
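To give a flavor of the analyses covered, below is a minimal sketch of fitting a small measurement-plus-structural model from lavaan-style syntax. It is an illustration only: it uses the Python package semopy (which accepts lavaan-like model descriptions) rather than lavaan itself, and the construct names, item names (presence, enjoyment, pres1–pres3, enj1–enj3), and data file are hypothetical, not taken from the tutorial.

    import pandas as pd
    from semopy import Model  # assumption: semopy parses lavaan-style model syntax

    # Hypothetical model: three questionnaire items load on a latent "presence"
    # factor, three on "enjoyment", and presence predicts enjoyment.
    MODEL_DESC = """
    presence  =~ pres1 + pres2 + pres3
    enjoyment =~ enj1 + enj2 + enj3
    enjoyment ~ presence
    """

    data = pd.read_csv("questionnaire_responses.csv")  # hypothetical data file
    model = Model(MODEL_DESC)
    model.fit(data)
    print(model.inspect())  # estimated loadings, regression weights, p-values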



Sunday, 20th March 2016

8:30 am – 12:15 pm, Magnolia
Optical See-Through AR Calibration: Methods for Current & Next Generation Head-Mounted Displays

Kenneth R Moser
Abstract: With the advent of consumer-level optical see-through head-mounted displays, such as the Epson Moverio and Google Glass, and with next-generation technology, including the Microsoft HoloLens and Epson Pro BT-2000, on the horizon, augmented reality is poised for an explosion of new applications targeting the general populace. This trend is likewise producing a growing need for intuitive and easily implemented calibration procedures accessible to researchers, developers, and novice users alike. The promise of integrated depth sensors in future display releases greatly enhances the possibilities for robust stereo calibration mechanisms tailored to consumer devices. This half-day tutorial will provide attendees of all skill levels with the knowledge and techniques required to effectively implement calibration procedures in their own optical see-through augmented reality systems. Participants from the mixed and virtual reality communities will find these techniques beneficial to their applications as well.
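For additional context (not part of the tutorial abstract), the core numerical step in classic SPAAM-style optical see-through calibration is estimating a 3x4 projection matrix from 2D–3D correspondences collected while the user aligns on-screen reticles with a tracked target. A minimal direct linear transformation (DLT) sketch, with hypothetical function and variable names:

    import numpy as np

    def estimate_projection(points_3d, points_2d):
        # Estimate a 3x4 projection matrix via DLT.
        # points_3d: (N, 3) tracked target positions; points_2d: (N, 2) screen
        # alignment points; requires N >= 6 non-degenerate correspondences.
        A = []
        for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
            A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
            A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
        # The solution is the right singular vector associated with the
        # smallest singular value, reshaped to 3x4.
        _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
        return Vt[-1].reshape(3, 4)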


8:30 am – 12:15 pm, Gardenia
VR and AR best practices for games and applications using the Unity Engine

Arturo Núñez
Abstract: Tools for Augmented Reality and Virtual Reality development are now accessible to a wide range of developers: free graphics engines, affordable hardware, and familiar APIs for building software are all available. However, the knowledge needed to create optimized, high-quality experiences is still maturing. Using the Unity game engine, we will explore best practices for development (FPS, experience, input, UI).


1:45 pm – 5:30 pm, Magnolia
Eye Tracking in Desktop VR: Data Synchronization, Capture, Visualization, and Analysis

A. T. Duchowski, J. Bertrand, M. Volonte
Abstract: Eye tracking allows measurement of one’s overt visual attention, in particular where, when, for how long, and in what order areas of the Virtual Reality (VR) display were fixated. Eye tracking has become a popular method for investigating human cognitive processes, especially visual attention. In VR, eye tracking can produce insights into how humans interact with tools, virtual space, and/or virtual agents. These insights can be turned into guidelines for interaction design and/or more effective virtual environments. Incorporating eye tracking methodology into one’s research in an efficient and effective way, however, requires a variety of technical capabilities and know-how [6]. These specifically include knowledge of the physical and cognitive background of human visual processing, technical skills to cope with large amounts of eye tracking data, statistical methods to interpret the data in a meaningful way, and competence in designing an empirical eye tracking experiment. Of particular interest is connecting a table-mounted eye tracker to a Virtual Reality application built with an engine such as Unity 3D.
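As an illustration of the data-analysis step mentioned above (an addition, not part of the abstract), the following is a minimal dispersion-threshold (I-DT) fixation detection sketch; the function names and threshold defaults are hypothetical, not values prescribed by the tutorial.

    import numpy as np

    def dispersion(x, y):
        # window dispersion: horizontal extent plus vertical extent
        return (x.max() - x.min()) + (y.max() - y.min())

    def idt_fixations(t, x, y, max_dispersion=0.05, min_duration=0.10):
        # I-DT fixation detection. t: timestamps in seconds; x, y: gaze
        # coordinates (e.g., normalized screen units).
        # Returns (start_time, end_time, centroid_x, centroid_y) tuples.
        t, x, y = np.asarray(t), np.asarray(x), np.asarray(y)
        fixations, i, n = [], 0, len(t)
        while i < n:
            # initial window spanning at least the minimum fixation duration
            j = i
            while j < n and t[j] - t[i] < min_duration:
                j += 1
            if j >= n:
                break
            if dispersion(x[i:j + 1], y[i:j + 1]) <= max_dispersion:
                # extend the window while dispersion stays below the threshold
                while j + 1 < n and dispersion(x[i:j + 2], y[i:j + 2]) <= max_dispersion:
                    j += 1
                fixations.append((t[i], t[j], x[i:j + 1].mean(), y[i:j + 1].mean()))
                i = j + 1
            else:
                i += 1  # no fixation starts here; slide the window by one sample
        return fixations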