The official banner for the IEEE Conference on Virtual Reality and 3D User Interfaces, showing a kiwi wearing a VR headset overlaid on an image of Mount Cook and a braided river.

Tutorials

Saturday, March 12, NZDT (UTC+13)
Remote Collaboration using Augmented Reality: Development and Evaluation: 8:00 - 9:30
Developing Situated Analytics Applications with RagRug: 10:00 - 11:30
Empathy-enabled Extended Reality: 12:00 - 13:30

Sunday, March 13, NZDT (UTC+13)
Build Your Own Social Virtual Reality With Ubiq, Part 1: 8:00 - 9:30
Build Your Own Social Virtual Reality With Ubiq, Part 2: 10:00 - 11:30
Emotion and Touch in Virtual Reality: 12:00 - 13:30

Tutorial: Remote Collaboration using Augmented Reality: Development and Evaluation

Saturday, March 12, 8:00 - 9:30, NZDT (UTC+13)

Organisers: Bernardo Marques, Samuel Silva, Paulo Dias, Beatriz Sousa Santos - University of Aveiro, Portugal

Discord URL: https://discord.com/channels/842181663248482334/951007684335898635

Collaboration is essential in industrial, medical, and educational contexts, among others. It can be described as a process of interdependent activities performed by co-located or remote collaborators to achieve a common goal. One major issue of remote collaboration is that collaborators do not share a common space or world. Here, Augmented Reality (AR) solutions can be powerful tools for establishing common ground, helping teams address complex problems and situations. Such solutions can lead to new insights, innovative ideas, and interesting artefacts.

This tutorial will present essential concepts associated with collaboration and AR technologies from a human-centered perspective, with an emphasis on what characterizes the collaborative effort (e.g., team, time, task, communication, scene capture, shared context sources, and user actuation) as well as what people need to maximize the remote collaboration effort. This establishes the grounds on which to assess and evolve Collaborative AR technologies.

Afterward, we will discuss the maturity of the field and a roadmap of important research actions that may help improve the characterization and evaluation of the collaboration process. This is particularly important because the current literature reports that most research efforts have been devoted to creating the enabling technology and overcoming engineering hurdles.

Moreover, most studies rely on single-user evaluation methods, which are not suitable for collaborative solutions and fall short of retrieving enough data for comprehensive evaluations. This reflects minimal support from existing frameworks and a lack of theories and guidelines. With the growing number of prototypes, the path to usable, realistic, and impactful solutions must entail an explicit understanding of how collaboration occurs through AR and how it may contribute to a more effective work effort.

The tutorial will end with a call to action. The evaluation process must move beyond a simple assessment of how the technology works. Conducting thorough evaluations is paramount to retrieve the necessary data and obtain a comprehensive perspective on the different factors of Collaborative AR: how teams work together, how communication happens, how AR is used to create common ground, among others. By understanding these factors (typically not reported in the literature), it may be possible to better define how research should progress and how the tools can evolve to improve the collaborative effort.

Tutorial: Developing Situated Analytics Applications with RagRug

Saturday, March 12, 10:00 - 11:30, NZDT (UTC+13)

Organisers: Dieter Schmalstieg, Philipp Fleck - Graz University of Technology, Austria

Discord URL: https://discord.com/channels/842181663248482334/951008175128182785

RagRug is the first open-source toolkit dedicated to situated analytics. Its abilities go beyond previous immersive analytics toolkits by focusing on the specific requirements that emerge when using augmented reality rather than virtual reality. RagRug lets users create visualizations that are (a) embedded with referents (specific physical objects in the environment) and (b) reactive to changes in the real world (both physical changes and changes in the data related to the referents). These capabilities are enabled by an easy-to-learn programming model ("event-driven functional reactive programming") on top of the Unity game engine, the Node-RED framework for the Internet of Things, and the JavaScript programming language. RagRug ensures these tried-and-tested components work seamlessly together and delivers visualizations that are both expressive and easy to use. It is important to note that RagRug does not break new ground in terms of the visualizations it can create; instead, it breaks new ground in how it integrates visualizations with referents. This ability comes from RagRug's support for modeling both the spatial and semantic properties of referents, and from its support for IoT sensors.

The modeling can be performed using a variety of tools, such as CAD modeling or 3D scanning. The results are placed in one or more database back-ends in such a way that an AR client application can use the user's current location or task description to formulate a meaningful query and retrieve relevant data on the fly, without prior configuration of the AR client.
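As a hedged illustration of this retrieval step (this is not RagRug's actual API; the endpoint, fields, and types below are invented for the sketch), a client might query a referent database by location like this:

    // Hypothetical referent lookup: the AR client sends its position and
    // receives nearby referents plus the data sources registered for them.
    interface Referent {
      id: string;                          // stable identifier of the physical object
      position: [number, number, number];  // location in the shared world frame
      meshUri: string;                     // geometry from CAD modeling or 3D scanning
      dataTopics: string[];                // IoT topics publishing data about this referent
    }

    // Ask an assumed back-end for referents within a radius of the user.
    async function findNearbyReferents(
      endpoint: string,
      position: [number, number, number],
      radiusMeters: number
    ): Promise<Referent[]> {
      const [x, y, z] = position;
      const response = await fetch(`${endpoint}/referents?x=${x}&y=${y}&z=${z}&r=${radiusMeters}`);
      if (!response.ok) {
        throw new Error(`Referent query failed: ${response.status}`);
      }
      return (await response.json()) as Referent[];
    }

Because the query is driven by the user's current location, the client needs no per-scene configuration: whatever referents are nearby determine which data is fetched.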

The visualization capabilities of RagRug build on the state of the art in immersive analytics, but extend it to allow real-time reactions to data streaming from sensors that observe changes in the environment. When new data comes in from the sensors, the situated visualization changes automatically. Programmers do not have to worry about the "how"; they can concentrate on the "what" of situated visualization.
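To make this "what" versus "how" distinction concrete, here is a minimal sketch of event-driven reactive streams. It is not RagRug code; the Stream class is invented for illustration, standing in for the Node-RED/JavaScript machinery the toolkit actually uses:

    // A tiny event stream: values flow in, subscribers react.
    type Listener<T> = (value: T) => void;

    class Stream<T> {
      private listeners: Listener<T>[] = [];
      subscribe(fn: Listener<T>): void { this.listeners.push(fn); }
      emit(value: T): void { this.listeners.forEach((fn) => fn(value)); }
      // Derive a new stream by applying a pure function to each event.
      map<U>(fn: (value: T) => U): Stream<U> {
        const out = new Stream<U>();
        this.subscribe((v) => out.emit(fn(v)));
        return out;
      }
    }

    // Sensor readings for a referent, e.g. a machine on a factory floor.
    const temperatureC = new Stream<number>();

    // The "what": declare how data maps to a visual property.
    const barHeight = temperatureC.map((celsius) => celsius / 100);

    // The "how" (applying the value to the embedded visual) is handled
    // by the toolkit; a console log stands in for it here.
    barHeight.subscribe((h) => console.log(`set bar height to ${h}`));

    temperatureC.emit(42); // new sensor event -> visualization updates automatically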

Tutorial: Empathy-enabled Extended Reality

Saturday, March 12, 12:00 - 13:30, NZDT (UTC+13)

Organiser: Denis Gračanin - Virginia Tech, USA

Discord URL: https://discord.com/channels/842181663248482334/951008544818335764

Empathy is defined as the ability to understand and share others' feelings, a critical part of meaningful social interactions. There are different types of empathy, such as cognitive, emotional, and compassionate empathy. A major claim about Extended Reality (XR) is that it can foster empathy and elicit empathetic responses through digital simulations. The availability of portable and affordable bio-sensors (especially contactless ones) makes it feasible to measure physiological and other signals in real time while using XR. These measurements (such as heart rate, breathing rate, facial expressions, electro-dermal activity, and EEG) can inform, in real time, about the user's cognitive and emotional state and enable empathetic responses and measurements. This information can be used both to evaluate the impact of XR content on the user and to adapt XR content based on the user's state.

The goal of this tutorial is to introduce participants to the concept of empathy and its use in XR applications. Participants will learn how to incorporate contextualized empathy data, information, and services into XR applications and use them to improve the user experience. The use of empathy will be presented from two points of view. First, we will view XR as an 'empathy machine' that elicits empathetic responses in users. Second, we will view XR as an 'empathetic entity' that empathizes with users and adjusts the XR user experience accordingly. Empathy-enabled XR combines these two views and provides a two-way empathy link between users and XR. Several example applications from the healthcare and education domains will be presented.
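As a purely illustrative sketch of this two-way link (not taken from the tutorial materials; the thresholds and signal names are invented), a system might estimate the user's state from bio-signals and adapt the scene accordingly:

    // Physiological signals sampled from (possibly contactless) bio-sensors.
    interface BioSignals {
      heartRateBpm: number;
      breathingRateBpm: number;
      electrodermalActivity: number; // in microsiemens
    }

    type UserState = "calm" | "stressed";

    // Naive rule-based estimate; a real system would use calibrated,
    // per-user models rather than these made-up fixed thresholds.
    function estimateState(s: BioSignals): UserState {
      return s.heartRateBpm > 100 || s.electrodermalActivity > 8 ? "stressed" : "calm";
    }

    // XR as an "empathetic entity": adjust the experience to the user's state.
    function adaptScene(state: UserState): void {
      if (state === "stressed") {
        console.log("dim lighting, slow pacing, reduce stimulus density");
      } else {
        console.log("resume normal pacing and intensity");
      }
    }

    adaptScene(estimateState({ heartRateBpm: 112, breathingRateBpm: 18, electrodermalActivity: 9.1 }));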

Tutorial: Build Your Own Social Virtual Reality With Ubiq

Part 1: Sunday, March 13, 8:00 - 9:30, NZDT (UTC+13)
Part 2: Sunday, March 13, 10:00 - 11:30, NZDT (UTC+13)

Organisers: Anthony Steed, Sebastian Friston, Ben Congdon - University College London, UK

Discord URL: https://discord.com/channels/842181663248482334/951008715941752873

One of the most promising applications of consumer virtual reality technology is remote collaboration. A very wide variety of social virtual reality (SVR) applications are now available, from competitive games among small numbers of players through to conference-like setups supporting dozens of visitors. Indeed, many participants at IEEE Virtual Reality 2022 will be experiencing at least some of the conference through an SVR application. The implementation strategies of different SVR applications are very diverse, with few standards or conventions to follow. There is an urgent need for researchers to be able to develop and deploy test systems so as to facilitate a range of research, from new protocols and interaction techniques for SVRs through to multi-participant experiments on the impact of avatar appearance. This tutorial will explain the key concepts behind SVR software and introduce Ubiq, an open-source (Apache licence) platform for developing your own SVR applications.
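For readers new to the area, the sketch below shows the kind of state-synchronization loop at the core of most SVR systems. It is a generic illustration, not Ubiq's actual API; the Room class and message shape are invented here:

    // Each client periodically broadcasts its avatar pose to the room and
    // applies the latest poses received from the other participants.
    interface PoseMessage {
      peerId: string;                                    // whose pose this is
      head: { position: number[]; rotation: number[] };  // tracked HMD pose
      timestamp: number;                                 // for ordering/interpolation
    }

    class Room {
      private peers = new Map<string, PoseMessage>();

      // In a real system this would travel over WebSocket/WebRTC; here we
      // simply keep the most recent pose per peer.
      broadcast(msg: PoseMessage): void {
        this.peers.set(msg.peerId, msg);
      }

      // Remote avatars are driven by the latest pose of each other peer.
      remotePoses(exceptPeerId: string): PoseMessage[] {
        return [...this.peers.values()].filter((p) => p.peerId !== exceptPeerId);
      }
    }

    const room = new Room();
    room.broadcast({ peerId: "alice", head: { position: [0, 1.7, 0], rotation: [0, 0, 0, 1] }, timestamp: Date.now() });
    console.log(room.remotePoses("bob")); // bob renders alice's avatar from this pose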

Tutorial: Emotion and Touch in Virtual Reality

Sunday, March 13, 12:00 - 13:30, NZDT (UTC+13)

Organisers: Darlene Barker, Haim Levkowitz - University of Massachusetts Lowell, USA

Discord URL: https://discord.com/channels/842181663248482334/951008848146206730

To make more of an impact on social interaction within virtual reality (VR), we need to consider the impact of emotions on our interpersonal communications and how we can express them within VR. This tutorial will present introductory research on the topic, in which we propose using emotions, based on voice, facial expressions, and touch, to create the emotional closeness and nonverbal intimacy needed in nonphysical interpersonal communication. Virtual and long-distance communications lack the physical contact of in-person interaction, as well as the nonverbal cues that enhance what a conversation is conveying. Haptic devices and tactile sensations can help deliver touch between parties, and machine learning can be used for emotion recognition based on data collected from other sensory devices; all of this works towards better long-distance communication.
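As a toy illustration of the emotion-recognition step (the features, centroid values, and labels below are invented; a real system would learn models from labeled voice, face, and touch data), a nearest-centroid classifier over fused sensor features might look like this:

    type Emotion = "happy" | "sad" | "angry";

    // A feature vector fused from several modalities, e.g.
    // [normalized voice pitch, smile intensity, touch pressure].
    type Features = [number, number, number];

    // Made-up prototype feature vectors for each emotion.
    const centroids: Record<Emotion, Features> = {
      happy: [0.6, 0.9, 0.3],
      sad:   [0.2, 0.1, 0.1],
      angry: [0.9, 0.2, 0.8],
    };

    function distance(a: Features, b: Features): number {
      return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
    }

    // Nearest-centroid classification: pick the emotion whose prototype
    // is closest to the observed features.
    function classify(sample: Features): Emotion {
      return (Object.keys(centroids) as Emotion[]).reduce((best, e) =>
        distance(sample, centroids[e]) < distance(sample, centroids[best]) ? e : best
      );
    }

    console.log(classify([0.7, 0.8, 0.2])); // -> "happy"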

In today's world, we are forced to remain apart due to the global pandemic and the safety measures needed to prevent its spread, so having a better means of communicating with loved ones would be invaluable. Outside of the pandemic, the ability to experience touch and other senses within VR could help enhance communication between family members who live far apart or are separated because some must travel for work. Those who are visually impaired may also benefit from such technology.

Conference Sponsors

Diamond

Virbela

Gold

ChristchurchNZ

iLRN

University of Canterbury

Silver

Qualcomm

Bronze

HITLab NZ

Supporters

ARIVE

Multimodal Technologies and Interaction

NVIDIA

Pico

XR Bootcamp

Doctoral Consortium Sponsors

National Science Foundation (NSF)

Conference Partner

