The official banner for the IEEE Conference on Virtual Reality + User Interfaces, comprising a kiwi wearing a VR headset overlaid on an image of Mount Cook and a braided river.
Research Demos
Distant Hand Interaction Framework in Augmented Reality
Mid-air Haptic Texture Exploration in VR
We Are Oulu: Exploring Situated Empathy through a Communal Virtual Reality Experience
Asymmetric interfaces with stylus and gesture for VR sketching
Pixel Processor Arrays For Low Latency Gaze Estimation
Aroaro - A Tool for Distributed Immersive Mixed Reality Visualization
3DColAR: Exploring 3D Colour Selection and Surface Painting for Head Worn AR using Hand Gestures
B-Handy: An Augmented Reality System for Biomechanical Measurement
ORUN - A Virtual reality serious-game for kinematics learning
Demonstrating Immersive Gesture Exploration with GestureExplorer
NUX IVE - a research tool for comparing voice user interface and graphical user interface in VR
Feeding the fish: Interaction design to support listening to accounts of marginalization
Intelligence Visualization for Wave Energy Power Generation
Liquid digital twins based on magnetic fluid toy

Distant Hand Interaction Framework in Augmented Reality

Jesus Ugarte, Nahal Norouzi, Austin Erickson, Gerd Bruder, Greg Welch

Booth: C27 - Expo Hall A

Teaser Video: Watch Now

Recent augmented reality (AR) head-mounted displays support shared multi-user experiences, with past research studying the enhancement of interpersonal communication cues. However, less is known about distant interaction in AR and, in particular, distant communication. In this demonstration, we present a research framework for distant hand interaction in AR, including different techniques for hand communication, such as distant communication through symbolic hand gestures and distant drawing.

Mid-air Haptic Texture Exploration in VR

Orestis Georgiou, Jonatan Martinez, Abdenaceur Abdouni, Adam Harwood

Booth: C32 - Expo Hall A

Teaser Video: Watch Now

Mid-air haptic feedback has traditionally been used to enhance gesture input interactions. Here we present a VR research demo that expands such interactions to include active haptic exploration. In the demo, the user can explore a virtual object with both hands and feel its intrinsic properties, such as consistency and texture. The paper describes the physical apparatus used, the haptic rendering techniques leveraged, and the demo's relevance to applications such as VR shopping.

We Are Oulu: Exploring Situated Empathy through a Communal Virtual Reality Experience

Mohammad Sina Kiarostami, Aku Visuri, Simo Hosio

Booth: C26 - Expo Hall A

Teaser Video: Watch Now

In this research, we explore and measure situated empathy. We focus on the hardships of an international community in a foreign country using a virtual reality experiment. Our aim is to facilitate a better understanding of an international community's quality of life and unique difficulties in a society. To this end, we designed a VR experiment with three main stages: data collection, a pre-experiment questionnaire, and a post-experiment questionnaire with a concluding interview.

Asymmetric interfaces with stylus and gesture for VR sketching

Qianyuan Zou, Huidong Bai, Lei Gao, Allan Fowler, Mark Billinghurst

Booth: C33 - Expo Hall A

Teaser Video: Watch Now

Virtual Reality (VR) can be used for design and artistic applications. However, traditional symmetrical input devices are not specifically designed as creative tools and may not fully meet artists' needs. In this demonstration, we present a variety of tool-based asymmetric VR interfaces that help artists create artwork with better performance and less effort. We conducted a pilot study showing that most users prefer to create art with a different tool in each hand.

Pixel Processor Arrays For Low Latency Gaze Estimation

Laurie Bose, Piotr Dudek, Stephen Carey, Jianing Chen

Booth: C25 - Expo Hall A

Teaser Video: Watch Now

We demonstrate gaze tracking at over 10,000 Hz, with processing latency below 0.1 ms, using a Pixel Processor Array (PPA) vision sensor. The PPA allows visual data to be processed efficiently at the point of light capture. By extracting the features used for gaze tracking on the PPA, we reduce the data transferred from sensor to processor from entire images to a handful of contextual bytes, saving significant power and time and allowing frame rates far exceeding those of traditional camera sensors.

Aroaro - A Tool for Distributed Immersive Mixed Reality Visualization

Fernando Beltrán, David C White, Jing Geng

Booth: C34 - Expo Hall A

In this research demo we present three immersive scenarios on three XR modalities: VR, immersive AR on HoloLens, and 2D AR on Android. The scenarios are a network of Harry Potter characters in VR, a map-based HoloLens visualization of a soldier's history with rich attributes including images and sound, and an Android visualization of car racing. These visualizations were created with Aroaro, our distributed mixed reality data visualization tool.

3DColAR: Exploring 3D Colour Selection and Surface Painting for Head Worn AR using Hand Gestures

Louise M Lawrence, Gun Lee, Mark Billinghurst, Damien Rompapas

Booth: TBD

Teaser Video: Watch Now

Colour selection and surface painting have been largely unexplored in head-worn AR using hand gestures. We present a system with two key approaches for painting a virtual 3D model using mid-air hand gestures: a virtual pen that the user can grasp with their hand, and direct use of the user's fingertip. Through several user studies, we hope to explore how the various techniques affect users performing surface painting of virtual objects with mid-air hand gestures.

B-Handy: An Augmented Reality System for Biomechanical Measurement

James O Campbell, Alvaro Cassinelli, Daniel Saakes, Damien Rompapas

Booth: TBD

Teaser Video: Watch Now

The study of biomechanics allows us to infer measurements without measuring tools. A limitation comes from the complex mental spatial transformations involved: the efficiency of this task degrades as the measurements become larger. We present a system that offloads this mental workload by providing visual transformations of space, in the form of tracking and duplicating the user's hand in AR.

ORUN - A Virtual reality serious-game for kinematics learning

Jhasmani Tito, Regina Moraes, Tania Basso

Booth: TBD

Teaser Video: Watch Now

Virtual Reality is a new educational technology of particular importance due to the possibilities it offers, such as hands-on experiences for learning physical phenomena. This demo is a VR-based serious game focused on learning specific concepts of kinematics. The game is intended to deliver an immersive experience in which the student plays an active role, with a game design that incorporates theoretical concepts to maintain engagement throughout the tasks.

Demonstrating Immersive Gesture Exploration with GestureExplorer

Ang Li, Jiazhou Liu, Maxime Cordeil, Barrett Ens

Booth: TBD

Teaser Video: Watch Now

We demonstrate GestureExplorer, which features versatile immersive visualisations that grant the user free control over their perspective, allowing them to gain a better understanding of gestures. It provides multiple data visualisation views and interactive features to support the analysis and exploration of gesture data sets. This demonstration shows the potential of GestureExplorer to provide a useful and engaging experience for exploring gesture data.

NUX IVE - a research tool for comparing voice user interface and graphical user interface in VR

Karolina Buchta, Piotr Wójcik, Mateusz Pelc, Agnieszka Górowska, Duarte Mota, Kostiantyn Boichenko, Konrad Nakonieczny, Krzysztof Wrona, Marta Szymczyk, Tymoteusz Czuchnowski, Justyna Janicka, Damian Gałuszka, Radosław Sterna, Magdalena Igras-Cybulska

Booth: TBD

Teaser Video: Watch Now

We introduce a new IVE designed to compare user interaction in a mode with a traditional graphical user interface (GUI) against a mode in which every interface element is replaced by a voice user interface (VUI). In each version, four scenarios of interaction with a virtual assistant in a sci-fi location are implemented, each lasting several minutes. The IVE is supplemented with tools for automatically generating reports on user behavior (click tracking, audio tracking, and eye tracking).

Feeding the fish: Interaction design to support listening to accounts of marginalization

Dylan Paré, John Craig, Scout Windsor

Booth: TBD

Teaser Video: Watch Now

We present a study using a virtual reality experience designed to deepen understanding of gender- and sexuality-based marginalization in STEM fields as a complex experience with individual and systemic dimensions. Our design supports learners in remaining engaged in listening to and learning about the marginalized other. Our analysis describes how participants reflect upon and use the interactions to move through the difficulties of listening to another's stories of marginalization.

Intelligence Visualization for Wave Energy Power Generation

Xiaocheng Liu, Yuqi Liu, Jinkang Guo, Ranran Lou, Zhihan Lv

Booth: TBD

Teaser Video: Watch Now

Ocean waves provide a large amount of renewable energy, and wave energy converters (WECs) can convert wave energy into electric energy using the linear motion of waves. This paper proposes a visualization platform for wave power generation. The platform can intelligently allocate power generation equipment based on power generation forecast data to precisely match power generation to power consumption, thereby improving overall power generation efficiency.

Liquid digital twins based on magnetic fluid toy

Yuqi Liu, Zengxu Bian, Xiaocheng Liu, Zhihan Lv

Booth: TBD

Teaser Video: Watch Now

As a new type of functional material, magnetic fluid has both the fluidity of a liquid and the magnetic properties of a solid magnetic material. By controlling magnets, one can simulate the effect of manipulating liquids like a sea emperor, which offers new ideas for the multiverse of the metaverse. This paper therefore aims to provide a control approach for future applications of magnetic fluid by performing a digital twin simulation of it.

Conference Sponsors

Diamond

Virbela

Gold

ChristchurchNZ

iLRN
University of Canterbury

Silver

Qualcomm

Bronze

HITLab NZ

Supporters

ARIVE

Multimodal Technologies and Interaction

NVIDIA

Pico

XR Bootcamp

Doctoral Consortium Sponsors

National Science Foundation (NSF)

Conference Partner


Code of Conduct

© IEEEVR Conference