Research Demos
Prototyping Large Scale Projection Based Projects in VR
Enhancing Virtual Material Perception with Vibrotactile and Visual Cues
A Novel Piezo-Based Technology for Haptic Feedback for XR
Spatially Augmented Reality on Non-rigid Dynamic Surfaces
ARCam: A User-Defined Camera for AR Photographic Art Creation
DENTORACULUS: A gamified Virtual Reality experience for dental morphology learning
Bumpy Sliding: An MR System for Experiencing Sliding Down a Bumpy Cliff
VRdoGraphy: An Empathic VR Photography Experience
Add-on Occlusion: Building an Occlusion-capable Optical See-through Head-mounted Display with HoloLens 1
Dynamic Scene Adjustment for Player Engagement in VR Game
Facilitating Asymmetric Interaction between VR Users and External Users via Wearable Gesture-Based Interface
Bringing Instant Neural Graphics Primitives to Immersive Virtual Reality

Prototyping Large Scale Projection Based Projects in VR

Clarice Hilton: University of Liverpool; Xueni Pan: Goldsmiths; Hankun Yu: none; Richard Koeck: University of Liverpool

Booth: Demo1 - Room D: CHANGAN

This research develops a technique for simulating large-scale immersive installations in VR. The project is a collaboration with Aardman Animations, a world-renowned animation studio developing a projection-based installation. We built a room-scale VR experience that simulates the installation and visualises the projection distortion under different conditions. A study was conducted to investigate the effect of distortion on the experience of immersion and how three factors affected the distortion.
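To make the geometry concrete, the sketch below (our illustration, not the authors' implementation) estimates how a projected pixel's footprint on a flat wall stretches with throw distance and surface obliquity, the kind of effect such a distortion visualiser must capture; the function name and all numbers are assumptions.

```python
import numpy as np

def pixel_footprint(projector_pos, ray_dir, plane_point, plane_normal, throw_ratio=1.2):
    """Estimate how a projected pixel stretches on a flat surface.

    Illustrative only: the footprint grows linearly with distance and by
    1/cos(incidence angle) as the surface tilts away from the projector axis.
    """
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    n = plane_normal / np.linalg.norm(plane_normal)
    # Ray-plane intersection: distance along the ray to the wall.
    t = np.dot(plane_point - projector_pos, n) / np.dot(ray_dir, n)
    hit = projector_pos + t * ray_dir
    # Pixel size at distance t for the given throw ratio (assumed 1920 px width).
    pixel_size = t / (throw_ratio * 1920)
    # Obliquity stretches the footprint by 1/cos(incidence angle).
    cos_incidence = abs(np.dot(ray_dir, n))
    return hit, pixel_size / cos_incidence

hit, footprint = pixel_footprint(
    projector_pos=np.array([0.0, 1.5, 0.0]),
    ray_dir=np.array([0.3, 0.0, 1.0]),      # an off-axis pixel
    plane_point=np.array([0.0, 0.0, 4.0]),  # wall 4 m away
    plane_normal=np.array([0.0, 0.0, -1.0]),
)
print(f"hit {hit}, footprint = {footprint * 1000:.1f} mm")
```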

Enhancing Virtual Material Perception with Vibrotactile and Visual Cues

Antony Tang: Auckland Bioengineering Institute; Mark Billinghurst: Auckland Bioengineering Institute; Samuel Rosset: Auckland Bioengineering Institute; Iain A Anderson: Auckland Bioengineering Institute

Booth: Demo2 - Room D: CHANGAN

The ability to feel is crucial for a more realistic Virtual Reality (VR) experience. This research demo presents a way to enhance the VR experience with a lightweight glove that provides vibrotactile feedback, allowing the user to feel the stiffness of virtual objects made of different materials. The demonstration presents a novel combination of visual information and vibrotactile feedback that has been shown to significantly improve a user's engagement with a VR environment.
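The demo's actual mapping is not published here, but a common way to render stiffness through vibrotactile cues is to drive vibration amplitude and frequency from penetration depth scaled by a per-material stiffness constant; the sketch below is a minimal illustration under that assumption, with made-up material values.

```python
# Illustrative stiffness-to-vibration mapping, not the demo's actual model.
# Stiffer materials produce stronger, higher-frequency bursts on contact.
MATERIALS = {"rubber": 0.2, "wood": 0.6, "steel": 1.0}  # assumed stiffness, 0..1

def vibration_command(material: str, penetration_m: float) -> dict:
    """Map contact penetration into an amplitude/frequency pair for the glove."""
    k = MATERIALS[material]
    amplitude = min(1.0, k * penetration_m / 0.01)   # saturate at 1 cm penetration
    frequency_hz = 80.0 + 170.0 * k                  # stiffer -> higher frequency
    return {"amplitude": amplitude, "frequency_hz": frequency_hz}

print(vibration_command("steel", 0.004))   # hard, buzzy contact
print(vibration_command("rubber", 0.004))  # soft, dull contact
```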

A Novel Piezo-Based Technology for Haptic Feedback for XR

Rolf Simon Adelsberger: Sensoryx; Alberto Calatroni: Sensoryx; Salar Shahna: Sensoryx

Booth: Demo3 - Room D: CHANGAN

We present a novel technology for tactile haptic feedback tailored to Virtual Reality (VR) and Augmented Reality (AR) experiences. In contrast to piezoelectric benders, eccentric rotating masses (ERMs), and linear resonant actuators (LRAs), our approach applies piezoelectric motors to provide realistic tactile haptic feedback.

Spatially Augmented Reality on Non-rigid Dynamic Surfaces

Aditi Majumder: UC Irvine; Muhammad Twaha Ibrahim: UC Irvine

Booth: Online - Room D: CHANGAN

We will demonstrate a spatially augmented reality system for non-rigid dynamic surfaces such as stretchable fabrics. Using a single projector and an RGB-D camera, our system automatically adapts the projection to the changing shape of the non-rigid surface. Such systems have applications in domains such as art, design, entertainment, and medicine. In particular, we will demonstrate the system as a surgical guidance aid for cleft palate surgery using a Simulare cleft model.
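One standard building block for such adaptation is re-mapping the reconstructed surface points into projector pixels every frame using the calibrated camera-to-projector transform; the sketch below illustrates that step only, with toy calibration values, and is not the authors' code.

```python
import numpy as np

def camera_to_projector_pixels(points_cam, R, t, K_proj):
    """Map 3D surface points from the RGB-D camera frame into projector pixels.

    points_cam: (N, 3) points reconstructed from the depth image.
    R, t: calibrated camera-to-projector rigid transform.
    K_proj: 3x3 projector intrinsic matrix.
    Re-running this per frame lets the projection follow a deforming surface.
    """
    points_proj = points_cam @ R.T + t          # into the projector frame
    uvw = points_proj @ K_proj.T                # perspective projection
    return uvw[:, :2] / uvw[:, 2:3]             # normalize to pixel coordinates

# Toy calibration: projector 10 cm to the right of the camera, same orientation.
R = np.eye(3)
t = np.array([-0.10, 0.0, 0.0])
K_proj = np.array([[1400.0, 0.0, 960.0],
                   [0.0, 1400.0, 540.0],
                   [0.0, 0.0, 1.0]])
surface = np.array([[0.0, 0.0, 1.0], [0.05, 0.02, 1.04]])  # two surface samples
print(camera_to_projector_pixels(surface, R, t, K_proj))
```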

ARCam: A User-Defined Camera for AR Photographic Art Creation

Xinyi Luo: University of Electronic Science and Technology of China; Zihao Zhu: The Hong Kong University of Science and Technology (Guangzhou); Yuyang Wang: Hong Kong University of Science and Technology; Pan Hui: The Hong Kong University of Science and Technology

Booth: Demo4 - Room D: CHANGAN

Photography in augmented reality can be challenging due to the restrictions of pre-defined settings. Adjustable photography settings and real-time previews matter for AR photographic creation because creators must tune multiple camera properties to achieve unique visual effects. In this work, we designed an AR camera (ARCam) with various adjustable properties that gives users a high degree of freedom for photographic art creation with real-time preview.

DENTORACULUS: A gamified Virtual Reality experience for dental morphology learning

Viky Cecilia Camarena Quispe: National University of Engineering; Julio César Cubas Aguinaga: National University of Engineering; Luis Gonzalo Díaz Huaco: National University of Engineering; Jhasmani Tito: Kusillo Games Studio

Booth: Demo5 - Room D: CHANGAN

Dentoraculus is an educational VR tool for teaching dental morphology. Its objective is to train dental students in a didactic and playful way and to complement the teaching of complex theoretical concepts and laboratory training. To enhance motivation, gamification and storytelling techniques were applied, and 3D models of dental pieces were obtained using photogrammetry. Finally, the experience was tested with teachers and students, with satisfactory results.

Bumpy Sliding: An MR System for Experiencing Sliding Down a Bumpy Cliff

Hiroki Tsunekawa: Tokyo Denki University; Akihiro Matsuura: Tokyo Denki University

Booth: Online - Room D: CHANGAN

We present an interactive MR system with which a player can experience the cartoon-like physical scenario of stabbing a knife into a cliff wall while sliding down a bumpy cliff. We developed two devices: a wall device that mimics the cliff surface, and a knife-shaped device with a retractable blade. The sliding speed is controlled using pressure data from the knife device, and the impact of hitting a rock is rendered with a solenoid in the knife's handle.
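A minimal sketch of the control mapping described above: blade pressure brakes the slide, and rock hits trigger the solenoid. The device bindings here are hypothetical stand-ins, not the demo's firmware.

```python
import random

MAX_SPEED = 5.0  # assumed top sliding speed, m/s

def read_knife_pressure() -> float:
    return random.uniform(0.0, 1.0)   # stand-in for the blade's pressure sensor

def fire_solenoid() -> None:
    print("thunk!")                   # stand-in for the impact solenoid

def update_slide(rock_hit: bool) -> float:
    # Pressing the blade harder into the wall brakes the slide.
    pressure = read_knife_pressure()
    speed = MAX_SPEED * (1.0 - pressure)
    if rock_hit:
        fire_solenoid()               # haptic impulse in the knife handle
    return speed

print(f"sliding at {update_slide(rock_hit=True):.1f} m/s")
```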

VRdoGraphy: An Empathic VR Photography Experience

Kunal Gupta: The University of Auckland; Yuewei Zhang: The University of Auckland; Tamil Selvan G: The University of Auckland; Prasanth Sasikumar: The University of Auckland; Nanditha Krishna: Amrita Vishwa Vidyapeetham; Musa Mahmood: OpenBCI, Inc.; Conor Russomanno: OpenBCI, Inc.; Mark Billinghurst: The University of Auckland

Booth: Online - Room D: CHANGAN

This demo presents VRdoGraphy, a VR photography application in which the environment and a virtual companion adapt to provide empathic interactions. Using the Galea VR HMD and its range of biosensors, we feed the user's EEG, EDA, EMG, and HRV biosignals into a live emotion prediction system that estimates the user's emotional state. The VR environment and virtual companion then adapt their appearance and tone based on the user's emotions. The demo shows how VR HMDs with integrated biosensors can provide empathic interactions.
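The sketch below shows the general shape of such a biosignal-to-scene pipeline; the feature proxies, thresholds, and scene presets are all assumptions for illustration, not the demo's actual emotion model.

```python
# Illustrative pipeline: biosignal features -> valence/arousal -> scene preset.
def predict_emotion(eeg_alpha: float, eda_scl: float, emg_rms: float, hrv_rmssd: float):
    valence = eeg_alpha - emg_rms          # crude proxies for illustration only
    arousal = eda_scl - hrv_rmssd
    return valence, arousal

def scene_preset(valence: float, arousal: float) -> str:
    # Map the 2D emotion estimate onto one of four assumed environment presets.
    if valence >= 0:
        return "sunny meadow" if arousal < 0 else "vivid festival"
    return "calm night sky" if arousal < 0 else "stormy coast"

v, a = predict_emotion(eeg_alpha=0.6, eda_scl=0.8, emg_rms=0.2, hrv_rmssd=0.3)
print(scene_preset(v, a))  # -> "vivid festival"
```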

Add-on Occlusion: Building an Occlusion-capable Optical See-through Head-mounted Display with HoloLens 1

Yan Zhang: Shanghai Jiao Tong University; Xiaodan Hu: NAIST; Kiyoshi Kiyokawa: Nara Institute of Science and Technology; Xubo Yang: Shanghai Jiao Tong University

Booth: Demo6 - Room D: CHANGAN

Hard-edge occlusion has been shown to be a key feature for optical see-through head-mounted displays (OSTHMDs), yet it is currently not supported by any commodity product on the market. In this research demo, we equip a HoloLens 1 with a novel add-on occlusion device to build an occlusion-capable optical see-through head-mounted display (OC-OSTHMD). A demo built in Unity shows that the integrated system achieves real-time virtual display with hard-edge occlusion.
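Conceptually, the add-on layer needs a per-pixel hard-edge mask of the virtual content so that covered pixels block incoming real-world light instead of blending with it; a minimal sketch of that mask step (our illustration, not the authors' pipeline):

```python
import numpy as np

def occlusion_mask(virtual_depth: np.ndarray) -> np.ndarray:
    """Binary mask for the add-on occlusion layer.

    Pixels covered by virtual content (finite depth) block real-world light,
    so virtual imagery is no longer washed out by the scene behind it.
    """
    return np.isfinite(virtual_depth).astype(np.uint8)  # 1 = occlude

depth = np.full((4, 4), np.inf)   # empty 4x4 view
depth[1:3, 1:3] = 0.5             # a virtual object 0.5 m away
print(occlusion_mask(depth))
```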

Dynamic Scene Adjustment for Player Engagement in VR Game

Zhitao Liu: Center for Future Media, the School of Computer Science and Engineering, UESTC, Chengdu, China; Yi Li: Center for Future Media, the School of Computer Science and Engineering, UESTC, Chengdu, China; Ning Xie: Center for Future Media, the School of Computer Science and Engineering, UESTC, Chengdu, China; YouTeng Fan: Center for Future Media, the School of Computer Science and Engineering, UESTC, Chengdu, China; Haolan Tang: Center for Future Media, the School of Computer Science and Engineering, University of Electronic Science and Technology of China; Wei Zhang: AVIC Chengdu Aircraft Design & Research Institute

Booth: Demo7 - Room D: CHANGAN

Virtual reality (VR) produces a highly realistic simulated environment with controllable environment variables. This paper proposes a Dynamic Scene Adjustment (DSA) mechanism based on the user's interaction status and performance, which adjusts the VR environment variables to improve the user's game engagement. We combined the DSA mechanism with a musical rhythm VR game. The experimental results show that the DSA mechanism can improve the user's game engagement (task performance).
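As a sketch of how a DSA-style rule could look in a rhythm game, one can steer note density toward a target hit rate; the update rule, constants, and bounds below are assumptions for illustration, not the paper's actual mechanism.

```python
# Illustrative dynamic scene adjustment: nudge note spawn rate toward the
# player's recent performance.
TARGET_HIT_RATE = 0.8   # keep the player challenged but mostly succeeding
GAIN = 0.5              # how aggressively the scene reacts

def adjust_spawn_rate(spawn_rate: float, hits: int, attempts: int) -> float:
    hit_rate = hits / max(attempts, 1)
    # Struggling players get fewer notes; skilled players get more.
    spawn_rate *= 1.0 + GAIN * (hit_rate - TARGET_HIT_RATE)
    return max(0.5, min(spawn_rate, 8.0))  # clamp to playable bounds

rate = 3.0
rate = adjust_spawn_rate(rate, hits=15, attempts=16)  # player doing well
print(round(rate, 2))  # spawn rate rises toward the upper bound
```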

Facilitating Asymmetric Interaction between VR Users and External Users via Wearable Gesture-Based Interface

Yuetong Zhao: Beihang University; Shuo Yan: Beihang University; Xuanmiao Zhang: Beihang University; Xukun Shen: Beihang University

Booth: Demo8 - Room D: CHANGAN

In our work, we designed a wearable gesture-based interface for external users in two main asymmetric VR scenarios: (1) Object Transition and (2) Collaborative Game. We conducted a hybrid user study with twenty participants, comparing gesture-based and controller-based interaction for external users. The study revealed that gesture-based interaction received more positive feedback than controller-based interaction.

Bringing Instant Neural Graphics Primitives to Immersive Virtual Reality

Ke Li: Deutsches Elektronen-Synchrotron (DESY); Tim Rolff: Universität Hamburg; Susanne Schmidt: Universität Hamburg; Simone Frintrop: Universität Hamburg; Reinhard Bacher: Deutsches Elektronen-Synchrotron (DESY); Wim Leemans: Deutsches Elektronen-Synchrotron (DESY); Frank Steinicke: Universität Hamburg

Booth: Demo9 - Room D: CHANGAN

The neural radiance field (NeRF), in particular its extension via instant neural graphics primitives, is a novel rendering method for view synthesis that uses real-world images to build photo-realistic immersive virtual scenes. Despite its enormous potential for virtual reality (VR) applications, there is currently little robust integration of NeRF into typical VR systems available for research and benchmarking in the VR community. In this work, we present an extension to instant neural graphics primitives that brings stereoscopic, high-resolution, low-latency, 6-DoF NeRF rendering to the Unity game engine for immersive VR applications.
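The core of stereoscopic NeRF rendering can be sketched as two radiance-field evaluations per frame, one per eye pose, whose results are handed to the headset compositor. Hmd and Nerf below are hypothetical stubs standing in for the real runtime and renderer, not the authors' Unity integration.

```python
import numpy as np

class Hmd:
    IPD = 0.064  # assumed interpupillary distance, metres

    def eye_pose(self, eye: str) -> np.ndarray:
        pose = np.eye(4)                       # camera-to-world transform
        pose[0, 3] = -self.IPD / 2 if eye == "left" else self.IPD / 2
        return pose                            # offset sideways per eye

    def submit(self, eye: str, rgb: np.ndarray) -> None:
        print(f"{eye}: {rgb.shape} frame submitted")  # stand-in for compositor

class Nerf:
    def render(self, pose: np.ndarray, w: int, h: int) -> np.ndarray:
        return np.zeros((h, w, 3))             # stand-in for the NeRF render

def render_frame(hmd: Hmd, nerf: Nerf, w: int = 1440, h: int = 1600) -> None:
    # Two NeRF evaluations per frame, one per eye, give correct stereo parallax.
    for eye in ("left", "right"):
        rgb = nerf.render(hmd.eye_pose(eye), w, h)
        hmd.submit(eye, rgb)

render_frame(Hmd(), Nerf())
```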
