The official banner for the IEEE Conference on Virtual Reality + User Interfaces, showing a kiwi wearing a VR headset overlaid on an image of Mount Cook and a braided river.
Research Demos
Did You Do Well? Real-Time Personalized Feedback on Catheter Placement in Augmented Reality-assisted Neurosurgical Training (ID: PO1006)
A Prototype Study on External Human Machine Interface (HMI) for Automated Bus - Preliminary Testing for Effectiveness Verification Using Virtual Reality (VR) (ID: PO1009)
Best Research Demo Award: Navigating Realities: Assessing Cross-Reality Transitions Through a Spatial Memory Game in VR and AR Environments (ID: PO1015)
Demonstrating Virtual Streamer with Conversational and Tactile Interaction (ID: PO1021)
ElectricalVRSki: Switching of Ankle Tendon Electrical Stimulation for Dynamic Modulation of Ground Shape Perception (ID: PO1028)
An Eye-tracked XR Visual Deficit Simulation Suite for Ocular Disease Education and Assisted Diagnosis (ID: PO1032)
Low-Fi VR Controller: Bringing 6DOF Interaction to Mobile VR (ID: PO1039)
Demonstrating Mid-Air Ultrasound Haptics with Thermal Display (ID: PO1040)
RoboART: Artistic Robot Programming in Mixed Reality (ID: PO1042)
The Owl: Virtual Teleportation through XR (ID: PO1046)
Ambient Lighting for Improving Passengers' Acceptance of Driverless Buses: Combining The Evaluation in Virtual and Real Environment (ID: PO1050)
Demonstrating Nebula: Dynamic Multi-Scents in 6DoF Virtual Reality (ID: PO1051)
Demonstrating PLUME, a toolbox to Record, Replay, Analyze and Share 6DoF XR Experimental Data (ID: PO1052)
Multimodal Approach for the Diagnosis of Neurodegenerative Disorders Using Augmented Reality (ID: PO1053)
Immersive Behavioral Therapy for Phobia Treatment in Individuals with Intellectual Disabilities (ID: PO1054)
Coffee Masterclass: An Experience of Co-Creation with Prompt Engineering and Generative AI for Immersive Environments Development (ID: PO1055)
Touch My Heart: Navigating the Heart Models in MR with Haptic Feedback, 3D Sound, and Interactive Gamification (ID: PO1056)
MeshReduce: Split Rendering of Live 3D Scene for Virtual Teleportation (ID: PO1058)
Best Research Demo Honorable Mention: GPT-VR Nexus: ChatGPT-Powered Immersive Virtual Reality Experience (ID: PO1060)
Design and Implementation of a Mixed Reality Human-Machine Interface for 3D Printer Real-Time Monitoring and Management (ID: PO1063)
Gamified Alzheimer's Disease Diagnosis via Virtual Reality (ID: PO1066)
Visual Guidance for Infant Lumbar Puncture Training in XR (ID: PO1070)
An Immersive Multi-User VR-based System for the Training of Electrical Substation Maintenance (ID: PO1071)
Simulation of an Aware Geriatric Virtual Human in Mixed Reality (ID: PO1072)
DriVR: Extending Driver Training for Persons with Disabilities (ID: PO1073)
BotanicAR: a cooperative experience in Augmented Reality (ID: PO1074)
rlty2rlty: Transitioning Between Realities with Generative Artificial Intelligence (ID: PO1075)
Multi-Projector Dynamic Spatially Augmented Reality on Deformable Surfaces (ID: PO1079)
MUN: an AI-powered multiplayer networking solution for VR games (ID: PO1080)
Demo: Volumetric Motion Annotation and Visualization for Immersive Sports Coaching (ID: PO1081)
Hybrid XRSpectator: A Hybrid Tabletop XR Sports Spectating Experience (ID: PO1091)
GenDeck: Towards a HoloDeck with Text-to-3D Model Generation (ID: PO1092)
Real-Time Virtual Human for Promoting Clinical Trial Education and Recruitment (ID: PO1096)
Integrating Cognitive Behavioral Therapy and Heart Rate Variability Biofeedback in Virtual Reality, Augmented Reality, and Mixed Reality for Stress Reduction (ID: PO1098)

Research Demos Schedule (Timezone: Orlando, Florida, USA; UTC-4)
Research Demo Booths Open Monday, 18 March 9:45‑10:00, 13:00‑13:30, 15:00‑15:30, 17:00‑19:00 Sorcerer's Apprentice Ballroom
Research Demo Booths Open Tuesday, 19 March 9:45‑10:00, 13:00‑13:30, 15:00‑15:30, 17:00‑17:30 Sorcerer's Apprentice Ballroom
Research Demo Booths Open Wednesday, 20 March 9:45‑10:00, 13:00‑13:30, 15:00‑15:30, 17:00‑17:30 Sorcerer's Apprentice Ballroom
Awards Thursday, 21 March 15:30‑17:00 Fantasia Ballroom H

Research Demos

Did You Do Well? Real-Time Personalized Feedback on Catheter Placement in Augmented Reality-assisted Neurosurgical Training (ID: PO1006)

Sangjun Eom, Duke University; Tiffany Ma, Duke University; Neha Vutakuri, Duke University; Alexander Du, Duke University; Zhehan Qu, Duke University; Joshua Jackson, Duke University; Maria Gorlatova, Duke University

External ventricular drain (EVD) placement is a common neurosurgical procedure, often performed by junior trainees, that relieves pressure buildup inside the brain by inserting a catheter to drain fluid. We demonstrate an AR-assisted neurosurgical training tool that provides real-time personalized feedback to trainees based on their manipulation of the surgical environment and their eye gaze patterns. We showcase how real-time AR feedback can help trainees improve their EVD performance.


A Prototype Study on External Human Machine Interface (HMI) for Automated Bus - Preliminary Testing for Effectiveness Verification Using Virtual Reality (VR) (ID: PO1009)

Sota Suzuki, National Institute of Advanced Industrial Science and Technology; Yanbin Wu, National Institute of Advanced Industrial Science and Technology; Toru Kumagai, National Institute of Advanced Industrial Science and Technology (AIST); Masaki Masuda, National Institute of Advanced Industrial Science and Technology; Koya Takahashi, National Institute of Advanced Industrial Science and Technology; Naohisa Hashimoto, National Institute of Advanced Industrial Science and Technology; Satoru Ogino, AIST/Tokyo University of Science

Unmanned autonomous vehicles lose the communication that previously existed between drivers and other road users. An external HMI is one solution to this problem. In this study, we are developing external HMIs using two VR environments, real-world settings, and actual prototypes. At the demonstration venue, participants can experience driving within the created VR environments.


Best Research Demo Award

Navigating Realities: Assessing Cross-Reality Transitions Through a Spatial Memory Game in VR and AR Environments (ID: PO1015)

Nico Feld, Trier University; Pauline Bimberg, University of Trier; Benjamin Weyers, Trier University; Daniel Zielasko, University of Trier

This tech demo offers an immersive exploration of the most prominent scene transitions within the Reality-Virtuality Continuum (RVC). It delves into the seamless integration of real and virtual worlds, showcasing a spectrum of environments ranging from entirely real to fully virtual and various transitions to switch between them. Our innovative approach centers around an engaging cross-environmental spatial memory game. This game is not just a playful experience but a carefully crafted task d...


Demonstrating Virtual Streamer with Conversational and Tactile Interaction (ID: PO1021)

Vaishnavi Josyula, The University of Texas at Dallas; Sowresh Mecheri-Senthil, The University of Texas at Dallas; Abbas Khawaja, The University of Texas at Dallas; Jose M. Garcia, The University of Texas at Dallas; Ayush Bhardwaj, The University of Texas at Dallas; Ashish Pratap, The University of Texas at Dallas; Jin Ryong Kim, The University of Texas at Dallas

This paper introduces an approach to an interactive virtual streamer that provides conversational and tactile interactions, offering greater immersion and personalization for the user. User interaction is categorized into various types, such as the spatial nature of the virtual streamer's stream, along with conversational and tactile interactions with the streamer. We demonstrate the virtual streamer in VR and evaluate the effectiveness of user interactions with it.


ElectricalVRSki: Switching of Ankle Tendon Electrical Stimulation for Dynamic Modulation of Ground Shape Perception (ID: PO1028)

Takashi Ota, The University of Tokyo; Keigo Matsumoto, The University of Tokyo; Kazusa Aoyama, Gunma University; Tomohiro Amemiya, The University of Tokyo; Takuji Narumi, The University of Tokyo; Hideaki Kuzuoka, The University of Tokyo

This demonstration introduces ElectricalVRSki, a VR skiing application that uses ankle tendon electrical stimulation (TES) to dynamically present various ground shape perceptions. The novelty of our dynamic sensory presentation system is its use of ankle TES, which can be controlled dynamically from a computer, and its controller is small and lightweight.


An Eye-tracked XR Visual Deficit Simulation Suite for Ocular Disease Education and Assisted Diagnosis (ID: PO1032)

Jason Orlosky, Augusta University; Tommy Bui, Augusta University; Nicole Winston, Augusta University; Wanda Jirau-Rosaly, Augusta University; Shilpa Brown, Augusta University

In this demo, we present a suite of visual deficits implemented in both augmented and virtual reality to assist with ocular disease education and diagnosis. We use eye tracking to better simulate the difficulties caused by diseases such as macular degeneration, diabetic retinopathy, retinal tears, and other gaze-dependent deficits. Virtual reality (VR) and video see-through augmented reality (VST AR) will give users a unique perspective of life through the eyes of patients.


Low-Fi VR Controller: Bringing 6DOF Interaction to Mobile VR (ID: PO1039)

Kristen Grinyer, Carleton University; Robert J Teather, Carleton University

As virtual reality (VR) becomes an everyday technology, it is important that it remains broadly accessible and affordable. Mobile VR is low-cost and accessible to everyone with a smartphone, but it does not support a 6 degrees of freedom (6DOF) input device nor an effective method to select objects. Our Low-Fi VR Controller is a paper-based 6DOF input device for mobile VR that supports selection via pointing and virtual hand. This demo allows users to test the controller using both techniques.


Demonstrating Mid-Air Ultrasound Haptics with Thermal Display (ID: PO1040)

Yatharth Singhal, University of Texas at Dallas; Haokun Wang, University of Texas at Dallas; Jin Ryong Kim, University of Texas at Dallas

We present a mid-air thermo-tactile display system utilizing ultrasound haptics. Our proof-of-concept design features an open-top chamber, heat modules, and an ultrasound haptic display. The system is demonstrated in four VR environments (campfire, water fountain, kitchen, and candle), showcasing the immersive user experiences achieved through the integration of thermal and tactile feedback.


RoboART: Artistic Robot Programming in Mixed Reality (ID: PO1042)

Felipe Fronchetti, Virginia Commonwealth University; Miles Popiela, Virginia Commonwealth University; Rodrigo O Spinola, Virginia Commonwealth University; Shawn Alan Brixey, Virginia Commonwealth University

To enable artists to incorporate robots in their projects, we propose an end-user-friendly robot programming solution using an intuitive spatial computing environment designed for the Microsoft HoloLens 2. In our application, the robot's movements are synchronized with a hologram via network communication. Using natural hand gestures, users can manipulate, animate, and record the hologram much as in 3D animation software, with the added advantages of mixed reality interaction.
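
The core mechanism here is pose synchronization between the hologram and the physical robot over the network. As a minimal, hypothetical sketch of that pattern (the message format, host, and port are illustrative assumptions, not RoboART's actual protocol), the hologram side might stream pose updates like this:

```python
# Hypothetical sketch: stream the manipulated hologram's pose to a robot
# controller over TCP as newline-delimited JSON. The endpoint and message
# schema are assumptions for illustration only.
import json
import socket

ROBOT_ADDR = ("192.168.0.42", 5005)  # placeholder robot controller endpoint

def send_pose(sock, position, rotation_quat):
    """Send one pose update: position in meters, rotation as a quaternion."""
    msg = json.dumps({"pos": position, "rot": rotation_quat}) + "\n"
    sock.sendall(msg.encode("utf-8"))

with socket.create_connection(ROBOT_ADDR) as sock:
    # e.g., called once per frame while the user drags the hologram
    send_pose(sock, [0.30, 0.10, 0.25], [0.0, 0.0, 0.0, 1.0])
```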


The Owl: Virtual Teleportation through XR (ID: PO1046)

Alvaro Villegas, Nokia; Ester Gonzalez-Sosa, Nokia; Pablo Perez, Nokia; Juan Torres, Nokia

The Owl is a prototype of our Virtual Teleportation system using XR and immersive video. It combines VR goggles, a reality capture device built with off-the-shelf components, and a back end running on a highly capable network. Participants feel present at a distant location and can interact with people there, as well as with other remote teleported visitors. Having field-tested it in many different scenarios, we believe it can be the foundation of the next generation of human communication applications.


Ambient Lighting for Improving Passengers' Acceptance of Driverless Buses: Combining The Evaluation in Virtual and Real Environment (ID: PO1050)

Satoru Ogino, National Institute of Advanced Industrial Science and Technology; Yanbin Wu, National Institute of Advanced Industrial Science and Technology; Toru Kumagai, National Institute of Advanced Industrial Science and Technology (AIST); Takahiro Miura, National Institute of Advanced Industrial Science and Technology (AIST); Masaki Masuda, National Institute of Advanced Industrial Science and Technology; Koya Takahashi, National Institute of Advanced Industrial Science and Technology; Sota Suzuki, National Institute of Advanced Industrial Science and Technology; Naohisa Hashimoto, National Institute of Advanced Industrial Science and Technology

To address traffic issues, the integration of autonomous vehicles, which operate without an onboard driver, is highly anticipated. However, the social acceptance of driverless vehicles is hindered by passengers' apprehension. Here we propose a system that uses ambient lighting. To validate its effectiveness, we are conducting experiments in both virtual reality (VR) and real space, ensuring a comprehensive evaluation.


Demonstrating Nebula: Dynamic Multi-Scents in 6DoF Virtual Reality (ID: PO1051)

Pierre-Philippe Elst, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023; Charles Javerliat, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023; Sophie Villenave, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023; Guillaume Lavoué, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023

Nebula is an open-source olfactory display compatible with autonomous HMDs and capable of diffusing scents at different diffusion rates. It relies on ultrasonic atomizers controlled via pulse-width modulated signals. Although designed to diffuse two odors simultaneously, it has only been evaluated and tested for the diffusion of a single odor so far. In this demonstration, we propose to showcase its capacity to diffuse two odors within a 6 DoF (degrees of freedom) immersive scenario.
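
As a rough illustration of the atomizer control the abstract describes, a driver might map each odor's target intensity to a PWM duty cycle, one channel per scent. The pin numbers, the gpiozero backend, and the linear distance-to-intensity falloff below are assumptions for illustration, not Nebula's actual open-source implementation:

```python
# Hypothetical sketch: drive two ultrasonic atomizer channels with PWM duty
# cycles proportional to the user's proximity to two virtual scent sources.
from gpiozero import PWMOutputDevice

atomizers = [PWMOutputDevice(18), PWMOutputDevice(19)]  # one channel per odor

def scent_intensity(distance_m, max_range_m=3.0):
    """Linear falloff: full diffusion at the source, none beyond max range."""
    return max(0.0, 1.0 - distance_m / max_range_m)

def update(user_pos, source_positions):
    for atomizer, src in zip(atomizers, source_positions):
        d = sum((u - s) ** 2 for u, s in zip(user_pos, src)) ** 0.5
        atomizer.value = scent_intensity(d)  # duty cycle in [0, 1]

# Called once per tracking frame with the HMD position and the two sources:
update((0.5, 1.6, 0.2), [(0.0, 1.0, 0.0), (2.0, 1.0, 1.0)])
```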


Demonstrating PLUME, a toolbox to Record, Replay, Analyze and Share 6DoF XR Experimental Data (ID: PO1052)

Charles Javerliat, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023; Sophie Villenave, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023; Pierre Raimbaud, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023; Guillaume Lavoué, Univ Lyon, Centrale Lyon, CNRS, INSA Lyon, UCBL, LIRIS, UMR5205, ENISE, F-42023

PLUME is an open-source software toolbox (PLUME Recorder, PLUME Viewer, PLUME Python) for recording, replaying, and analyzing behavioral and physiological data from 6DoF (degrees of freedom) XR experiments created with Unity. This work has been conditionally accepted for presentation at IEEE VR 2024 as an IEEE TVCG paper. The present demonstration aims to showcase PLUME's capabilities as an easy-to-use, performant tool for XR researchers and to engage the research community.


Multimodal Approach for the Diagnosis of Neurodegenerative Disorders Using Augmented Reality (ID: PO1053)

Daria Joanna Hemmerling, AGH University of Krakow; Paweł Jemioło, AGH University of Science and Technology; Mateusz Danioł, AGH University of Krakow; Jakub Kamiński, CuraTeX; Marek Wodziński, AGH University of Krakow; Magdalena Wójcik-Pędziwiatr, Department of Neurology, Andrzej Frycz Modrzewski Krakow University

This study introduces an innovative multimodal strategy for diagnosing neurodegenerative disorders by integrating Augmented/Mixed Reality. Our primary focus is harnessing AR goggles to capture Parkinson's Disease symptoms. Through meticulous sensor data collection and careful technical design, our system offers a transformative approach to patient care. Our work features an autonomous game-like experience: a 30-minute journey encompassing 17 varied tasks that seamlessly integrate different skills.


Immersive Behavioral Therapy for Phobia Treatment in Individuals with Intellectual Disabilities (ID: PO1054)

Carlos Cortés, Universidad Politécnica de Madrid; Marta Goyena, Universidad Politécnica de Madrid; Marta Orduna, Nokia Spain; Matteo Dal Magro, Universidad Politécnica de Madrid; Ainhoa Fernández-Alcaide, Fundación Juan XXIII; María Nava-Ruiz, Fundación Juan XXIII; Jesús Gutiérrez, Universidad Politécnica de Madrid; Pablo Perez, Nokia; Narciso García, Universidad Politécnica de Madrid

Adapting traditional behavioral therapy with eXtended Reality (XR) and physiological data allows phobia treatment for individuals with intellectual disabilities. Using systematic desensitization, our tool creates tailored virtual environments for stair-related phobias. Therapists guide the process, monitoring real-time data from the Head-Mounted Display and Empatica E4 wristband during the session. This approach promises a valid transition from classical to immersive therapies.


Coffee Masterclass: An Experience of Co-Creation with Prompt Engineering and Generative AI for Immersive Environments Development (ID: PO1055)

Alexander Rozo-Torres, Universidad Militar Nueva Granada; Wilson J. Sarmiento, Universidad Militar Nueva Granada

This work presents the design and development process of an immersive experience applying a co-creation approach between humans and generative artificial intelligence tools. Coffee Masterclass is an immersive experience that introduces anyone to the art and pleasure of preparing specialty coffees. This work details the approach, including how the generative artificial intelligence tools were used in each stage of immersive experience development.


Touch My Heart: Navigating the Heart Models in MR with Haptic Feedback, 3D Sound, and Interactive Gamification (ID: PO1056)

Olha Terendii, SoftServe; Sam Frish, SoftServe; Tymur Prokopiev, SoftServe; Daria Joanna Hemmerling, SoftServe; Mateusz Janusz, SoftServe; Tomasz Jadczyk, Division of Cardiology and Structural Heart Diseases, SUM

Touch My Heart utilizes a mixed reality headset and digital twin technology for an immersive exploration of the heart. The system, integrating hand tracking and haptic feedback, offers a realistic simulation, enhanced by spatial audio for a fusion of senses. Users can interact with a 3D heart model, gaining a comprehensive understanding of structure, function, and abnormalities including cardiac arrhythmias and valvular heart diseases.


MeshReduce: Split Rendering of Live 3D Scene for Virtual Teleportation (ID: PO1058)

Tao Jin, Carnegie Mellon University; Edward Lu, Carnegie Mellon University; Mallesham Dasari, Northeastern University; Kittipat Apicharttrisorn, Nokia Bell Labs; Srinivasan Seshan, Carnegie Mellon University; Anthony Rowe, Carnegie Mellon University

This demo introduces MeshReduce, an innovative approach that integrates a distributed 3D scene capture technique with a split rendering framework. Our demo shows a cross-platform, live 3D telepresence system that can be viewed in standard web browsers. Our setup consists of multiple depth sensors capturing users and the background in real time. MeshReduce uniquely allows for real-time rendering of remotely captured 3D scenes, seamlessly merging them with content processed on the user's device.


Best Research Demo Honorable Mention

GPT-VR Nexus: ChatGPT-Powered Immersive Virtual Reality Experience (ID: PO1060)

Jiangong Chen, Pennsylvania State University; Tian Lan, George Washington University; Bin Li, Pennsylvania State University

The fusion of generative Artificial Intelligence (AI) like ChatGPT and Virtual Reality (VR) can unlock new interaction capabilities through natural language. We introduce GPT-VR Nexus, a novel framework creating a truly immersive VR experience driven by an underlying generative AI engine. It employs a two-step prompt strategy and robust post-processing procedures, without fine-tuning the complex AI model. Our experimental results show quick responses to various user audio requests/inputs.
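
The abstract mentions a two-step prompt strategy with post-processing rather than model fine-tuning. Below is a minimal sketch of that general pattern, assuming the OpenAI Python client; the prompts, intent vocabulary, JSON schema, and model name are illustrative assumptions, not the GPT-VR Nexus implementation:

```python
# Sketch of a two-step prompt pipeline: step 1 classifies the user's spoken
# request into a small intent vocabulary; step 2 expands that intent into
# structured parameters that post-processing validates before the VR scene
# applies them.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask(system, user):
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

def handle_request(transcript):
    # Step 1: constrain the open-ended utterance to a small intent vocabulary.
    intent = ask("Answer with one word: CREATE, MODIFY, or DELETE.",
                 transcript).strip()
    # Step 2: ask only for the parameters that intent needs, as strict JSON.
    raw = ask('Return only JSON like {"object": "name", "position": [0, 0, 0]} '
              f"describing this {intent} request.", transcript)
    params = json.loads(raw)  # post-processing: reject malformed model output
    assert len(params["position"]) == 3, "position must be a 3D coordinate"
    return intent, params

print(handle_request("put a small campfire two meters in front of me"))
```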


Design and Implementation of a Mixed Reality Human-Machine Interface for 3D Printer Real-Time Monitoring and Management (ID: PO1063)

Patrick Shane McKelvey II, University of Missouri; Fang Wang, University of Missouri; Jung Hyup Kim, University of Missouri; Yao Yao, University of Missouri- Columbia; Yi Wang, University of Missouri

In this research, a Mixed Reality enabled human-machine interface is designed for 3D printer monitoring and management. Compared to traditional interfaces, we introduce a task-oriented interface to simplify 3D printer operation and minimize potential human errors. We also integrate various sensors to enable real-time monitoring of printer statuses. This methodology can be readily expanded to other manufacturing machines to revolutionize human-machine interaction in the industrial domain.


Gamified Alzheimer's Disease Diagnosis via Virtual Reality (ID: PO1066)

Botao Xiong, University of Science and Technology of China; Nan Li, Anhui Medical University; Yong Liao, University of Science and Technology of China; Pengyuan Zhou, University of Science and Technology of China

VR has proven to be a promising tool for cognitive assessment thanks to its immersive experience. In this regard, we propose a VR Alzheimer's disease (AD) diagnosis method that allows patients to engage in real-life shopping tasks. We design sub-tasks tailored to the target user group to evaluate their cognitive status based on the collected metrics. Our method helps assess the status of AD, aiming to provide physicians with a feasible and effective reference for diagnosing recovery status.


Visual Guidance for Infant Lumbar Puncture Training in XR (ID: PO1070)

Pranav Sukumar, Columbia University; Siddharth Ananth, Columbia University; Ryan Ethan Friberg, Columbia University; Rohit V Gopalakrishnan, Columbia University; David Kessler, Columbia University Irving Medical Center; Robert Maniker, Columbia University Irving Medical Center; Bettina Schlager, Columbia University; Steven Feiner, Columbia University

Lumbar puncture is a necessary staple of pediatric diagnostic medicine. We developed VirtuaLP, an immersive training system for infant lumbar puncture that runs on a standalone extended reality headset, allowing for a more accessible and flexible training experience. We provide a suite of visual hints to guide the student through the procedure. Additionally, VirtuaLP provides feedback on each component of insertion accuracy, including ideal position, depth, and angle.
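
The accuracy feedback the abstract names (position, depth, and angle) can be pictured as simple geometry on the tracked needle pose. The sketch below, with illustrative numbers and no claim to match VirtuaLP's internals, computes lateral offset from an ideal trajectory, depth error along it, and angular deviation:

```python
# Hypothetical sketch of insertion-accuracy metrics computed from a tracked
# needle tip and direction versus an instructor-defined ideal trajectory.
import numpy as np

def insertion_errors(tip, direction, ideal_entry, ideal_dir, ideal_depth):
    """Compare a tracked needle pose against an ideal insertion trajectory."""
    direction = direction / np.linalg.norm(direction)
    ideal_dir = ideal_dir / np.linalg.norm(ideal_dir)
    v = tip - ideal_entry
    depth = float(v @ ideal_dir)                        # progress along ideal axis
    lateral_err = float(np.linalg.norm(v - depth * ideal_dir))  # offset from axis
    depth_err = depth - ideal_depth                     # over/under-insertion
    angle_err = np.degrees(np.arccos(np.clip(direction @ ideal_dir, -1.0, 1.0)))
    return lateral_err, depth_err, float(angle_err)

lat, dep, ang = insertion_errors(
    tip=np.array([0.012, 0.002, 0.025]),
    direction=np.array([0.0, 0.1, 1.0]),
    ideal_entry=np.array([0.010, 0.0, 0.0]),
    ideal_dir=np.array([0.0, 0.0, 1.0]),
    ideal_depth=0.030,  # 30 mm target depth (illustrative)
)
print(f"lateral {lat*1000:.1f} mm, depth {dep*1000:+.1f} mm, angle {ang:.1f} deg")
```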


An Immersive Multi-User VR-based System for the Training of Electrical Substation Maintenance (ID: PO1071)

Jair Neto, Universidade Federal de Uberlândia; Lucas Pinheiro Moraes, Universidade Federal de Uberlândia; Bruno de Melo, Universidade Federal de Uberlândia; Gerson Flávio Mendes de Lima, Universidade Federal de Uberlândia; Alexandre Cardoso, Universidade Federal de Uberlândia; Edgard Afonso Lamounier Jr., Universidade Federal de Uberlândia; Jader Oliveira, Eletronorte; Davidson Campos, Eletronorte; Luis dos Santos, Eletronorte

This work presents a multi-user platform to support electrical engineers in substation training. Based on Virtual Reality techniques, the system allows two or more geographically separated engineers to work within a shared environment. The system has been tested with real electrical engineers, and the results show that it has the potential to minimize effort during training sessions. Finally, cost reduction for training is achievable.


Simulation of an Aware Geriatric Virtual Human in Mixed Reality (ID: PO1072)

Asif Ahmmed, New Jersey Institute of Technology; Erica Butts, New Jersey Institute of Technology; Kimia Naeiji, New Jersey Institute of Technology; Ladda Thiamwong, University of Central Florida; Salam Daher, New Jersey Institute of Technology

We present an Augmented Reality (AR) experience, enabling user interaction with a Virtual Human (VH) of an older adult. We demonstrate the feasibility of the technology to foster communication and social connection between caregivers (users) and older adults (the VH). We developed a 3D model of an embodied virtual geriatric patient that demonstrates awareness of its environment and conversations, and implemented a networking protocol for remote response control with a human in the loop.


DriVR: Extending Driver Training for Persons with Disabilities (ID: PO1073)

Adam Jones, Mississippi State University; Woody Watson, Mississippi State University; Timothy Stewart, Mississippi State University; Jacob M Brewington, Mississippi Department of Rehabilitation Services; Connor Chrismond, Mississippi State University; Lalitha Dabbiru, Mississippi State University; Zaccheus Ahonle, Mississippi State University; Emily Wall, Mississippi State University; Kris Geroux, Mississippi State University; Kasee Stratton-Gadke, Mississippi State University

DriVR is a prototype driver training tool for at-home use by persons with disabilities. Built from off-the-shelf components, DriVR strives to minimize barriers to entry for people in need of training with assistive technology for driving. In this research demonstration, we introduce our prototype and outline some of its key technical features.


BotanicAR: a cooperative experience in Augmented Reality (ID: PO1074)

Paola Barra, University of Naples Parthenope; Andrea Antonio Cantone, University of Salerno; Rita Francese, University of Salerno; Marco Giammetti, University of Salerno; Raffaele Sais, University of Salerno; Otino Pio Santosuosso, University of Salerno; Aurelio Sepe, University of Salerno; Simone Spera, University of Salerno; Genoveffa Tortora, Università di Salerno; Giuliana Vitiello, Università di Salerno

BotanicAR is a collaborative AR game leveraging the capabilities of the Meta Quest 3. It offers a multiplayer experience in which 2-4 users collaboratively fight against a virtual plant within their real environment. The game encourages natural interaction without needing controllers, utilizing hand gestures for enhanced immersion. Real-virtual object occlusion further integrates game elements seamlessly into reality. BotanicAR also demonstrates potential applications in collaborative activities.


rlty2rlty: Transitioning Between Realities with Generative Artificial Intelligence (ID: PO1075)

Matt Gottsacker, University of Central Florida; Gerd Bruder, University of Central Florida; Greg Welch, University of Central Florida

We present a system for visually transitioning a mixed reality (MR) user between two arbitrary realities (e.g., between two virtual worlds or between the real environment and a virtual world). The system uses artificial intelligence (AI) to generate a 360° video that transforms the user's starting environment into another environment, passing through a liminal space that could help them relax between tasks or prepare them for the ending environment. The video can then be viewed on an MR headset.


Multi-Projector Dynamic Spatially Augmented Reality on Deformable Surfaces (ID: PO1079)

Muhammad Twaha Ibrahim, UC Irvine; Aditi Majumder, UC Irvine

We will demonstrate the first multi-projector spatially augmented reality system for non-rigid surfaces such as deformable fabrics and the human body. Using two projectors and two RGB-D cameras, our system adapts the projection from both projectors in real time based on the surface shape to ensure a seamless, registered display. We will demonstrate two applications of this system: projection on a large deformable fabric, and a surgical guidance system for cleft palate surgery.
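
The key per-frame step in such a system is re-registering projector pixels to the deformed surface. As a minimal sketch, assuming a calibrated pinhole projector model (the intrinsics and extrinsics below are placeholders, not the authors' calibration), each captured mesh vertex can be projected into a projector's image to update its texture lookup every frame:

```python
# Sketch: map deformed surface vertices (from an RGB-D capture) into one
# projector's image using pinhole intrinsics K and extrinsics R, t.
import numpy as np

def project_vertices(vertices, K, R, t):
    """vertices: (N, 3) world-space points -> (N, 2) projector pixel coords."""
    cam = vertices @ R.T + t          # world -> projector coordinates
    pix = cam @ K.T                   # apply pinhole intrinsics
    return pix[:, :2] / pix[:, 2:3]   # perspective divide

K = np.array([[1400.0, 0.0, 960.0],   # placeholder projector intrinsics
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 2.0])  # projector 2 m from the surface

verts = np.array([[0.0, 0.0, 0.0], [0.1, 0.05, 0.02]])  # deformed mesh sample
print(project_vertices(verts, K, R, t))  # updated per-frame texture lookups
```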


MUN: an AI-powered multiplayer networking solution for VR games (ID: PO1080)

Paweł Babiuch, EpicVR sp. z o.o.; Adrian Łapczyński, EpicVR sp. z o.o.; Hubert Jegierski, EpicVR sp. z o.o.; Maciej Jegierski, EpicVR sp. z o.o.; Barbara Kolber-Bugajska, EpicVR sp. z o.o.; Rafał Salamon, EpicVR sp. z o.o.; Mirosław Płaza, Kielce University of Technology; Stanislaw Deniziak, Kielce University of Technology; Paweł Pięta, Kielce University of Technology; Grzegorz Lukawski, Kielce University of Technology; Artur Jasinski, Kielce University of Technology; Jacek Opałka, Kielce University of Technology; Alicja Marmon, EpicVR sp. z o.o.; Kamil Kwiatkowski, EpicVR sp. z o.o.; Artur Cybulski, EpicVR sp. z o.o.; Magdalena Igras-Cybulska, AGH UST; Paweł Węgrzyn, Jagiellonian University

We present a new toolset named MUN (Metaverse Unity Networking), based on AI algorithms, for creating multiplayer VR environments. It includes a Unity asset enabling the creation and configuration of multiplayer mode in a VR setup, supplemented with 1) a recommendation engine for optimal programming methods, with error detection in application source code, and 2) a user behavior analytics system for multiplayer mode that enhances realism in avatar and NPC actions using a motion atlas and motion recognition.


Demo: Volumetric Motion Annotation and Visualization for Immersive Sports Coaching (ID: PO1081)

Jiqing Wen, Arizona State University; Qixuan Shi, Arizona State University; Lauren Gold, Arizona State University; Qianyu Ma, Arizona State University; Robert LiKamWa, Arizona State University

The demand for remote sports coaching is on the rise due to location and scheduling constraints. Traditional remote coaching relies on 2D video formats, which limit spatial information and interactive engagement. We introduce our technical demonstration of Augmented Coach, an immersive remote sports training tool that leverages VR to enhance the coaching experience. Our demo uses Azure Kinect and Meta Quest to host an athlete-coach interaction over volumetric captures of athletic performance.


Hybrid XRSpectator: A Hybrid Tabletop XR Sports Spectating Experience (ID: PO1091)

Wei Hong Lo, University of Otago; Shazia Gul, University of Otago; Tobias Langlotz, University of Otago; Holger Regenbrecht, University of Otago; Stefanie Zollmann, University of Otago

In this demonstration, we present Hybrid XRSpectator, a hybrid combination of Augmented Reality (AR) and Indirect AR for an immersive home-based stadium experience. Sports fans who cannot be physically present at a game can use the application to engage with game information via a Tabletop Stadium AR at home, then seamlessly transition into an indirect AR mode in which they are surrounded by a 360-degree video recording of the stadium experience, complete with integrated information and replays.


GenDeck: Towards a HoloDeck with Text-to-3D Model Generation (ID: PO1092)

Manuel Weid, Coburg University; Navid Khezrian, Coburg University of Applied Sciences and Arts; Aparna Pindali Mana, Coburg University of Applied Sciences and Arts; Forouzan Farzinnejad, Coburg University of Applied Sciences and Arts; Jens Grubert, Coburg University

Generative Artificial Intelligence has the potential to substantially transform the way 3D content for Extended Reality applications is produced. Specifically, the development of text-to-3D and image-to-3D generators with increasing visual fidelity and decreasing computational costs is progressing rapidly. Within this work, we present GenDeck, a proof-of-concept application for experiencing text-to-3D model generation inside an immersive Virtual Reality environment.


Real-Time Virtual Human for Promoting Clinical Trial Education and Recruitment (ID: PO1096)

Rashi Ghosh, University of Florida; Andrew H Maxim, University of Florida; Christopher You, University of Florida; Benjamin Lok, University of Florida

Virtual human technology in healthcare is popular for its cost-effectiveness and accessibility. However, existing applications often lack real-time capabilities. Our work introduces a virtual human desktop VR architecture that enables real-time interactions. By demonstrating dynamic responses to user input, it enhances clinical trial recruitment, providing up-to-date information that can be tailored to diverse populations.


Integrating Cognitive Behavioral Therapy and Heart Rate Variability Biofeedback in Virtual Reality, Augmented Reality, and Mixed Reality for Stress Reduction (ID: PO1098)

Nishu Nath, Montana State University; Laura Stanley, Montana State University; Karl Molina, Montana State University; Angelica Perez, Prisma Health; Jace Zavarelli, Montana State University; Apostolos Kalatzis, Cleveland State University; Camille Lundberg, Prisma Health; Alain Litwin, Prisma Health

We introduce an innovative approach to reducing cognitive stress by integrating two non-pharmacological therapeutic strategies, Cognitive Behavioral Therapy (CBT) videos and Heart Rate Variability Biofeedback (HRV-BF), within Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR). The research demo presents a virtually created mental health therapist situated within a virtual therapy office who will deliver evidence-based CBT, which has been found to be efficacious.
