Image of the Belém Tower in Lisbon with a view of the Tagus River, alongside the logo for IEEE VR Lisbon 2021 with the motto: make virtual reality diverse and accessible.
Research Demos
Boarding Sensation Presentation of the Biped Walking Robot with a Low-cost Two-axis Motion Platform
Virtual Equipment System: Face Mask and Voodoo Doll for User Privacy and Self-Expression Options in Virtual Reality
Demonstrating High-Precision and High-Fidelity Digital Inking for Virtual Reality
Virtual Reality for Remote Controlled Robotics in Engineering Education
Development of a Virtual Reality Assessment of Visuospatial Function and Oculomotor Control
A Real-Time Approach to Improve the Drilling Decision-Making Process Using Virtual Reality Visualizations
Shared Augmented Reality Experience Between a Microsoft Flight Simulator User and a User in the Real World
Real-time Mixed Reality Teleconsultation for Intensive Care Units in Pandemic Situations
Turning a Messy Room into a Fully Immersive VR Playground
Demonstrating Rapid Touch Interaction in Virtual Reality through Wearable Touch Sensing
Virtual Control Interface: Discover and Control IoT Devices Intuitively through AR Glasses with Multi-modal Interactions
Revealable Volume Displays: 3D Exploration of Mixed-Reality Public Exhibitions
Magnoramas
Visualizing Planetary Spectroscopy through Immersive On-site Rendering

Best of IEEE VR 2021

Please use this form to vote for the best poster, best demo, and best 3DUI contest submission.


Boarding Sensation Presentation of the Biped Walking Robot with a Low-cost Two-axis Motion Platform

Kyosuke Mori: Hiroshima City University; Wataru Wakita: Hiroshima City University

Booth: C23 - Expo Hall B

Teaser Video: Watch Now

We render the boarding sensation of a biped robot at low cost and with high immersion by approximating the 6-DOF motion experienced when boarding a biped robot, such as impacts, vibration, and steep slopes, with a 2-DOF rolling motion of up to ±25 degrees in both the pitch and roll directions on our low-cost two-axis motion platform.
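The mapping from the robot's 6-DOF motion to the platform's two tilt axes is the core of the approach. Below is a minimal sketch of one plausible mapping (classic tilt coordination clamped to the stated ±25° limits); the function names and the washout-free simplification are our own assumptions, not the authors' published implementation.

```python
import math

MAX_TILT_DEG = 25.0  # the platform's stated limit in both pitch and roll

def approximate_6dof_to_2dof(pitch_deg, roll_deg, surge_accel, sway_accel, g=9.81):
    """Map a desired 6-DOF motion cue onto 2-DOF tilt commands (illustrative).

    Sustained forward/sideways accelerations are rendered as gravity-aligned
    tilt (tilt coordination), added to the robot's own pitch/roll, then
    clamped to the platform's mechanical range.
    """
    clamp_unit = lambda x: max(-1.0, min(1.0, x))
    tilt_from_surge = math.degrees(math.asin(clamp_unit(surge_accel / g)))
    tilt_from_sway = math.degrees(math.asin(clamp_unit(sway_accel / g)))

    cmd_pitch = pitch_deg + tilt_from_surge
    cmd_roll = roll_deg + tilt_from_sway

    # Clamp to the +/-25 degree limits of the two-axis platform.
    clamp = lambda x: max(-MAX_TILT_DEG, min(MAX_TILT_DEG, x))
    return clamp(cmd_pitch), clamp(cmd_roll)

# Example: a 10-degree forward lean combined with a 2 m/s^2 surge impact cue.
print(approximate_6dof_to_2dof(10.0, 0.0, 2.0, 0.0))
```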

Virtual Equipment System: Face Mask and Voodoo Doll for User Privacy and Self-Expression Options in Virtual Reality

Powen Yao: University of Southern California; Vangelis Lympouridis: University of Southern California; Michael Zyda: University of Southern California

Booth: C26 - Expo Hall A

Teaser Video: Watch Now

Current trends in immersive technologies suggest an increase in the capture of user data to drive interactions and avatar representations. With a growing number of data types being collected, users need an easy way to view and control their privacy settings. In this demo, we present a method for users to adjust options related to privacy settings, user data collection, and self-expression through 3D user interface metaphors such as a face mask and a voodoo doll.

Demonstrating High-Precision and High-Fidelity Digital Inking for Virtual Reality

Hugo Romat: ETH Zürich; Andreas Rene Fender: ETH Zürich; Manuel Meier: ETH Zürich; Christian Holz: ETH Zürich

Booth: C28 - Expo Hall B

Digital pen interaction has become a first-class input modality for precision tasks such as writing, annotating, and drawing. In Virtual Reality, however, input is largely detected using cameras, which do not nearly reach the fidelity we achieve with analog handwriting. In this demo, we present Flashpen, a digital pen for VR whose sensing principle affords accurately digitizing handwriting.

Virtual Reality for Remote Controlled Robotics in Engineering Education

Andrew Rukangu: University of Georgia; Alexander James Tuttle: University of Georgia; Kyle Johnsen: University of Georgia

Booth: C23 - Expo Hall A

There is high demand for high-end lab equipment in engineering education, especially for courses that require practical hands-on lab exercises. However, this equipment is quite expensive, which forces some institutions to seek alternatives or forgo it altogether. In this work, we use virtual and augmented reality to build and test a remote UR-10-based robotics lab that allows students to work together on a hands-on, robotics-based lab.

Development of a Virtual Reality Assessment of Visuospatial Function and Oculomotor Control

Garima Adlakha: University of Southern California; Sanya Singh: University of Southern California; Kranthi Nuthalapati: University of Southern California; Apoorva Aravind Patil: University of Southern California; Prajakta Khandve: University of Southern California; Pushpak Bhattacharyya: University of Southern California; Saravanan Manoharan: University of Southern California; Sanjay Mallasamudram Santhanam: University of Southern California; Isaiah J Lachica: University of Southern California; James M. Finley: University of Southern California; Vangelis Lympouridis: University of Southern California

Booth: C22 - Expo Hall A

Teaser Video: Watch Now

This demo uses Virtual Reality (VR) to assess cognitive function in people with Parkinson's disease. We developed a VR-based assessment that combines simple game mechanics with components of the Trail Making Test. We collect performance metrics and gaze analytics during gameplay using the HTC Vive Pro Eye system. Ultimately, this data will allow clinicians and researchers to characterize cognitive and visuomotor deficits in people with neurological impairments such as Parkinson's disease.
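The abstract mentions performance metrics and gaze analytics collected during gameplay. As a hedged illustration of the kind of per-target measures a Trail-Making-style VR task can log (the demo's actual pipeline is not published here), the sketch below accumulates gaze dwell time and selection times from a hypothetical event stream:

```python
# Illustrative only: event names and structure are assumptions, not the
# authors' data format. Each event is (timestamp_s, kind, target_id), where
# kind is 'gaze_enter', 'gaze_exit', or 'selected'.

def score_trial(events):
    """Return per-target gaze dwell times and selection timestamps."""
    dwell = {}        # target_id -> accumulated gaze dwell time (s)
    completion = {}   # target_id -> time the target was selected (s)
    gaze_start = {}   # target_id -> time the gaze entered the target
    for t, kind, target in events:
        if kind == "gaze_enter":
            gaze_start[target] = t
        elif kind == "gaze_exit" and target in gaze_start:
            dwell[target] = dwell.get(target, 0.0) + (t - gaze_start.pop(target))
        elif kind == "selected":
            completion[target] = t
    return dwell, completion

# Example: the user looks at target 1 for 0.4 s, then selects it.
events = [(0.0, "gaze_enter", 1), (0.4, "gaze_exit", 1), (0.5, "selected", 1)]
print(score_trial(events))
```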

A Real-Time Approach to Improve the Drilling Decision-Making Process Using Virtual Reality Visualizations

Thiago Malheiros Porcino: SENAI ISI SVP - Firjan; Márcia M. Dórea: SENAI ISI SVP - Firjan; Diego Barboza: SENAI ISI SVP - Firjan; Wesley Oliveira: SENAI ISI SVP - Firjan; Eric Romani: SENAI ISI SVP - Firjan; Fernando Perin Munerato: Repsol Sinopec Brazil; João H. Batista: Repsol Sinopec Brazil

Booth: C27 - Expo Hall A

Teaser Video: Watch Now

Virtual reality (VR) is one of the key Industry 4.0 trends and is widely used for training and simulation. A VR environment can reduce training and drilling-analysis costs and help operators and coordinators monitor the trajectory and other operational variables during the drilling process. This paper presents Divisor, a virtual reality tool for monitoring variables and analyzing historical and real-time data while drilling a new oil well.

Shared Augmented Reality Experience Between a Microsoft Flight Simulator User and a User in the Real World

Christoph Leuze: Nakamir Inc; Matthias Leuze: Alpinschule Innsbruck

Booth: C21 - Expo Hall A

Teaser Video: Watch Now

Our demo consists of an application that allows a user with an AR display (smartphone or HoloLens 2) to watch another user flying an airplane in Microsoft Flight Simulator 2020 (MSFS) at the corresponding location in the real world. To do this, we take the location of the plane in MSFS and stream it via a server to a mobile AR device. The mobile device user can then see the same 3D plane model move at exactly the real-world location that corresponds to the plane's virtual MSFS location.
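At its core this is a simple streaming architecture: the simulator's geo-pose is relayed through a server to the AR client, which anchors the 3D model at the same real-world coordinates. The sketch below shows one plausible sender using JSON over UDP; the message format and relay address are hypothetical, not the demo's actual protocol (the real system would read the pose from MSFS, e.g. via SimConnect).

```python
import json
import socket
import time

RELAY_ADDR = ("127.0.0.1", 9000)  # placeholder; the real relay is not published

def send_plane_pose(sock, lat, lon, alt_m, heading_deg, pitch_deg, bank_deg):
    """Serialize the simulator plane's geo-pose and push it to the relay.

    The AR client resolves the same lat/lon/alt into a world anchor so the
    3D plane model appears at the corresponding real-world location.
    """
    msg = {
        "t": time.time(),           # timestamp for interpolation on the client
        "lat": lat, "lon": lon,     # WGS84 position from the simulator
        "alt": alt_m,               # altitude in meters
        "hdg": heading_deg, "pitch": pitch_deg, "bank": bank_deg,
    }
    sock.sendto(json.dumps(msg).encode("utf-8"), RELAY_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Example pose (Innsbruck area); in the real system this comes from MSFS.
send_plane_pose(sock, lat=47.2692, lon=11.4041, alt_m=1200.0,
                heading_deg=90.0, pitch_deg=2.0, bank_deg=0.0)
```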

Real-time Mixed Reality Teleconsultation for Intensive Care Units in Pandemic Situations

Daniel Roth: Computer Aided Medical Procedures and Augmented Reality; Kevin Yu: Research Group MITI; Frieder Pankratz: LMU; Gleb Gorbachev: Computer Aided Medical Procedures and Augmented Reality; Andreas Keller: Computer Aided Medical Procedures and Augmented Reality; Marc Lazarovici: Institut für Notfallmedizin; Dirk Wilhelm: Research Group MITI; Simon Weidert: Orthopedic Trauma Surgery, Ludwig-Maximilian University; Nassir Navab: Computer Aided Medical Procedures and Augmented Reality; Ulrich Eck: Computer Aided Medical Procedures and Augmented Reality

Booth: C27 - Expo Hall B

This demo depicts a patient visit at a COVID-19 ICU station. Through our system, remote experts can join a COVID-19 ICU patient visit without physically moving through the hospital, which avoids gatherings and personnel traffic and optimizes resources.

Turning a Messy Room into a Fully Immersive VR Playground

Naoki Matsuo: Kwansei Gakuin University; Masataka Imura: Kwansei Gakuin University

Booth: C22 - Expo Hall B

Teaser Video: Watch Now

In this study, to enable a VR experience with an HMD even in a space with obstacles, we construct a reality-based VR space in real time that does not break the immersive worldview. In addition, we aim to build a VR space that is easier to recognize by using a deep learning network to classify real objects into "objects that form the boundaries of the space" and "ordinary obstacles," and superimposing a virtual object corresponding to each type of real object.
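A hedged sketch of the superimposition step follows, assuming the deep network has already labeled each detected real object as a spatial boundary or an ordinary obstacle; the class names and proxy choices are illustrative, not the authors' actual catalogue.

```python
from dataclasses import dataclass

# Illustrative sketch, not the authors' implementation: we assume each real
# object was detected and classified by the network into "boundary" (walls,
# large furniture delimiting the space) or "obstacle" (loose items).

@dataclass
class RealObject:
    label: str     # "boundary" or "obstacle"
    bounds: tuple  # axis-aligned box: (x, y, z, width, height, depth)

def choose_proxy(obj: RealObject) -> str:
    """Pick a virtual proxy that preserves the VR worldview.

    Boundary objects become impassable scenery so users read them as the
    edge of the play space; ordinary obstacles become props to step around.
    """
    return "rock_wall" if obj.label == "boundary" else "wooden_crate"

room = [RealObject("boundary", (0.0, 0.0, 0.0, 4.0, 2.5, 0.2)),
        RealObject("obstacle", (1.2, 0.0, 1.5, 0.5, 0.4, 0.5))]
for obj in room:
    print(choose_proxy(obj), "at", obj.bounds)
```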

Demonstrating Rapid Touch Interaction in Virtual Reality through Wearable Touch Sensing

Manuel Meier: ETH Zürich; Paul Streli: ETH Zürich; Andreas Rene Fender: ETH Zürich; Christian Holz: ETH Zürich

Booth: C26 - Expo Hall B

Teaser Video: Watch Now

We bring rapid touch interaction to Virtual Reality, illustrating the beneficial use of quick tapping, typing, and surface gestures. The productivity scenarios this enables are reminiscent of apps on today's tablets. We use a wrist-worn prototype to complement the optical hand tracking of VR headsets with inertial sensing to detect touch events on surfaces. Our demonstration comprises UI control in word processors, web browsers, and document editors.
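The key idea is that a surface tap produces a sharp acceleration transient at the wrist, which inertial sensing can detect with much lower latency than camera-based tracking; the optical hand tracking then supplies where on the surface the touch landed. The sketch below is a minimal threshold-based tap detector with assumed threshold and refractory values, not the authors' actual classifier.

```python
import math

TAP_THRESHOLD = 3.0 * 9.81  # assumed transient magnitude (m/s^2), illustrative
REFRACTORY_S = 0.05         # ignore ringing right after a detected tap

def detect_taps(samples, rate_hz=1000):
    """Return timestamps (s) of tap events in a stream of (ax, ay, az)."""
    taps, last_tap = [], -1.0
    for i, (ax, ay, az) in enumerate(samples):
        t = i / rate_hz
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # A tap is a short, high-magnitude spike; the refractory window
        # prevents one physical tap from registering multiple times.
        if magnitude > TAP_THRESHOLD and t - last_tap > REFRACTORY_S:
            taps.append(t)
            last_tap = t
    return taps

# Example: quiet signal with a single spike at sample 100 (t = 0.1 s).
stream = [(0.0, 0.0, 9.81)] * 100 + [(0.0, 0.0, 50.0)] + [(0.0, 0.0, 9.81)] * 100
print(detect_taps(stream))
```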

Virtual Control Interface: Discover and Control IoT Devices Intuitively through AR Glasses with Multi-modal Interactions

Zezhen Xu: University of Southern California; Vangelis Lympouridis: University of Southern California

Booth: C25 - Expo Hall A

Teaser Video: Watch Now

The number of smart home devices is growing rapidly. Current Internet of Things (IoT) control interfaces on smartphones are spatially separated from the devices they operate, making them less intuitive and progressively more complicated. We developed VCI, a Virtual Reality (VR) simulation that lets HCI researchers explore multimodal interactions with IoT devices in a future smart home setting, using virtual control interfaces projected on emulated AR glasses.

Revealable Volume Displays: 3D Exploration of Mixed-Reality Public Exhibitions

Fatma Ben Guefrech: Université de Lille; Florent Berthaut: Université de Lille; Patricia Plénacoste: Université de Lille; Yvan Peter: Université Lille 1; Laurent Grisoni: University of Lille

Booth: C24 - Expo Hall B

Teaser Video: Watch Now

We present Revealable Volume Displays (RVDs), a class of mixed-reality displays that allow for the 3D exploration of content in public exhibitions. RVDs let visitors reveal information placed freely inside or around protected artefacts, visible to all, using their reflection in the panel. We first discuss the implementation of RVDs, providing both projector-based and mobile versions. We then present a design space that describes the interaction possibilities they offer. Drawing on insights from a field study during a first exhibition, we finally propose and evaluate techniques for facilitating 3D exploration with RVDs.

Magnoramas

Kevin Yu: Research Group MITI; Alexander Winkler: Technical University of Munich; Frieder Pankratz: LMU; Marc Lazarovici: Institut für Notfallmedizin; Dirk Wilhelm: Research Group MITI; Ulrich Eck: Computer Aided Medical Procedures and Augmented Reality; Daniel Roth: Computer Aided Medical Procedures and Augmented Reality; Nassir Navab: Technische Universität München

Booth: C21 - Expo Hall B

Teaser Video: Watch Now

We introduce Magnoramas, an interaction method for creating supernaturally precise annotations on virtual objects. We evaluated Magnoramas in a collaborative teleconsultation setting within a simplified clinical scenario: a remote expert, embodied as an avatar in Virtual Reality inside a 3D reconstruction of the scene, collaborated with a local user through Augmented Reality. The results show that Magnoramas significantly improve the precision of annotations while preserving usability and perceived-presence measures compared to the baseline method. By additionally hiding the physical world while keeping the Magnorama, users can intentionally lower their perceived social presence and focus on their tasks.

Visualizing Planetary Spectroscopy through Immersive On-site Rendering

Lauren Gold: Arizona State University; Alireza Bahremand: Arizona State University; Connor Richards: Arizona State University; Justin Hertzberg: Arizona State University; Kyle Sese: Arizona State University; Alexander A Gonzalez: Hamilton High School; Zoe Purcell: Arizona State University; Kathryn E Powell: Northern Arizona University; Robert LiKamWa: Arizona State University

Booth: C24 - Expo Hall A

Teaser Video: Watch Now

Planetary Visor is our virtual reality tool to visualize orbital and rover-based datasets from the ongoing traverse of the NASA Curiosity rover in Gale Crater. Data from orbital spectrometers provide insight about the composition of planetary terrains. Meanwhile, Curiosity rover data provide fine-scaled localized information about Martian geology. By visualizing the intersection of the orbiting instrument's field of view with the rover-scale topography, and providing interactive navigation controls, Visor constitutes a platform for users to intuitively understand the scale and context of the Martian geologic data under scientific investigation.
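The central geometric operation, intersecting the orbital instrument's field of view with rover-scale topography, can be illustrated with a simple cone-against-heightfield test. The sketch below is our own simplified geometry, not Visor's actual rendering pipeline; the field of view is modeled as a cone from the orbiter toward a surface target.

```python
import math

# Illustrative geometry only: marks which cells of a terrain heightfield
# fall inside an orbital spectrometer's footprint.

def footprint_mask(heights, cell_size, orbiter_pos, target, half_angle_deg):
    """Return the set of (i, j) terrain cells inside the instrument's FOV cone."""
    ox, oy, oz = orbiter_pos
    tx, ty, tz = target
    # Cone axis: unit vector from the orbiter toward the target point.
    axis = (tx - ox, ty - oy, tz - oz)
    axis_len = math.sqrt(sum(c * c for c in axis))
    axis = tuple(c / axis_len for c in axis)
    cos_half = math.cos(math.radians(half_angle_deg))

    inside = set()
    for i, row in enumerate(heights):
        for j, h in enumerate(row):
            p = (i * cell_size - ox, j * cell_size - oy, h - oz)
            p_len = math.sqrt(sum(c * c for c in p))
            # A cell lies inside the cone if its direction from the orbiter
            # is within the half-angle of the cone axis.
            if sum(a * b for a, b in zip(axis, p)) / p_len >= cos_half:
                inside.add((i, j))
    return inside

# Example: a tiny 2x2 heightfield viewed from 400 km altitude.
terrain = [[0.0, 1.0], [0.5, 2.0]]
print(footprint_mask(terrain, cell_size=100.0,
                     orbiter_pos=(50.0, 50.0, 400000.0),
                     target=(50.0, 50.0, 1.0), half_angle_deg=0.02))
```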

Conference Sponsors

Diamond

Virbela Logo

Gold

Instituto Superior Técnico
Immersive Learning Research Network

Silver

Qualcomm Logo

Bronze

Vicon Logo
HITLab Logo
Microsoft Logo
Appen Logo
Facebook Reality Labs Logo
XR Bootcamp Logo

Supporter

GPCG Logo
INESC-ID Logo
NVIDIA Logo

Doctoral Consortium Sponsors

NSF Logo
Fakespace Logo

Conference Partner

CIO Applications Europe Website


Code of Conduct

© IEEEVR Conference