Research Demos

Demonstration: VR-HYPERSPACE, The Innovative Use of Virtual Reality to Increase Comfort by Changing the Perception of Self and Space

Authors: Mirabelle D'Cruz, Harshada Patel, Laura Lewis, Sue Cobb, Matthias Bues, Oliver Stefani, Tredeaux Grobler, Kaj Helin, Juhani Viitaniemi, Susanna Aromaa, Bernd Froelich, Stephan Beck, André Kunert, Alexander Kulik, Ioannis Karaseitanidis, Panagiotis Psonis, Nikos Frangakis, Mel Slater, Ilias Bergstrom, Elena Kokkinara, Betty Mohler, Markus Leyrer, Florian Soyka, Enrico Gaia, Domenico Tedone, Michael Olbert, Mario Cappitelli

The University of Nottingham (UK), Fraunhofer Institute for Industrial Engineering (Germany), Valtion Teknillinen Tutkimuskeskus (Finland), Bauhaus‐Universität Weimar (Germany), Institute of Communications and Computer Systems (Greece), University of Barcelona (Spain), Max Planck Institute for Biological Cybernetics (Germany), Thales Alenia Space Italia S.p.A (Italy), EADS Innovation Works (Germany)

Abstract: Our vision is that, regardless of future variations in the interior of airplane cabins, we can combine ever-advancing state-of-the-art virtual and mixed reality technologies with the latest research in neuroscience and psychology to achieve high levels of comfort for passengers. Current surveys of passengers' experience during air travel reveal that they are least satisfied with the amount and effectiveness of their personal space and with their ability to work, sleep or rest. Moreover, current trends suggest that the amount of available space will decrease further, so passengers' physical comfort during a flight is likely to worsen significantly. The main challenge is therefore to enable passengers to maintain a high level of comfort and satisfaction while confined to a restricted physical space.

Product Accessibility Evaluation using Virtual User Models

Authors: Panagiotis Moschonas, Athanasios Tsakiris, Dimitrios Tzovaras

Information Technologies Institute (Greece)

Abstract: We introduce an open framework for built-in accessibility support at all stages of product development, based on the Virtual User Model concept. The goal is to introduce simulation-based and VR testing into the automotive, smart living spaces, workplace, infotainment and personal healthcare application areas. Our framework also aims to ensure that future products and services are systematically designed for all people, including those with functional limitations. An open simulation platform composed of several tools provides automatic simulation feedback and reporting on guideline/methodology compliance and quality of service. To achieve this objective, detailed virtual user physical, cognitive, behavioral and psychological models, as well as the corresponding simulation models, have been integrated to support simulation and testing at all stages of product planning and development.

Geometrically-correct projection-based texture mapping onto a cloth

Authors: Yuichiro Fujimoto, Takafumi Taketomi, Goshiro Yamamoto, Jun Miyazaki, Hirokazu Kato, Ross T. Smith, Bruce H. Thomas

Nara Institute of Science and Technology (Japan), Tokyo Institute of Technology (Japan), University of South Australia (Australia)

Abstract: We demonstrate geometrically-correct projection-based texture mapping onto a deformable object such as a cloth. The system can be used to simulate designs that involve changes in shape, such as sheets of malleable material. Geometrically-correct projection is achieved by measuring the object's 3D shape and detecting the retro-reflective markers on the object's surface. Rapid prototyping is used as an example application of this projection technique.
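
The core geometric step is mapping each point of the measured cloth surface into the projector image so the projected texture lands correctly after deformation. Below is a minimal sketch of that mapping under a standard pinhole projector model; the calibration inputs (K, R, t) and function names are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def project_to_projector(points_3d, K, R, t):
        """Map measured 3D surface points (N x 3) to projector pixel coordinates
        using a calibrated pinhole model. K: 3x3 projector intrinsics; R, t:
        rotation and translation from the measurement frame to the projector frame."""
        cam = R @ points_3d.T + t.reshape(3, 1)  # measurement -> projector frame
        pix = K @ cam                            # pinhole projection
        return (pix[:2] / pix[2]).T              # perspective divide, N x 2 pixels

    # Illustrative use: for each texel of the desired texture, look up the
    # corresponding point on the reconstructed cloth mesh (aligned via the
    # retro-reflective markers), project it into the projector image, and
    # write the texel's color there to pre-warp the projected image.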

Application of Hanger Reflex to wrist and waist

Authors: Takuto Nakamura, Narihiro Nishimura, Michi Sato, Hiroyuki Kajimoto

The University of Electro-Communications (Japan), Japan Science and Technology Agency (Japan)

Abstract: When a wire hanger is placed sideways on the head so that the temporal region is sandwiched by the hanger, the head rotates unexpectedly. This phenomenon has been named the “Hanger Reflex”. Although it is a simple method for producing a pseudo-force sensation, the use of the wire hanger in this way has until now been limited to the head. Here we report a new finding: when the wrist or waist is equipped with a similar device of larger circumference, the arm or the body rotates involuntarily. This suggests that the Hanger Reflex principle may be applicable to parts of the body other than the head, leading to a possible compact whole-body force display. This paper documents the development and testing of these devices and suggests a method for stable presentation of the rotational force.

An Ungrounded Tactile Feedback Device to Portray Force and Torque-Like Interactions in Virtual Environments

Authors: Ashley L. Guinan, Markus N. Montandon, Andrew J. Doxon, and William R. Provancher

University of Utah (USA)

Abstract: Our lab has developed a haptic feedback device to provide ungrounded tactile feedback through the motion of actuated sliding plate contactors. Interaction with a virtual environment is provided to a user through a device equipped with tactile feedback and six degree-of-freedom spatial position sensing. Our tactile feedback device is composed of three sliding plate skin stretch displays positioned around the handle, providing feedback to a user’s palm. Our dual-handed tactile feedback system allows independent motion of hands, while providing feedback that creates a kinesthetic experience. We demonstrate fundamental physical interactions such as mass, spring, and damper interactions, which are the building blocks used in every virtual model. Various virtual environments are used to demonstrate physical interactions with objects.
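
To make the mass, spring, and damper building blocks mentioned above concrete, here is a minimal sketch of the force law such a virtual model typically evaluates on each haptic update; the 1 kHz rate, gains, and function names are assumptions for illustration, not details of the authors' device.

    import numpy as np

    def spring_damper_force(x, v, k=200.0, b=2.0):
        """Spring-damper force for displacement x (m) and velocity v (m/s).
        k: stiffness (N/m), b: damping (N*s/m); values are illustrative."""
        return -k * np.asarray(x) - b * np.asarray(v)

    def step_mass(m, x, v, f_ext, dt=1e-3):
        """Semi-implicit Euler step of a point mass m (kg) driven by the
        spring-damper coupling plus an external force, at an assumed 1 kHz rate."""
        a = (spring_damper_force(x, v) + np.asarray(f_ext)) / m
        v = np.asarray(v) + a * dt
        x = np.asarray(x) + v * dt
        return x, v

The resulting force vector would then be conveyed to the palm through the sliding-plate contactors, and such mass, spring, and damper elements can be combined to build richer virtual objects.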

MUSE: Understanding Traditional Dances

Author: Muqeem Khan

Northwestern University in Qatar (Qatar)

Abstract: This demo encapsulates a possible manifestation of the Middle Eastern indigenous dance Al Ardha in the form of a serious gaming environment. The presentation also illustrates the interconnection and possible transformation of Intangible Cultural Heritage (ICH) content, such as traditional dances, into a digital kinesthetic learning system. The system is called Mimicry Understanding and Safeguarding Environment (MUSE). It is designed to help museum visitors learn traditional or indigenous dances with the help of motion-sensing technologies. MUSE is a multidisciplinary research project and is expected to analyze the intricacies of various indigenous dances, particularly the Arabic sword dance. The MUSE interface is expected to facilitate museum visitors’ awareness, learning, and practice of the Al Ardha dance of the Middle Eastern region. Through its easy-to-learn and user-friendly interface, MUSE can foster playfulness and user engagement to enhance the experience of museum visitors.

Tablet-Based Interaction Panels for Immersive Environments

Authors: David M. Krum, Thai Phan, Lauren Cairco Dukes, Peter Wang, Mark Bolas

USC Institute for Creative Technologies (USA), Clemson University (USA), Continuum Analytics (USA)

Abstract: With the current widespread interest in head mounted displays, we perceived a need for devices that support expressive and adaptive interaction in a low-cost, eyes-free manner. Leveraging rapid prototyping techniques for fabrication, we have designed and manufactured a variety of panels that can be overlaid on multi-touch tablets and smartphones. The panels are coupled with an app running on the multi-touch device that exchanges commands and state information over a wireless network with the virtual reality application. Sculpted features of the panels provide tactile disambiguation of control widgets and an onscreen heads-up display provides interaction state information. A variety of interaction mappings can be provided through software to support several classes of interaction techniques in virtual environments. We foresee additional uses for applications where eyes-free use and adaptable interaction interfaces can be beneficial.
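
As one way to picture the command and state exchange described above, the sketch below shows a small UDP message loop between the tablet app and the VR application. The JSON message format, port, and widget names are illustrative assumptions, not the authors' protocol.

    import json
    import socket

    VR_HOST, VR_PORT = "192.168.0.10", 9000  # assumed address of the VR host

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_widget_event(widget_id, value):
        """Send a control event, e.g. from a slider under one sculpted panel cutout."""
        msg = json.dumps({"widget": widget_id, "value": value}).encode("utf-8")
        sock.sendto(msg, (VR_HOST, VR_PORT))

    def receive_state():
        """Receive an interaction-state update to show on the heads-up display."""
        data, _ = sock.recvfrom(4096)
        return json.loads(data.decode("utf-8"))

    send_widget_event("nav_speed", 0.5)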

The Virtual World Framework: Collaborative Virtual Environments on the Web

Authors: Eric Burns, David Easter, Rob Chadwick, David A. Smith, Carl Rosengrant

Lockheed Martin (USA), ADL Colab (USA), OSD (USA)

Abstract: Software distribution and installation is a logistical issue for large enterprises. Web applications are often a good solution because users can instantly receive application updates on any device without needing special permissions to install software on their hardware. Until recently, it was not possible to create 3D multiuser virtual environment web applications without requiring a browser plugin; recent web standards have made this possible. We present the Virtual World Framework (VWF), a software framework for creating 3D multiuser web applications. We are using VWF to create applications for team training and collaboration. VWF can be downloaded at http://virtual.wf.

Diplopia: A Virtual Reality Game Designed To Help Amblyopics

Authors: James Blaha, Manish Gupta

Apollo VR (USA)

Abstract: Virtual reality has the potential to measure and help with many vision problems. More than 3% of the population have amblyopia, commonly known as lazy eye, a weakness and impairment of vision in one or both eyes [1]. Amblyopia often results in a suppression of the information coming from the bad eye and, as a result, a loss of stereoscopic vision. It was long thought that people with amblyopia could not improve the vision in their bad eye or gain stereoscopic vision after a critical age of 10-12 years. Recent research indicates that the adult brain is more plastic with regard to suppression than previously thought [2]. Inspired by this, we have built a virtual reality game called Diplopia, using Unity3D, which utilizes the Oculus Rift head-mounted display (HMD) and the Leap Motion controller to help people with amblyopia restore vision in their amblyopic eye.

AR Jigsaw Puzzle with RGB-D Based Detection of Texture-Less Pieces

Authors: João Paulo Lima, João Marcelo Teixeira, Veronica Teichrieb

DEINFO-UFRPE (Brazil), Voxar Labs, CIn-UFPE (Brazil)

Abstract: This demo presents an AR application that helps the user to solve a jigsaw puzzle that consists of non-textured pieces with a discriminative shape. The pieces are detected, their poses are estimated and the ones that are correctly assembled are highlighted. In order to detect the pieces, the Depth-Assisted Rectification of Contours (DARC) method is used, which performs detection and pose estimation of texture-less planar objects using an RGB-D camera.
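
As a rough illustration of the depth-assisted idea, the sketch below back-projects a detected contour into 3D using the depth image, fits a plane, rectifies the contour into that plane's coordinates, and matches the rectified shape against stored piece templates. The function names, plane-fitting choice, and matching call are our assumptions based on the abstract, not the published DARC implementation.

    import cv2
    import numpy as np

    def rectify_contour(contour_2d, depth, K):
        """Back-project a 2D contour (N x 1 x 2 pixels) to 3D using the depth
        image and camera intrinsics K, fit a plane by SVD, and return the
        contour in the plane's own 2D coordinates (a roughly fronto-parallel view)."""
        pts = []
        for u, v in contour_2d.reshape(-1, 2):
            z = depth[int(v), int(u)]
            if z > 0:
                pts.append(((u - K[0, 2]) * z / K[0, 0],
                            (v - K[1, 2]) * z / K[1, 1],
                            z))
        pts = np.asarray(pts, dtype=np.float64)
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)   # last row = plane normal
        return (pts - centroid) @ vt[:2].T         # N x 2 in-plane coordinates

    def match_piece(rect_contour, templates):
        """Pick the stored piece shape most similar to the rectified contour."""
        scores = [cv2.matchShapes(rect_contour.reshape(-1, 1, 2).astype(np.float32),
                                  t.reshape(-1, 1, 2).astype(np.float32),
                                  cv2.CONTOURS_MATCH_I1, 0.0)
                  for t in templates]
        return int(np.argmin(scores))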

Ubiquitous Virtual Reality ‘To-Go’

Authors: Aryabrata Basu, Kyle Johnsen

University of Georgia (USA)

Abstract: We propose to demonstrate a ubiquitous immersive virtual reality system that is highly scalable and accessible to a larger audience. With the advent of handheld and wearable devices, immersive virtual reality has gained considerable popularity with the general public.

We present a practical design of such a system that offers the core affordances of immersive virtual reality in a portable and untethered configuration. In addition, we have developed an extensive immersive virtual experience that involves engaging users visually and aurally. This is an effort towards integrating VR into the space and time of user workflows.

Automatic Acquisition and Animation of Virtual Avatars

Authors: Ari Shapiro, Andrew Feng, Ruizhe Wang, Gerard Medioni, Mark Bolas, Evan A. Suma

USC Institute for Creative Technologies (USA), University of Southern California (USA)

Abstract: The USC Institute for Creative Technologies will demonstrate a pipeline for the automatic reconstruction and animation of lifelike 3D avatars, acquired by rotating the user’s body in front of a single Microsoft Kinect sensor. Based on a fusion of state-of-the-art techniques in computer vision, graphics, and animation, this approach can produce a fully rigged character model suitable for real-time virtual environments in less than four minutes.

NASA Telexploration Project Demo

Authors: Jeff Norris, Scott Davidoff

NASA Jet Propulsion Laboratory (USA)

Abstract: NASA’s Telexploration Project seeks to make us better explorers by building immersive environments that make us feel as if we are really there. The Mission Operations Innovation Office and its Operations Laboratory at the NASA Jet Propulsion Laboratory founded the Telexploration Project and are researching how immersive visualization and natural human-robot interaction can enable mission scientists, engineers, and the general public to interact with NASA spacecraft and alien environments more effectively. These efforts have been accelerated through partnerships with many different companies, especially in the video game industry. These demos will exhibit some of the progress made at NASA and through its commercial partnerships by allowing attendees to experience Mars data acquired by NASA spacecraft in a head-mounted display, using several rendering and interaction techniques.

Exhibitors and Supporters

Information on exhibitors and supporters is presented by level: Platinum, Silver, and Bronze, followed by Publishers and Sponsors.