- Monday March 23rd
- Tuesday March 24th
Monday March 23 - full day
eRis provides a one-day tutorial on the art of engineering real-time interactive systems (RIS) in the area of highly interactive systems and perceptual computing typical of Virtual, Augmented, and Mixed Reality and computer games. The course covers theoretical models derived from the requirements of the application area as well as common hands-on and novel solutions necessary to meet these requirements.
The first part of the course will concentrate on the conceptual principles characterizing real-time interactive systems. Questions answered include: What are the main requirements? How do we handle multiple modalities? How do we define the timeliness of RIS? Why is it important? What do we have to do to assure timeliness? The second part will introduce a conceptual model of the mission-critical aspects of time, latencies, processes, and events necessary to describe a system’s behavior. The third part introduces the application state, its requirements of distribution and coherence, and the consequences these requirements have on decoupling and software quality in general. The last part introduces some potential solutions to data redundancy, distribution, synchronization, and interoperability. Each part of the course will take 90 minutes.
Along the way, typical and prominent state-of-the-art approaches to recurring engineering tasks are discussed. These include pipeline systems, scene graphs, application graphs (aka field routing), event systems, entity and component models, and others. Novel concepts like actor models and ontologies will be covered as alternative solutions. The theoretical and conceptual discussions will be put into the practical context of today’s commercial and research systems, i.e., X3D, Instant Reality, Unity3D, Unreal Engine 4, and Simulator X.
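To give a taste of one of the recurring patterns named above, the entity-and-component model can be sketched in a few lines. All class and function names below are our own illustration, not taken from any of the systems mentioned:

```python
# A minimal sketch of the entity-and-component pattern: entities are plain
# ids, components live in per-type stores, and "systems" iterate over all
# entities that own a given combination of components.

class Position:
    def __init__(self, x, y):
        self.x, self.y = x, y

class Velocity:
    def __init__(self, dx, dy):
        self.dx, self.dy = dx, dy

class World:
    def __init__(self):
        self.next_id = 0
        self.components = {}  # component type -> {entity_id: component}

    def create_entity(self, *components):
        eid = self.next_id
        self.next_id += 1
        for c in components:
            self.components.setdefault(type(c), {})[eid] = c
        return eid

    def query(self, *types):
        """Yield (entity_id, components...) for entities owning all types."""
        stores = [self.components.get(t, {}) for t in types]
        for eid in stores[0]:
            if all(eid in s for s in stores[1:]):
                yield (eid, *(s[eid] for s in stores))

def movement_system(world, dt):
    # A system touches only the component data it needs, which is what
    # decouples behavior from any entity class hierarchy.
    for _, pos, vel in world.query(Position, Velocity):
        pos.x += vel.dx * dt
        pos.y += vel.dy * dt

world = World()
player = world.create_entity(Position(0.0, 0.0), Velocity(1.0, 2.0))
movement_system(world, dt=0.5)
```

The same idea underlies the component models of Unity3D and Unreal Engine 4, though their APIs differ substantially from this sketch.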
Monday March 23 - morning
A recent trend in interactive environments is large, ultra-high-resolution displays (LUHRDs). Compared to other large interactive installations, such as the CAVE™, LUHRDs are usually flat or (slightly) curved and have a significantly higher resolution, offering new research and application opportunities.
This tutorial provides information for researchers and engineers who plan to install and use a large ultra-high-resolution display. We will give detailed information on the hardware and software of recently created and established installations and will show the variety of possible approaches. We will also discuss rendering software, rendering techniques, and interaction for LUHRDs, as well as applications.
Tuesday March 24 - full day
This tutorial is designed to present an introduction to augmented reality using an integration of OpenCV (via OpenCVSharp) and Vuforia. We have found ourselves using this combination often when developing Unity-based AR applications, so we decided to develop this tutorial for IEEE VR 2015.
The tutorial covers the use of OpenCV (via OpenCVSharp) and Vuforia with the Unity3D game engine, demonstrating the use of both libraries in tandem to support the development of advanced Augmented Reality (AR) applications. We will provide a step-by-step introduction to OpenCV and Vuforia as well as the key aspects of Unity that support the development of AR applications, and will demonstrate each library by walking through an example application during the morning and afternoon sessions. In both the OpenCV and Vuforia forums, it is common to see these two libraries used together for Augmented Reality applications, as they have complementary capabilities.
OpenCV is a library of capabilities developed for and used in the field of computer vision. OpenCV is extremely useful for capturing and processing images; however, it offers little in the way of augmenting those images with additional 2D and 3D structures. This limitation can be overcome by using the Vuforia library. Vuforia has a strong set of features to aid in the development of AR applications as well as excellent support for object and pattern detection: it recognizes certain patterns and objects in the real world and can project a 2D or 3D structure onto those objects.
When the tutorial wraps up, attendees will have a completed game for a handheld tablet using OpenCV and Vuforia. The game will use OpenCV to capture hand gestures from the forward-facing camera and process them for Vuforia, which augments the real world based on the data from OpenCV.
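To give a feel for the OpenCV side of that pipeline, the sketch below segments a hand from a camera frame by color thresholding. In the actual tutorial this step would run on live frames via OpenCVSharp inside Unity (e.g. with OpenCV's inRange); here plain NumPy stands in so the idea is self-contained, and the threshold values and function names are illustrative assumptions only:

```python
import numpy as np

# Illustrative sketch of the kind of frame processing the tutorial's game
# performs with OpenCV: segmenting a hand by color thresholding, then
# reducing the mask to a simple quantity a gesture could be read from.
# The RGB thresholds below are made up for the example.

def segment_hand(frame_rgb, lower=(95, 40, 20), upper=(255, 220, 180)):
    """Return a boolean mask of pixels inside the given RGB range
    (the same operation cv2.inRange performs on real frames)."""
    lo = np.array(lower, dtype=np.uint8)
    hi = np.array(upper, dtype=np.uint8)
    return np.all((frame_rgb >= lo) & (frame_rgb <= hi), axis=-1)

def gesture_area(mask):
    """Fraction of the frame covered by the hand; the game could map
    this to e.g. 'open hand' vs 'fist' and hand it to the Vuforia side."""
    return mask.mean()

# A tiny synthetic 2x2 "frame": one skin-toned pixel, three background.
frame = np.array([[[200, 120,  90], [  0,   0, 255]],
                  [[  0, 255,   0], [ 10,  10,  10]]], dtype=np.uint8)
mask = segment_hand(frame)
```

A real implementation would also clean the mask (morphological opening) and extract contours before classifying a gesture; those steps follow the same pattern of turning pixels into a small number Vuforia-side logic can react to.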
Tuesday March 24 - afternoon
Come learn how to turn a regular Android device into a VR HMD with Unity and your laptop.
This codelab is an introduction to taking a first-person game made in Unity and enhancing it with a Virtual Reality mode using the Google Cardboard SDK for Unity.