The official banner of the IEEE Conference on Virtual Reality + User Interfaces: a kiwi wearing a VR headset, overlaid on an image of Mount Cook and a braided river.

Keynote Speakers

Speaker           Date
Tim Dwyer         Monday, March 14 - 10:30, NZDT (UTC+13)
Aliesha Staples   Tuesday, March 15 - 14:30, NZDT (UTC+13)
Paul Debevec      Wednesday, March 16 - 11:00, NZDT (UTC+13)


Tim Dwyer
Monash University

Photo of Tim Dwyer

Immersive Analytics and Embodied Sensemaking
Monday, March 14 - 10:30, NZDT (UTC+13)


Abstract
Immersive Analytics explores the use of emerging display and interaction technologies to bring data out of computers and into the world around us. Tracked VR headsets offer distinct advantages over traditional desktop data visualisation, such as true 3D rendering of spatial data and natural physical navigation. AR headsets offer further advantages, such as the possibility of embedding data visualisations into our natural environment or workplace. Another touted advantage of immersive representations of data and complex systems is the idea that "embodied interaction" supports sensemaking. However, "sensemaking" is a very high-level cognitive activity, and strong links between embodiment and sensemaking are not yet well established. In this talk, we first review systems and techniques for immersive analytics, particularly those from the Data Visualisation and Immersive Analytics Lab at Monash University, and then look more closely at developments in understanding "embodied sensemaking". We argue that a better understanding of how embodiment relates to sensemaking will be key to creating a new generation of tools that help people work effectively in an increasingly complex and data-rich world.

Bio
Professor Tim Dwyer is a co-editor of "Immersive Analytics", published by Springer in 2018, which has had over 36k downloads to date. He received his PhD on "Two and a Half Dimensional Visualisation of Relational Networks" from the University of Sydney in 2005. He was a postdoctoral Research Fellow at Monash University from 2005 to 2008 and then a Visiting Researcher at Microsoft Research USA until 2009. From 2009 to 2012, he was a Senior Software Development Engineer with the Visual Studio product group at Microsoft in the USA. He then returned to Monash as a Larkins Fellow, where he now directs the Data Visualisation and Immersive Analytics Lab.



Aliesha Staples
StaplesVR

Photo of Aliesha Staples

Making Business with Mixed Reality - The Story of StaplesVR
Tuesday, March 15 - 14:30, NZDT (UTC+13)


Abstract
I will tell the story of StaplesVR and how to build a business model around emerging technologies such as AR and VR. I will explain how we grew from a New Zealand camera equipment company into an AR/VR software company with a global footprint. I will also cover some of our world firsts, including our 360-degree fireproof camera, our cyber-scanning system for creating photorealistic digital humans, and our VR training platform, which is pivoting us from a service-for-hire company to a SaaS business model.

Bio
Aliesha Staples is the founder and director of StaplesVR, an emerging technology company that has grown to serve an international clientele and has been awarded multiple accolades for its work. Aliesha is highly sought after both locally and abroad as a producer of AR/VR content and has produced projects for studios and television networks including Warner Brothers, Paramount Pictures, TVNZ, and ABC. She was the first woman to win the NZ Hi-Tech Awards Young Achiever Award in 2017 and, after winning again in 2018, the only person to have won it twice. She is also a New Zealander of the Year innovation finalist and a Next magazine Woman of the Year finalist. Her work spans entertainment, medical, aviation, health and safety, and more. Aliesha is also the co-founder of Click Studios, a social enterprise co-working hub designed to curate creative tech companies and to grow both the companies in the Click community and the creative technology industry in New Zealand.



Paul Debevec
USC ICT and Netflix

Photo of Paul Debevec

Light Fields, Light Stages, and the Future of Virtual Production
Wednesday, March 16 - 11:00, NZDT (UTC+13)


Abstract
I'll describe recent work at Netflix, Google, and the USC Institute for Creative Technologies to bridge real and virtual worlds through photography, lighting, and machine learning. I'll begin with Welcome to Light Fields, the first downloadable virtual reality light field experience. I'll then describe DeepView, Google's solution for light field video, providing immersive VR video you can move around in after it has been recorded, with subjects close enough to be within arm's reach. I'll also present how Google's new Light Stage system, paired with machine learning, is enabling lighting estimation from faces for AR and interactive portrait relighting on mobile phone hardware. Finally, I'll discuss how all of these techniques may enable the next generation of virtual production filmmaking, infusing both light fields and relighting into the real-world image-based lighting LED stages now revolutionizing how movies and television are made.

Bio
Paul Debevec is Netflix's Director of Research, Creative Algorithms and Technology, overseeing the creation of new technologies in computer vision, computer graphics, and machine learning for virtual production, visual effects, and animation. His 2002 Light Stage 3 system at the USC Institute for Creative Technologies was the first to surround actors with color LED lighting driven by images of virtual locations for virtual production. Techniques from Paul's work have been used to create key visual effects sequences in The Matrix, Spider-Man 2, The Curious Case of Benjamin Button, Avatar, Gravity, Furious 7, Blade Runner 2049, Gemini Man, Free Guy, and numerous video games, and to record a 3D portrait of US President Barack Obama. His light stage facial capture technology has helped numerous companies create photoreal digital actors and build machine learning datasets for synthetic avatars. Paul's work in HDR imaging, image-based lighting, and light stage facial capture has been recognized with two technical Academy Awards and SMPTE's Progress Medal. Paul is a Fellow of the Visual Effects Society and a member of the Television Academy's Science and Technology Peer Group, and he has served on the Motion Picture Academy's Visual Effects Branch Executive Committee and Science and Technology Council, and as Vice President of ACM SIGGRAPH. More info at https://www.pauldebevec.com/


Conference Sponsors

Diamond

Virbela

Gold

ChristchurchNZ

iLRN

University of Canterbury

Silver

Qualcomm

Bronze

HITLab NZ

Supporters

ARIVE

Multimodal Technologies and Interaction

NVIDIA

Pico

XR Bootcamp

Doctoral Consortium Sponsors

National Science Foundation (NSF)

Conference Partner

