IEEE Virtual Reality 2019 Keynote Talks
IEEE VR 2019: the 26th IEEE Conference on Virtual Reality and 3D User Interfaces
March 23-27, 2019, Osaka, Japan
Shinya Nishida, Ph.D.
NTT Communication Science Laboratories
Monday, Mar. 25, 2019
Hacking Human Visual Perception
ABSTRACT: Roughly speaking, there are two strategies for providing users with a realistic virtual perceptual experience. One is to make the physical input to the user's sensory systems close to that of the real experience (the physics-based approach). The other, which sensory scientists (like us) prefer, is to make the response pattern of the user's sensory system close to that of the real experience (the perception-based approach). Using cognitive- and neuro-scientific knowledge about human visual processing, we can control cortical perceptual representations in addition to sensor responses, and thereby achieve perceptual effects that would be hard to obtain with the straightforward physics-based approach. For instance, recent research on human material perception has suggested simple image-based methods to control glossiness, wetness, subthreshold fineness, and liquid viscosity. Deformation Lamp/Hengento (Kawabe et al., 2016) is a projection-mapping technique that can produce illusory movement of a real static object. Although only a dynamic gray-scale pattern is projected, it effectively drives visual motion sensors in the human brain, and thereby induces a "motion capture" effect on the colors and textures of the original static object. In Hidden Stereo (Fukiage et al., 2017), multi-scale phase-based binocular disparity signals effectively drive human stereo mechanisms, while the disparity-inducing image components of the left and right images cancel each other out when the images are fused. As a result, viewers with stereo glasses perceive 3D images, while those without glasses can enjoy 2D images with no visible ghosts. I will discuss how vision science helps virtual reality technologies, and how vision science is in turn helped by its application to cutting-edge technologies.
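The cancellation at the heart of Hidden Stereo can be illustrated with a toy sketch (a hypothetical one-scanline example, not the paper's actual multi-scale phase-based method): adding a disparity-inducing component to one eye's image and subtracting it from the other's means that, for viewers without glasses, the two superimposed projections average back to the original 2D image.

```python
# Toy illustration of the Hidden Stereo cancellation idea.
# One image scanline; values and the inducer pattern are made up.
image   = [0.2, 0.5, 0.8, 0.4]            # 2D base image I
inducer = [0.05, -0.03, 0.02, -0.04]      # disparity-inducing component d

left  = [i + d for i, d in zip(image, inducer)]   # left-eye image:  I + d
right = [i - d for i, d in zip(image, inducer)]   # right-eye image: I - d

# Without stereo glasses the two projections superimpose; the inducers
# cancel, leaving only the clean 2D image (hence no visible ghosts).
fused = [(l + r) / 2 for l, r in zip(left, right)]
print(all(abs(f - i) < 1e-12 for f, i in zip(fused, image)))  # True
```

With glasses, each eye receives only its own image, so the ±d components drive binocular disparity and hence depth perception.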
BIO: Shin’ya Nishida studied psychology at the Graduate School of Letters, Kyoto University, Japan, until 1990. After spending two years at ATR Auditory and Visual Perception Laboratories, Japan, he joined NTT Basic Research Laboratories in 1992. He is now a Senior Distinguished Scientist and Group Leader of the Sensory Representation Research Group, NTT Communication Science Labs. He was an Honorary Research Fellow at University College London (1997-1998), a Visiting Professor at Tokyo Institute of Technology (2006-2012), and President of the Vision Society of Japan (2014-2018). He is a Member of the Science Council of Japan, a Leading Researcher of Innovative Shitsukan Science and Technology, an Honorary Professor at the University of Nottingham, and an Editorial Board Member of the Journal of Vision. He was awarded the Japan Society for the Promotion of Science Prize (2006), the Japanese Psychological Association International Prize (2006), and the MEXT Prize for Science and Technology (2015). His expertise lies mainly in psychophysical studies of human sensory processing, including material perception, motion perception, cross-attribute integration, and time perception, yet he has a broad interest in cognitive science, neuroscience, and information science and technologies.
Yoichi Ochiai, Ph.D.
University of Tsukuba / Pixie Dust Technologies, Inc., Japan
Tuesday, Mar. 26, 2019
Virtual Reality for Enhancing Human Perceptional Diversity Towards an Inclusive Society
ABSTRACT: We conducted a research project toward an inclusive society from the viewpoint of computational assistive technologies. This project aims to explore AI-assisted human-machine integration techniques for overcoming impairments and disabilities. By connecting assistive hardware and auditory/visual/tactile sensors and actuators with a user-adaptive and interactive learning framework, we propose and develop a proof of concept of our “xDiversity AI platform” to meet the various abilities, needs, and demands in our society. For example, one of our studies is the "Telewheelchair", a wheelchair that uses AI technology for automated driving. Its purpose is not fully autonomous driving but labor savings at nursing-care sites and care through natural communication. These attempts aim to solve the challenges facing the body and the sense organs with the help of AI and other technologies. In this keynote, we explain these case studies and our final goal for the social design and deployment of assistive technologies toward an inclusive society.
BIO: Born in 1987. Ph.D. in Applied Computer Science from the University of Tokyo, completed in two years, the fastest on record. Advisor to the President, Assistant Professor, and Research Head of the Digital Nature Group at the University of Tsukuba; Visiting Professor at Osaka University of Arts; and Visiting Professor at Digital Hollywood University. CEO of Pixie Dust Technologies, Inc. Recipient of the World Technology Award 2015, the Prix Ars Electronica 2016, and the EU STARTS Prize; selected as a Leader of Tomorrow (Best Knowledge Pool) by the St. Gallen Symposium and as a Global Shaper by the World Economic Forum, among many other awards and honors. Featured in Nature Index 2017 Japan and Axis Magazine, and covered by CNN, BBC, Discovery, CNBC, Reuters, and over 100 other major media outlets. Author of Century of Enchantment (PLANETS) and Message to the Futurists of Tomorrow (Shogakukan).
Junichiro Koyama and Yukiharu Tamiya
BANDAI NAMCO Amusement Inc., Japan
Wednesday, Mar. 27, 2019
Let's Unleash Entertainment! VR Possibilities Learned through Entertainment Facility "VR Zone"
ABSTRACT: We have developed and operated 23 different VR activities while building and expanding our VR entertainment facility, VR ZONE, over the years in Odaiba (2016), Shinjuku (2017), and Osaka (2018). Drawing on these experiences, we will share some of our know-how regarding the qualities and development of VR entertainment, as well as its future possibilities.
- VR ZONE introduction
- What does VR bring to Entertainment
  - From a technology standpoint
  - From a media standpoint
- Our process for developing popular content
  - Picking a Theme
  - Experience Design
  - Dream Wild
- Finally...
BIO: Mr. Koyama joined NAMCO Ltd. (now BANDAI NAMCO Amusement Inc.) in 1990, where he took part in creating simulation games as a mechanical engineer. In 1992, he worked on adapting the overseas VR game “VIRTUALITY” to the Japanese market. Subsequently, he pursued virtual reality technology, developing simulation machines at the VR development headquarters. He has produced many arcade games with innovative concepts, such as “THE iDOLM@STER” and “Mobile Suit Gundam: Bonds of the Battlefield”. Since 2015 he has been known as “Director Koya”, the overseer of “Project i Can”, a project aiming to discover and expand unknown areas of entertainment using VR technology.
BIO: Mr. Tamiya joined NAMCO Ltd. (now BANDAI NAMCO Amusement Inc.) in 1998. He has since worked as a planner and developer on a wide range of games, focusing on arcade games such as the “Dragon Chronicle” series and the “Dragon Ball ZENKAI” series, but also taking part in consumer and mobile games. He is notably known for his hand in concept planning for new projects, and has been directing VR activities as “Manager Tamiya” since 2015 for “Project i Can”, a project aiming to discover and expand unknown areas of entertainment using VR technology.