
Tutorial 1: XR and the Metaverse in Clinical Applications
Saturday, March 8, 2025, 8:30-12:30 (Saint-Malo, France, UTC+1), Room: Charcot
Organizers
Mel Slater, DSc - University of Barcelona, Institute of Neurosciences, Event Lab, Spain
Mavi Sanchez-Vives, MD, PhD - Institute of Biomedical Research IDIBAPS - Hospital Clinic, ICREA, Spain
Yannick Prie, PhD - Nantes University, France
Pawel Sniatala, PhD - Poznan University of Technology, Poland
Xueni Sylvia Pan, PhD - Goldsmiths, University of London, United Kingdom
Clinical applications have been among the greatest and most pervasive uses of extended reality (in particular virtual reality) over the past three decades. Early pioneering work in psychological interventions, such as exposure therapy for phobias and post-traumatic stress disorder, demonstrated the potential of VR to improve patient outcomes in a practical way, mainly by using VR as a substitute for reality with logistical advantages. VR-based rehabilitation paradigms have shown promise in enhancing recovery from stroke and spinal cord injury by engaging patients in repetitive, gamified tasks that promote neuroplasticity and improve motor skills. Moreover, VR offers the possibility of enhanced metrics for therapists, who can monitor patients in real time or review their data even when exercises have been carried out at home. VR has offered safer, more realistic training simulations for clinicians and surgeons, reducing reliance on animal or cadaver models and minimizing risks to real patients. XR technologies provide new methods for pain management, with distraction therapy as the simplest example: Hunter Hoffman showed decades ago that children with serious burn injuries can have their bandages changed with minimal pain while immersed in a simple virtual reality snow world. In this tutorial, we will examine current examples of these types of applications based on our own experience, share our knowledge of designing and testing such clinical XR solutions, and explore new paradigms that exploit ideas going beyond the bounds of reality while nevertheless achieving clinically useful results.
Tutorial 2: A Practical Guide to Radiance Fields for XR Research and Applications
Saturday, March 8, 2025, 8:30-10:15 (Saint-Malo, France, UTC+1), Room: Vauban 1
Organizers
Shohei Mori - Visualization Research Center (VISUS), University of Stuttgart, Germany; Guest Associate Professor (Global)
Ke Li - Human-Computer Interaction Group, Hamburg University, Germany
Mana Masuda - Keio University, Japan
The advent of photorealistic scene representations such as Neural Radiance Fields (NeRF) and 3D Gaussian Splatting (3DGS) enables us to replicate complex real-world elements. This paradigm shift in graphics lets us revisit how we capture, store, and replicate the real world, and rethink what to bring into virtual reality (VR). This tutorial provides an opportunity to learn the basics needed to start future VR research on the topic. Namely, attendees will learn the principles of view synthesis using machine learning and the practical aspects of implementation using open-source projects. It also provides opportunities for discussion and networking with VR attendees who share the same interests. The tutorial will begin with a general overview, summarizing the latest developments and current trends in radiance field research for XR (X standing for any type of reality) applications. This will be followed by an in-depth, live demonstration and tutorial on creating basic NeRF and 3DGS scenes for virtual reality (VR) using readily available open-source software such as immersive-ngp and the UnityGaussianSplatting toolkit. To conclude, we will summarize the key takeaways and facilitate an interactive discussion with the audience, exploring future trends and developments in radiance fields for XR. https://mediated-reality.github.io/rf4xr/vr25tutorial
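Both NeRF rendering and 3DGS rasterization ultimately form each pixel by alpha-compositing many small contributions along the viewing ray. As a companion to the view-synthesis principles covered in the overview, here is a minimal NumPy sketch of that compositing step (our own illustration, not code from the tutorial materials):

```python
import numpy as np

def composite_along_ray(densities, colors, deltas):
    """Composite per-sample colors into a single pixel color.

    densities: (N,) non-negative volume densities (sigma) at samples along the ray
    colors:    (N, 3) RGB colors at those samples
    deltas:    (N,) distances between consecutive samples
    """
    # Opacity of each ray segment: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance T_i = prod_{j<i} (1 - alpha_j): chance the ray reaches sample i
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alphas[:-1])))
    weights = trans * alphas  # contribution of each sample to the pixel
    return (weights[:, None] * colors).sum(axis=0)

# Example: a faint green sample in front of a dense red one
pixel = composite_along_ray(
    densities=np.array([0.5, 10.0]),
    colors=np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]),
    deltas=np.array([0.1, 0.1]),
)
print(pixel)  # mostly red, tinted by the semi-transparent green
```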
Tutorial 3: Building Social Extended Realities with Ubiq
Saturday, March 8, 2025, 10:45-12:30 (Saint-Malo, France, UTC+1), Room: Vauban 1
Organizers
Anthony Steed, Ben Congdon, Jingyi Zhang - University College London, United Kingdom
One of the most promising applications of extended reality (XR) technologies is remote collaboration. Social XR applications range from small multiplayer games to large-scale virtual conferences with hundreds of attendees. This tutorial will guide participants in building their own social XR systems using Ubiq, an open-source framework developed at University College London, drawing on decades of experience in practical collaborative virtual environments. The session will begin with fundamental concepts before providing an overview of Ubiq. While basic features are covered in existing online tutorials, this session will focus on newer capabilities such as broad device support, web resource integration, and WebXR deployment. To make the tutorial interactive, we invite participants to submit questions and challenges beforehand, allowing us to incorporate them into the presentation. If you want to know how something might be done in Ubiq, please ask us!
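As background for the session, the following is a deliberately simplified Python sketch of the room-based message fan-out pattern that social XR systems build on: peers join a room, and messages are relayed to all other peers. The class and method names are our own illustration, not Ubiq's actual API (Ubiq itself is a Unity/C# framework):

```python
from dataclasses import dataclass, field

@dataclass
class Peer:
    name: str
    inbox: list = field(default_factory=list)  # messages delivered to this peer

@dataclass
class Room:
    peers: list = field(default_factory=list)

    def join(self, peer):
        self.peers.append(peer)

    def broadcast(self, sender, message):
        # Relay the message to every other peer in the room
        for peer in self.peers:
            if peer is not sender:
                peer.inbox.append((sender.name, message))

room = Room()
alice, bob = Peer("alice"), Peer("bob")
room.join(alice)
room.join(bob)
room.broadcast(alice, {"type": "avatar_pose", "position": [0.0, 1.6, 0.0]})
print(bob.inbox)  # [('alice', {'type': 'avatar_pose', 'position': [0.0, 1.6, 0.0]})]
```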
Tutorial 4: Reviewing and Publication Models in the Era of Online Publishing and Generative AI
Saturday, March 8, 2025, 10:45-12:30 (Saint-Malo, France, UTC+1), Room: Vauban 2
Organizers
J. Edward Swan II, PhD - Bagley College of Engineering, Mississippi State University, USA
This tutorial will discuss the social context that mediates scientific communication, including reviewing practices for scientific papers and proposals, and how these practices motivate researchers as they seek career success. The tutorial will survey current and emerging thinking on how this context might be optimally tuned to produce scientific results that are trustworthy and lead to scientific career advancement. In particular, reviewing practices face a number of growing threats, including predatory publication outlets and paper mills that can generate fraudulent papers (now enabled by generative AI). These are in addition to longstanding issues that have affected our own mixed reality community, including ever-increasing reviewing loads, variable reviewer quality, and reviewer fatigue and burnout. We are not the only scientific community facing these challenges, so a goal of this tutorial is to examine solutions being proposed by other communities. In addition, the tutorial will cover the latest thinking from the open science community, a group of scholars who consider the scientific process broadly as a social context that mediates scientific communication.
Tutorial 5: MATRIX: Multimodal Acquisition and Transformation for Realtime Interaction in XR
Saturday, March 8, 2025, 14:00-15:45 (Saint-Malo, France, UTC+1), Room: Charcot
Organizers
Sean Andrist, Dan Bohus - Interactive Multimodal AI Systems (IMAIS) Group, Microsoft Research
Cedric Dumas, Aurelien Milliat - IMT Atlantique, LS2N, UMR CNRS 6004, F-44307 Nantes, France
The MATRIX tutorial focuses on the collection and real-time analysis of data from multiple sensors within extended reality (XR) environments in the context of research experiments. The goal of the tutorial is to give you the key elements to code your own pipeline for your next XR experiment. The evolution of sensors and data-analysis technologies enables faster, more autonomous ways to study human behavior. XR environments can thus be built with efficient multimodal signal acquisition and synchronization to record complex datasets reflecting users' activity. The growth of machine learning capabilities and the availability of efficient trained models present a unique opportunity to explore tools that improve the processing and analysis of these complex datasets, enhancing the ability to conduct (real-time) analysis. The tutorial is dedicated to exploring the open-source \psi (Platform for Situated Intelligence) framework from Microsoft Research, providing participants with both theoretical insights and hands-on experience. First, Microsoft researchers will give a quick overview of the \psi framework. Then a demonstration and hands-on coding session will focus on integrating \psi within a Unity environment. Participants will be encouraged to share their thoughts and questions on how they can incorporate the framework into their own research projects. This tutorial aims to equip participants with practical knowledge and insights to start using the \psi framework in their current and future research initiatives. Attendees should have prior experience with C# development, Visual Studio, and Unity.
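Since \psi is a .NET/C# framework, the following is only a conceptual Python sketch of one idea at the heart of such pipelines: pairing samples from two independently clocked sensor streams by nearest timestamp, a fusion \psi provides natively through its stream join operators. All names here are illustrative assumptions, not the \psi API:

```python
import bisect

def join_nearest(stream_a, stream_b):
    """Pair each (t, value) sample of stream_a with the nearest-in-time sample
    of stream_b. Both streams must be sorted by timestamp and non-empty."""
    times_b = [t for t, _ in stream_b]
    fused = []
    for t, value_a in stream_a:
        i = bisect.bisect_left(times_b, t)
        # Candidate neighbors on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(stream_b)]
        j = min(candidates, key=lambda k: abs(times_b[k] - t))
        fused.append((t, value_a, stream_b[j][1]))
    return fused

# Fuse 30 Hz gaze samples with 4 Hz heart-rate samples (toy data)
gaze = [(t / 30.0, {"yaw_deg": float(t)}) for t in range(5)]
heart_rate = [(t / 4.0, 60 + t) for t in range(3)]
print(join_nearest(gaze, heart_rate))
```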
Tutorial 6: Interaction Design for Extended Reality
Saturday, March 8, 2025, 14:00-15:45 (Saint-Malo, France, UTC+1), Room: Vauban 1
Organizers
Mark Billinghurst, PhD - University of South Australia, Australia; University of Auckland, New Zealand
Joaquim Jorge, PhD - University of Lisboa, Portugal
The rapid growth of Extended Reality (XR) technologies has revolutionized how users interact with digital environments; however, designing XR systems can be very challenging. This tutorial outlines how Interaction Design principles can be adapted to create intuitive, engaging, and inclusive XR experiences. The tutorial will review Interaction Design techniques focusing on Design and Prototyping methods for XR. The session will provide practical techniques for designing and prototyping user-centered XR systems, with case studies and demonstrations of the latest tools to illustrate best practices. This session aims to equip attendees with the tools and insights to rapidly design and prototype XR experiences for research and real-world applications.
Tutorial 7: Responsible AI and Ethics in Extended Reality for Education
Saturday, March 8, 2025, 14:00-18:00 (Saint-Malo, France, UTC+1), Room: Vauban 2
Organizers
Marios Constantinides - CYENS Centre of Excellence, Cyprus & University College London, United Kingdom
Fotis Liarokapis - CYENS Centre of Excellence, Cyprus
Nuria Pelechano - Universitat Politecnica de Catalunya, Spain
Joaquim Jorge - Tecnico Lisboa, Portugal
The rapid evolution of Extended Reality (XR) technologies and the integration of Artificial Intelligence (AI) have unlocked new possibilities for applications across various domains, including education. However, these advancements also bring significant ethical challenges and concerns about fairness and inclusivity. This tutorial explores the intersection of AI, XR, and education, focusing on responsible AI development. The tutorial is structured into four sessions. The first session will introduce the principles of Responsible AI development, addressing challenges such as privacy, bias, explainability, and governance. The second session will explore XR's role in transforming education, highlighting opportunities and barriers to adoption. The third session will examine how generative AI could potentially enhance XR-based education by discussing applications such as AI-generated 3D content, adaptive learning experiences, and virtual tutors. The final session will explore the ethical implications of avatar representations, emphasizing fairness in identity construction and cultural inclusivity. This tutorial will provide participants with a comprehensive understanding of Responsible AI in XR education. It will equip researchers, educators, and developers with practical insights to design ethical, inclusive, and impactful AI-enhanced XR learning environments. Through interactive discussions and real-world case studies, attendees will critically engage with ethical dilemmas and develop actionable strategies for ensuring responsible development in XR-based education. This tutorial has received funding from the European Union's Horizon Europe research and innovation programme under grant agreement No 101093159.
Tutorial 8: Introduction to 3D Sound: A Tutorial on Spatial Audio Perception and Unity Integration
Saturday, March 8, 2025, 16:15-18:00 (Saint-Malo, France, UTC+1), Room: Charcot
Organizers
Joanna Luberadzka - Eurecat, Spain
Sound plays a fundamental role in how we perceive and interact with the world. Spatial audio enhances this experience by simulating three-dimensional sound, making virtual environments more immersive and realistic. This tutorial introduces participants to the principles of 3D sound and its integration in VR. The session will be divided into three parts. First, we will explore the basics of spatial audio perception, focusing on how the human auditory system localizes sounds. Next, we will discuss methods for simulating spatial sound in VR applications. Finally, we will learn about tools that simplify 3D audio integration in Unity. Attendees will learn how to set up audio sources, modify sound properties through scripting, and explore open-source libraries for spatialization. This introductory tutorial is ideal for researchers and VR developers seeking to enhance their projects with immersive audio. No prior knowledge is required.
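As a taste of the perception material in the first part, here is a minimal Python sketch (our illustration, not the tutorial's code) of one classic binaural localization cue, the interaural time difference (ITD), approximated with Woodworth's spherical-head model, ITD(theta) = (a / c) * (theta + sin(theta)):

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average head radius (~8.75 cm)
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def itd_woodworth(azimuth_deg):
    """Interaural time difference in seconds for a source at the given azimuth
    (0 = straight ahead, 90 = directly to one side), per Woodworth's model."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (math.sin(theta) + theta)

for azimuth in (0, 30, 60, 90):
    print(f"azimuth {azimuth:2d} deg -> ITD {itd_woodworth(azimuth) * 1e6:6.1f} us")
```

At 90 degrees this yields roughly 650 microseconds, close to the maximum ITD measured for human listeners.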
Tutorial 9: A Hands-on Introduction to PLUME - a Toolbox to Record, Replay, and Analyze XR Experiments
Saturday, March 8, 2025, 16:15-18:00 (Saint-Malo, France, UTC+1), Room: Vauban 1
Organizers
Charles Javerliat, Sophie Villenave, Pierre Raimbaud, Guillaume Lavoue - Centrale Lyon ENISE, LIRIS CNRS, France
Exhaustive and standardized data collection, complementing self-reported data (i.e., questionnaires or semi-structured interviews), is crucial for understanding human behavior in XR experiments. However, researchers often rely on custom, ad-hoc solutions to record behavioral (controller inputs, movements, interactions) and physiological data (EEG, ECG, EDA, HR, etc.). These solutions can impact the application's performance and produce incomplete datasets with non-standard formats. These issues hinder the replication of studies and the production of large-scale datasets for meta-analysis and statistical modeling. To address these limitations, we introduced PLUME (https://liris-xr.github.io/PLUME), an open-source toolbox for recording and analyzing virtual experiences made with Unity (2D, 3D, XR). PLUME records synchronized behavioral and physiological data, along with their 3D context. The data can be used in PLUME Viewer for contextualized 3D replay and in-situ analysis (trajectories, heatmaps, etc.), and in PLUME Python for ex-situ statistical analysis. By facilitating standardized data recording and analysis, PLUME enhances replicability and supports large-scale statistical modeling and machine learning applications. This tutorial provides a hands-on introduction to PLUME, guiding attendees through its installation, usage, and analysis capabilities. Participants will learn to: (1) install PLUME Recorder in a Unity project for automated data collection and create a recording, (2) navigate PLUME Viewer to replay and analyze recordings in the context of their virtual environment, and (3) use the PLUME Python API to extract data for downstream tasks. The session will conclude with an open discussion to gather feedback and explore future improvements. This tutorial was supported by the Auvergne-Rhone-Alpes region as part of the PROMESS project and the French National Research Agency as part of the RENFORCE project (ANR-22-CE38-0008).
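As a flavor of the downstream tasks in step (3), here is a minimal Python sketch of an analysis on extracted head-position samples; note that the loader call shown in the comment is a hypothetical placeholder, not PLUME Python's documented API:

```python
import numpy as np

def trajectory_length(positions):
    """Total path length (meters) of an (N, 3) array of head positions."""
    return float(np.linalg.norm(np.diff(positions, axis=0), axis=1).sum())

# Hypothetical extraction from a PLUME recording (the call below is an
# assumption for illustration, not PLUME Python's actual API):
# positions = plume_python.load("session.plm").head_positions()
positions = np.array([[0.0, 1.6, 0.0],
                      [0.5, 1.6, 0.0],
                      [0.5, 1.6, 1.0]])
print(f"head traveled {trajectory_length(positions):.2f} m")
```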
Tutorial 10: Production of Immersive VR Games for Cultural Heritage
Sunday, March 9, 2025, 8:30-10:15 (Saint-Malo, France, UTC+1), Room: Vauban 2
Organizers
Selma Rizvic, PhD - Faculty of Electrical Engineering, University of Sarajevo, Bosnia and Herzegovina
Bojan Mijatovic, MA - Sarajevo School of Science and Technology, Bosnia and Herzegovina
The goal of this tutorial is to show the production workflow for creating a VR serious game for cultural heritage presentation. The game is produced following our Advanced Interactive Digital Storytelling (A-IDS) methodology, which extends interactive digital storytelling with gameplay. We will present the whole production workflow: from the story idea, through the creation of scenarios for the educational and gameplay parts, filming actors on green screen, filming 360° videos, compositing VR videos with actors and ambisonic sound, and digitizing museum exhibits, to VR application development and finalization of the product. The tutorial will show how interdisciplinary experts from archeology, history, screenplay writing, filming, visual arts, make-up and costume design, set design, sound and music production, and software development collaborate to create a high-quality educational game for museums. The presented case study is the SHELeadersVR application about female rulers from medieval Balkan countries, installed in five museums in Bosnia and Herzegovina, Serbia, Montenegro, North Macedonia, and Albania.
Tutorial 11: Building and Regulating the Metaverse
Sunday, March 9, 2025, 10:45-12:30 (Saint-Malo, France, UTC+1), Room: Vauban 2
Organizers
Anthony Steed - University College London, United Kingdom
Doron Friedman, Dov Greenbaum - Reichman University, Herzliya, Israel
Piotr Skrzypczynski - Poznan University of Technology, Poland
Marco Gillies - Goldsmiths, University of London, United Kingdom
Raul Ruiz Rodriguez - University of Alicante, Spain
The metaverse is a shared virtual space blending digital and physical realities, accessed through VR/AR or traditional devices, where users socialize or collaborate. First popularized in Neal Stephenson's 1992 novel Snow Crash, elements of the metaverse concept are now visible in social virtual reality applications. Millions use these platforms for socializing and collaboration, though most are siloed systems with limited interconnectivity. A consolidated approach would provide greater accessibility for developers and users. There are many technical challenges to overcome in the move towards metaverse-like platforms. However, ethical and regulatory challenges must also be addressed. As the metaverse becomes more accessible and interconnected, proper regulation and enforcement will be crucial. This tutorial will explore how metaverse-like systems are built and managed, examining the technical and societal challenges in scaling up from today's VR platforms.
Tutorial 12: Imagined Future - A Historical View on Converging Technologies and How Sci-Fi's Predictions Sometimes Become Reality
Sunday, March 9, 2025, 14:00-15:45 (Saint-Malo, France, UTC+1), Room: Vauban 2
Organizers
Santeri Saarinen - Helsinki XR Center, Metropolia UAS, Finland
This tutorial offers a retrospective on different technologies that are currently converging in immersive systems. Our goal is to gain a broad view of where different technologies come from. We will also look at science fiction over the years to understand how these depictions of technology might have encouraged researchers to investigate how they could be made real. From 'War of the Worlds' to 'Snow Crash', we will consider innovative technologies presented in fictional works, trying to understand how similar they are to current technology. Starting from early analog technologies, we will go through the main innovations of different sectors: 1) Computer Science and Artificial Intelligence, looking at the development of early computers and cryptographic tools during World War II as well as the large language models and Internet of Things solutions of today; 2) Networks and Mobile Devices, discussing the development from microwave links to USENET, the World Wide Web, and 5G networks, while looking at the simultaneous evolution of mobile devices; 3) Virtual Worlds and Gaming, traveling the path of video games from Spacewar! to Pokemon Go, focusing on virtual spaces and multiplayer interaction; and 4) Extended Reality and Spatial Computing, reminiscing over multiple generations of head-mounted displays and embodied interaction, from the Sensorama to the Vision Pro. We end in the current day, where these separate fields are converging, leading to a more heterogeneous future in which the lines between different technologies become subtler and systems combine a multitude of technologies.