K-12 education is currently undergoing a technological revolution that is creating opportunities for Virtual-, Augmented-, and Mixed-Reality based learning. An increasing number of classrooms are being equipped with interactive whiteboards, tablet devices, and personal student computers. This technology integration will continue to grow as mobile devices reach all socioeconomic strata, and as new VR/AR/MR technologies become affordable to schools. Classroom learning of the future could be assisted by multi-projector systems, touchscreen displays, head-mounted displays, and other immersive technologies.
These technological innovations have the potential to engage students in more effective kinds of learning than traditional approaches, by leveraging the affordances of VR/AR/MR media. Such affordances include the ability to engage students with interactive 3D simulations of real-life phenomena, to present information that is spatially and temporally integrated with real objects, and to leverage whole-body motions to depict and reinforce learning content.
One particular strength of these technologies is their ability to teach educational content through Embodied Learning, whereby students use their whole body to understand, experience, and interact with the learning content. Embodied learning can take many forms in which learning happens through motions of the physical body, such as: a handheld augmented-reality experience where the student moves their body around a plant in order to understand its internal structure and explore photosynthesis at different layers of abstraction; a CS programming course in which student creations are projected onto the classroom surfaces, and students program and collaborate by physically interacting with each other’s programs; or an HMD-based virtual-reality experience where the student solves mathematical equations by using their hands to physically move numbers from one side of the equal sign to the other.
Technology developers, HCI researchers, cognitive scientists, and learning-sciences researchers are beginning to understand the mechanisms and benefits of embodied learning, as well as the other unique affordances that make VR/AR/MR especially suited for education. But many questions remain about the integration of such experiences into the classroom, such as: What curriculum topics should (and should not) be addressed through such technologies? What psychological mechanisms underlie embodied learning and the other unique affordances of VR/AR/MR technology? How can we design experiences to be usable by children of different ages? How will classroom relationships and pedagogical approaches be influenced by such technologies?
In this workshop we aim to bring together developers and researchers who are interested in creating educational experiences for the classroom of the future. The workshop will expose participants to different approaches for integrating virtual-, augmented-, and mixed-reality technologies, and foster discussion of those approaches, specifically focusing on the challenges and potential of embodied learning in the classroom.
Emily Reardon is the Director of User Experience and part of the Learning Design team for Digital Production at Sesame Workshop, the not-for-profit organization behind Sesame Street. Prior to that she was the Director of Design Strategy in the Workshop’s Content Innovation Lab, a small research and development team devoted to exploring emergent technology and new ways for children and families to play and learn. Reardon is an Adjunct Professor at New York University’s Graduate School of Education, where she teaches Architecture of Learning Environments as well as Narrative, Digital Media, and Learning. An Emmy Award-winner for her work at Sesame Workshop, Reardon has contributed to a wide variety of industry initiatives and events, including serving as co-chair of the International Conference on Interaction Design and Children and authoring several peer-reviewed academic publications. Reardon holds a Bachelor of Arts degree in Art/Semiotics as well as English and American Literature from Brown University, and a Master of Arts in Education, Communication, and Technology from New York University.
We welcome thought-provoking position papers, case studies, and preliminary research results on topics related to VR/AR/MR learning:
- VR, AR & MR Technologies and Applications for the Classroom
- Embodied Cognition and Learning
- User Experience Design for Children
- Curriculum-based Educational Applications
- Student-Teacher Relationships and Pedagogical Implications
- Classroom Integration of Technology
We expect the audience to include attendees of the IEEE Virtual Reality 2016 conference, specifically those interested in educational technology:
- Academic researchers in augmented / virtual / mixed reality
- Learning psychologists
- Industry organizations for children’s education
- Teachers and educational researchers
- Informal education technology designers
Deadlines and Submission Format
- Paper submission deadline: ***EXTENDED*** February 12, 2016
- Notification of acceptance: February 20, 2016
We seek contributions in the following formats:
I) Research Papers (4-6 pages): Novel results in the above-mentioned categories.
II) Position Papers (2-4 pages): Interesting, and possibly controversial, points of view and approaches intended to foster discussion at the event.
Papers must be written in English and follow the IEEE Computer Society format found at:
Non-anonymized submissions should be emailed to email@example.com
The workshop will be a half-day event consisting of presentations from a keynote speaker and selected authors, followed by discussions on specific topics of interest to the workshop audience. Authors will be invited to give 5-15 minute presentations prior to the workshop discussions, with the duration depending on the number of papers accepted. Discussion topics will be chosen according to the participant submissions; possible topics include curriculum topics suitable for AR/VR/MR, classroom integration issues, evaluation methodologies, and directions for future research.
About the Organizers
- Iulian Radu, Ph.D. Candidate in Human Centered Computing, Georgia Institute of Technology
- Dr. Blair MacIntyre, Director of the Augmented Environments Lab, Georgia Institute of Technology
- Dr. Maribeth Gandy, Director of the Interactive Media Technology Center, Georgia Institute of Technology
Iulian Radu is a Ph.D. Candidate at the Georgia Institute of Technology. He has extensive experience in the research and development of children’s technology, gained during his doctoral studies as well as through industrial work with organizations such as PBS Kids and Samsung Electronics. While working with PBS Kids under the Ready To Learn initiative, he directed the design, research, and production of augmented-reality applications for education, including the educational game Cyberchase Shape Quest (officially featured on the iTunes store, and a nominee for the Webby and iKids awards). In his current academic research, he has published on the usability, psychology, and educational aspects of augmented reality for children, and has developed multiple educational AR applications, including an augmented-reality extension of the popular Scratch programming environment.
Dr. Blair MacIntyre is a Professor in the School of Interactive Computing at the Georgia Institute of Technology, and directs the GVU Center’s Augmented Environments Lab. His research focuses on developing the potential of augmented reality as a novel technology and new medium for games, entertainment, education and work. He has published more than 100 research papers, is actively involved with industry as a consultant, and is regularly interviewed in the media about augmented reality, games and mobile technology. He received a Ph.D. from Columbia University in 1998, and B.Math and M.Math degrees from the University of Waterloo in 1989 and 1991. He is the recipient of an NSERC Postgraduate Scholarship and an NSF CAREER award.
Dr. Maribeth Gandy is the Director of the Interactive Media Technology Center, whose research and development focuses on interactive systems for mobile & wearable computing, augmented reality, gaming & entertainment, sensing & pattern recognition, assistive technology, and health systems. She is a three-time graduate of the Georgia Institute of Technology, having received a Ph.D. and an M.S. in Computer Science and a B.S. in Computer Engineering. Her research emphasis is on augmented reality, specifically authoring and evaluation techniques. She also leads several research projects on the use of gaming experiences for rehabilitation, wellness, cognitive therapy, training, and assessment.