Each edition of “ITS In-Depth” takes a closer look at a hot topic in information technology. With extended reality (XR) rapidly blurring the lines between science fiction and, well, reality, we talked with ITS Online Learning Services Instructional Analyst Jason Webb G’18 to learn how XR is shaping teaching, learning and research at Syracuse University.

What is the difference between virtual reality, augmented reality, mixed reality, and extended reality? Are any of those terms interchangeable or do they each mean a specific thing? 

Extended reality is the umbrella term for virtual, mixed and augmented reality. Reality is reality, but extended reality adds to that space or substitutes for it by enhancing it with a digital interface. Augmented reality can add digital elements. Mixed reality is a combination of being in reality but interfacing in a digital world. Virtual reality is the farthest out in that the user is immersed in a digital world that creates the illusion of reality.

Overview of Extended Reality Varieties

Extended reality: the entire experience, from completely virtual to completely real.
Virtual reality: the user is completely immersed in a virtual environment.
Mixed reality: "environment-aware" 2D or 3D content is overlaid onto the physical space.
Augmented reality: "non-environment-aware" 2D or 3D content is overlaid onto the physical space.

Is extended reality becoming more commonplace in higher education—whether in the “classroom” or in research/creativity?

Over the last four years, we have seen a huge swing in the use of XR in the classroom, whether for medical, industrial, storytelling or STEM classes. As the technology becomes more powerful and better designed, it is getting easier to access for both consumption and development.

What are some examples of XR activities at Syracuse University? 

Currently, professors are using it to create interactions for researching media bias, discussing industrial design and architectural processes, and exploring wellness and course content. Some professors are using social VR apps to meet online as avatars in a "virtual physical" space similar to a lecture hall or classroom.

This past summer, I led an XR in Research class where six professors and their teaching assistants learned how to utilize Unreal Engine XR and HP’s new Reverb G2 Omnicept VR system to create XR experiences that measure cognitive load, heart rate and eye tracking. The group will create an educational experience that will be showcased in January 2022.

What do you think is the future of XR in education? What are some ideas or possibilities that you and your XR colleagues are interested in exploring?  

I think XR's role in education will grow exponentially, especially in its capabilities for online classes and remote learning, where students can collaborate in real time in a "physical" space. I am currently looking at how to decrease the lift of designing interactions by using photogrammetry and volumetric capture to create realistic spaces to meet in or build for classes.

Makana Chock, David J. Levidow Professor of Communications at the Newhouse School, just received a Facebook Reality Lab grant to explore the impacts of augmented and virtual reality on bystander privacy. Several departments also are looking at how to include social VR and XR development in their curricula to prepare students for future career opportunities.

How would a member of the Syracuse University community (student, faculty or staff) get more involved with XR? Is there a club or a space that focuses on XR? 

Students and faculty can join our XR at Syracuse group, which meets virtually twice a semester, and they can reach out to me (jmwebb02@syr.edu) to join the group on Blackboard. We also have the extendedreality.syr.edu website, which provides updated information on XR projects on campus.