
XR Access Research Network

The XR Access Research Network fosters a diverse community of researchers pursuing academic-style research across fields and disciplines, united by a shared interest in making XR technologies accessible and in enabling innovative research, new collaborations, resource sharing, and translational impact.

Join the research network for updates on current research, funding opportunities, research recruitment, and more!

The Research Network holds regular seminars and conversations, supports emerging researchers at the undergraduate and graduate levels, and creates networking and collaboration opportunities among research groups and between academia and industry. We also work to bridge the gap between researchers and disabled communities.

If you’d like to participate in the Research Network, we’d love to hear from you! You can join as an individual researcher via our application form. If you have any questions, feel free to contact us via email at info@xraccess.org.

If you’d prefer to participate in research directly, we’re always looking for research participants, especially those with disabilities and/or XR experience.

Past Seminars

All are welcome to register for the XR Access Research Network’s Seminar Series. Stay tuned for an announcement of the next seminar.

REU Showcase 2023

August 4th

On August 4th, our 2023 Research Experiences for Undergraduates (REU) cohort presented their XR accessibility projects. Learn about neurodivergent-friendly VR, saliency in 360° videos, and more!


Augmented Reality Obstacle Avoidance | Dylan Fox

January 31, 2023

How can augmented reality amplify vision to support obstacle navigation for people with limited sight? Join UC Berkeley researcher and XR Access Head of Community and Outreach Dylan Fox to learn about Augmented Reality Obstacle Avoidance. This research project showcases the opportunities and challenges faced in leveraging AR as an assistive technology, such as using digital cues to improve the contrast of physical obstacles and pointing out hazards that users might miss. With AR devices growing ever more popular, this work hints at how they may follow the smartphone in becoming powerhouses of accessible technology.
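
To make the idea of a contrast-improving digital cue concrete, here is a minimal sketch (not code from the Berkeley project) of how an AR system might pick an outline color for a detected obstacle: it computes WCAG-style contrast ratios against the background sampled behind the obstacle and chooses the most visible color from a small palette. The palette and background color below are illustrative assumptions.

```python
# Sketch: choose the outline color for a detected obstacle that contrasts most
# strongly with the background sampled behind it, using the WCAG 2.x
# relative-luminance and contrast-ratio formulas. The palette and background
# color below are illustrative assumptions.

def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB color given as 0-255 integers."""
    def linearize(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio between two colors, from 1.0 to 21.0."""
    la, lb = relative_luminance(color_a), relative_luminance(color_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

def best_outline_color(background_rgb, palette):
    """Pick the palette color with the highest contrast against the average
    background color sampled around the obstacle."""
    return max(palette, key=lambda color: contrast_ratio(color, background_rgb))

if __name__ == "__main__":
    background = (40, 42, 45)  # hypothetical dark pavement behind the obstacle
    palette = [(255, 255, 0), (255, 0, 0), (0, 0, 255), (255, 255, 255)]
    print(best_outline_color(background, palette))  # prints the most visible cue color
```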

Accessible XR Summer Research Program | Final Presentations

August 5, 2022

Each summer, Cornell Tech and Columbia University welcome undergraduates from across the country to their campuses for an NSF-funded Research Experiences for Undergraduates program in accessible XR.

This year, eight students spent the summer learning research methods, building prototypes, and testing new ideas for projects that make XR more accessible to people with disabilities.

You’re invited to view final presentations from the REU project groups on Friday, August 5th at 1pm ET. Students will provide background on their projects and demonstrate the technologies they’ve created. Please join us in congratulating our hard-working students on their successes during this program!

Register for the presentations here.

Research Network Projects

ISMAR 2024 – IDEATExR Workshop

CFP: Fourth Workshop on Inclusion, Diversity, Equity, Accessibility, Transparency and Ethics in XR (IDEATExR), to be held in conjunction with the 2024 International Symposium on Mixed and Augmented Reality (ISMAR), October 21-25, 2024 | Greater Seattle Area, Washington, USA.

AI Scene Descriptions

Authors: Ricky Gonzalez, Jazmin Collins, Shiri Azenkot, Cindy Bennett. Cornell Tech PhD students Ricky Gonzalez and Jazmin Collins, XR Access co-founder Shiri Azenkot, and Google accessibility researcher Cindy Bennett are investigating the potential of AI to describe scenes to blind and low vision (BLV) people. The team developed an iOS application called Eyevisor that simulated the use of Seeing AI to collect data about why, when, and how BLV people use AI to describe visuals to them. The study's results point to a variety of unique use cases for which BLV users would prefer to [...]

IEEE VR 2024 – IDEATExR Workshop

XR Access will be cosponsoring a workshop on Inclusion, Diversity, Equity, Accessibility, Transparency and Ethics in XR (IDEATExR) in conjunction with the IEEE VR 2024 conference in Orlando, Florida, USA, March 16-21, 2024. Keynote Speaker: Dr. Guo Freeman, Dean’s Associate Professor in Human-Centered Computing at Clemson University, will be presenting her talk “Designing Inclusive and Safe Social VR Systems to Protect Marginalized Communities.”

Sighted Guides to Enhance Accessibility for Blind and Low Vision People in VR

Authors: Jazmin Collins*, Crescentia Jung*, Yeonju Jang (* indicates co-first authorship and equal contribution). Our first project explored using the sighted guide technique in VR to support accessibility for blind and low vision people. We created a prototype with a guidance system and conducted a study with 16 blind and low vision participants. Participants completed various tasks and explored the virtual environments with their guide. From this project, we uncovered participants’ preferences for a guide in VR and a broad design space for virtual guides.  We have begun a second project that explores using an AI-powered guide. We [...]

Exploring Self-Presentation for People with Invisible Disabilities in VR

Authors: Ria J. Gualano*, Lucy Jiang*, Kexin Zhang*, Andrea Stevenson Won, and Shiri Azenkot (* indicates co-first authorship and equal contributions). Through this project, we aim to fill a gap in current accessibility research by focusing on the experiences of people with invisible disabilities (e.g., chronic health conditions, mental health conditions, and neurodivergence) and their preferences regarding disability disclosure practices in virtual embodied spaces. We conducted semi-structured interviews with 9 participants, some of whom identified as neurodiverse and others as having chronic conditions such as rheumatoid arthritis. Of the 9 participants, 4 were Caucasian, 2 were Mixed-Race, 1 was [...]

Making 360° Videos Accessible to Blind and Low Vision People

Authors: Lucy Jiang, Mahika Phutane, Shiri Azenkot. While traditional videos are typically made accessible with audio description (AD), we lack an understanding of how to make 360° videos accessible while preserving their immersive nature. Through individual interviews and collaborative design workshops, we explored ways to improve 360° video accessibility beyond applying traditional AD methods. We conducted design workshops with blind and low vision (BLV) and sighted AD creators, who collaboratively developed accessible 360° video prototypes. We found that AD experts value accurate vocabulary and different points of view for immersive descriptions, prefer a variety of presentation locations for spatialized AD, appreciate sound effects [...]
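
As a rough illustration of what spatialized AD could involve (this is not the study's system), the sketch below pans a description toward the part of the 360° scene it refers to, given an assumed target yaw for the described region and the viewer's current head yaw, using simple equal-power stereo panning.

```python
# Sketch: pan an audio description toward the region of a 360° video it
# describes, relative to where the viewer is currently looking.
# Equal-power panning; angles in degrees. All names here are illustrative.
import math

def relative_yaw(target_yaw_deg, head_yaw_deg):
    """Angle of the described region relative to the viewer's gaze, in (-180, 180]."""
    return (target_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

def stereo_gains(target_yaw_deg, head_yaw_deg):
    """Left/right gains that place the description toward its target.
    Straight ahead -> equal gains; 90 degrees to the right -> mostly right channel."""
    delta = relative_yaw(target_yaw_deg, head_yaw_deg)
    # Clamp to the frontal hemisphere and map to a pan angle in [-45, 45] degrees.
    pan = max(-90.0, min(90.0, delta)) / 2.0
    theta = math.radians(pan + 45.0)          # 0 -> hard left, 90 -> hard right
    return math.cos(theta), math.sin(theta)   # (left_gain, right_gain)

if __name__ == "__main__":
    # A description about something at yaw 120° while the viewer faces 30°.
    left, right = stereo_gains(target_yaw_deg=120.0, head_yaw_deg=30.0)
    print(f"left={left:.2f}, right={right:.2f}")  # biased toward the right ear
```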

Making Nonverbal Cues Accessible to Facilitate Interpersonal Interactions in VR

Authors: Crescentia Jung*, Jazmin Collins*, Yeonju Jang, Jonathan Segal (* indicates co-first authorship and equal contribution). This project explores how to make nonverbal cues accessible in VR for blind and low vision people. We first explored how to make gaze accessible in VR with a blind co-designer by co-creating a highly customizable prototype that uses audio and haptic feedback. From these initial findings, we have started to explore additional nonverbal cues such as nodding, shaking one’s head, smiling, and frowning.  We plan to develop prototypes with these additional cues and conduct a formative design study with blind and [...]
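
To illustrate the general shape of such a system (the cue names, settings, and feedback types below are assumptions, not the prototype's actual design), here is a minimal sketch that routes detected nonverbal cues to user-customizable audio and haptic events.

```python
# Sketch: route detected nonverbal cues to audio and/or haptic feedback, with
# per-cue customization. Cue names, feedback types, and settings are
# illustrative assumptions rather than the prototype's actual design.
from dataclasses import dataclass, field

@dataclass
class CueSetting:
    audio: bool = True          # play an earcon or spoken label
    haptic: bool = False        # pulse a controller or wearable
    intensity: float = 0.5      # 0.0 (subtle) .. 1.0 (strong)

@dataclass
class NonverbalCueRouter:
    settings: dict = field(default_factory=lambda: {
        "gaze_at_me": CueSetting(audio=True, haptic=True, intensity=0.7),
        "nod": CueSetting(audio=True),
        "head_shake": CueSetting(audio=True),
        "smile": CueSetting(audio=True, intensity=0.3),
    })

    def on_cue(self, cue, speaker):
        """Return the feedback events a real system would dispatch to its
        audio/haptic backends for one detected cue."""
        setting = self.settings.get(cue)
        if setting is None:
            return []
        events = []
        if setting.audio:
            events.append(f"audio: '{speaker} {cue.replace('_', ' ')}' @ {setting.intensity}")
        if setting.haptic:
            events.append(f"haptic: pulse @ {setting.intensity}")
        return events

if __name__ == "__main__":
    router = NonverbalCueRouter()
    print(router.on_cue("nod", speaker="Alex"))
    print(router.on_cue("gaze_at_me", speaker="Sam"))
```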

Guide to Distributing Tools for VR Accessibility Accommodations

Authors: Jonathan Segal, Samuel Rodriquez, Akshaya Raghavan, Heysil Baez, Shiri Azenkot, Andrea Stevenson Won. The distribution of tools for improving the accessibility of VR is currently fragmented: companies usually share these frameworks on their own websites, and academics, when they publish such tools, often do so inconsistently. This makes it difficult to integrate these tools into other projects. When timelines are tight, especially if a project’s main focus is not accessibility, it is easy to skip adding accommodations. This project advocates for a standard method of sharing such tools, focusing on the Unity [...]
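
The project's actual distribution format is not described here, but as a sketch of what a standard, machine-readable description of an accessibility tool might look like, the hypothetical manifest below captures the kind of metadata (engine, tested versions, accommodations, dependencies) that would make such tools easier to discover and integrate. Every field name and value is an illustrative assumption.

```python
# Sketch: a hypothetical, standardized manifest for sharing a VR accessibility
# tool, so that projects can discover and integrate it consistently.
# Every field name and value below is an illustrative assumption, not the
# project's actual format.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class AccessibilityToolManifest:
    name: str
    version: str
    engine: str                       # e.g. "Unity"
    engine_versions: list             # engine versions the tool was tested with
    accommodations: list              # what the tool accommodates
    input_modalities: list            # how users interact with it
    license: str
    repository: str
    install_notes: str = ""
    dependencies: list = field(default_factory=list)

    def to_json(self):
        """Serialize the manifest so it can ship alongside the tool's package."""
        return json.dumps(asdict(self), indent=2)

if __name__ == "__main__":
    manifest = AccessibilityToolManifest(
        name="example-subtitle-toolkit",          # hypothetical tool
        version="0.1.0",
        engine="Unity",
        engine_versions=["2021.3", "2022.3"],
        accommodations=["captions", "adjustable text size"],
        input_modalities=["controller", "gaze"],
        license="MIT",
        repository="https://example.org/example-subtitle-toolkit",
    )
    print(manifest.to_json())
```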

Virtual Showdown

VR games are becoming common, but people with visual impairments are frequently left out of the fun. Some prior work has explored including audio cues as an accompaniment to visual VR experiences, while developers of accessible games have created audio-only versions of popular games for people with visual impairments. However, there’s very little academic research on accessible VR games, pointing to an opportunity to explore how to convey information about moving objects, and how to teach players who may never have played a non-virtual version of a game how to interact with virtual objects. Dr. Kyle Rector and her [...]
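
One common approach to conveying a moving object non-visually (a generic illustration, not Virtual Showdown's actual implementation) is to map the object's lateral position to stereo pan and its distance to pitch, as in the sketch below.

```python
# Sketch: convey a moving virtual ball with audio cues by mapping its lateral
# position to stereo pan and its distance to beep pitch. Generic illustration
# only; parameter names and ranges are assumptions.

def audio_cue(ball_x, ball_z, court_half_width=2.0, max_depth=10.0):
    """Map ball position (x: left/right in meters, z: distance in meters)
    to a pan in [-1, 1] and a beep frequency in Hz (closer -> higher)."""
    pan = max(-1.0, min(1.0, ball_x / court_half_width))
    closeness = 1.0 - max(0.0, min(1.0, ball_z / max_depth))
    freq_hz = 220.0 + closeness * (880.0 - 220.0)   # A3 (far) up to A5 (close)
    return pan, freq_hz

if __name__ == "__main__":
    # Ball slightly to the player's left and fairly close.
    print(audio_cue(ball_x=-0.8, ball_z=2.5))   # pan ~ -0.4, freq ~ 715 Hz
```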

AR Navigation

Navigating stairs can be a dangerous mobility challenge for people with low vision. Inadequate handrails, poorly marked steps, and other obstacles can reduce mobility and lead to accidents. While past research has proposed audio stair-navigation aids for blind people, no research had yet addressed this challenge for people with low vision. Cornell Tech PhD student Yuhang Zhao, along with Cornell Tech faculty member Shiri Azenkot and other researchers from Cornell Tech and Columbia University, built on this research to design several AR navigation systems for people with low vision. They tested both projection-based and optical see-through smart glasses systems [...]

Interactive 3D Models

Cornell Tech PhD student Lei Shi, faculty member Shiri Azenkot, and their collaborators are studying how to design educational 3D models for students with visual impairments. The researchers interviewed teachers of the visually impaired about the needs of their students, and demonstrated previously designed 3D-printed interactive models and computer vision-based educational tools. Based on interview results, they designed an augmented reality (AR) app that pairs with a 3D printed model. The model’s tactile features help students learn the physical features of a concept, while visual and audio cues from the AR app provide additional information. The study’s results [...]
