NSF Research Experiences for Undergraduates Program
Making Augmented and Virtual Reality Accessible
Summer 2022 | New York City, NY
This REU Site brings 8 students per year to the campuses of Cornell Tech and Columbia University in NYC to participate in projects that make XR more accessible to people with disabilities.
You’re invited to join us for final presentations from students in the Summer 2022 Accessible XR REU program.
Applications for the Summer 2022 program are currently closed. Please check back in early Spring 2023 for information about the Summer 2023 program.
About the Program
Augmented and virtual reality (XR) technologies are popular consumer products. But the devices, development and authoring tools, and content that make up the XR ecosystem pose barriers for people with disabilities. How can you present a 3D virtual environment to a person with a visual impairment? How should you provide captions in 3D to someone who is hard of hearing? How can a person with a motor control disability navigate a 3D world?
This REU site is led by Shiri Azenkot at Cornell Tech and Brian Smith at Columbia University, both experts in accessibility, along with Steven Feiner, an augmented reality and 3D interaction pioneer. They are recruiting students interested in XR accessibility who are looking to gain experience in the research process—from ideation to presentations—and who would like to join a community of scholars and practitioners who will provide career advice and mentorship.
Students will work on projects at the intersection of accessibility and AR/VR. Examples of possible projects include:
- Enhancing the Accessibility of VR for Blind People with Environment Descriptions: VR applications can be made more accessible to blind people with audio feedback from objects in the environment. How can we make this feedback more useful? In this project, students will prototype and test different types of audio description and feedback that might help blind people navigate VR environments.
- Improving AR Interaction and Control: With head-worn AR devices becoming mainstream, it’s important to understand how these devices can be used to help people interact with everyday virtual content when manual interaction isn’t possible. In this project, students will build on existing work that takes advantage of the precise head position and orientation available with current headsets to create more accessible and useful interaction techniques for AR.
- Blind-Accessible Video Games: We are currently developing a 3D adventure game in the Unity game engine that can be played using sound alone. Our goal is to let visually impaired players freely explore game worlds as they choose, using assistive tools that we are designing. We plan to run a user study with visually impaired players to test the effectiveness of these tools.
In addition to working on research projects, students will participate in other informal learning and social activities. The program will begin with an XR development “boot camp,” which will help students get up to speed on useful techniques in XR content creation. Students will also attend weekly seminars and conversations with invited experts in academia and industry. Students will present their work formally at the midpoint and end of the program. In addition, the mentors will organize social events to build camaraderie, taking advantage of the rich culture of New York City.
Program Dates: June 6–August 5, 2022.
Program Location: New York City. We plan to operate in person, but applicants should be aware that the program may move to a virtual format should conditions require.
Stipend: Participants receive a living stipend and free housing on the campus of Columbia University.
Explore Past Projects
Video games have yet to offer visually impaired players an equivalent experience to what sighted players enjoy. Sighted players use minimaps within games to learn how their surrounding environment is laid out, but there is no effective analogue to the minimap for visually impaired players. A major accessibility challenge is to create a generalized, acoustic (non-visual) version of the minimap for visually impaired players.
To address this challenge, we develop and investigate four acoustic minimap techniques that represent a breadth of ideas for how an acoustic minimap might work: a companion smartphone app, echolocation, a directional scanner, and a simple menu. Each technique is designed to communicate information about the area around the player within a game world, providing functionality analogous to a visual minimap but in acoustic form.
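The core idea behind a directional-scanner-style acoustic minimap can be illustrated with a small sketch: map each nearby object's position, relative to the player's facing direction, to a stereo pan and a distance-based volume. This is a hypothetical illustration of the general mapping, not the implementation used in the study; the function name and parameters are assumptions for this example.

```python
import math

def acoustic_cue(player_pos, player_yaw, obj_pos, max_range=20.0):
    """Map an object's 2D position relative to the player to a stereo
    pan (-1 = hard left, +1 = hard right) and a volume (0..1) that
    falls off linearly with distance. Positions are (x, z) tuples on
    the ground plane; yaw is the player's heading in radians.
    Hypothetical sketch of an acoustic-minimap cue, not the paper's code."""
    dx = obj_pos[0] - player_pos[0]
    dz = obj_pos[1] - player_pos[1]
    dist = math.hypot(dx, dz)
    if dist > max_range:
        return None  # outside the minimap's range: no cue is played

    # Bearing of the object relative to the player's facing direction.
    angle = math.atan2(dx, dz) - player_yaw
    # Normalize to (-pi, pi].
    angle = (angle + math.pi) % (2 * math.pi) - math.pi

    pan = math.sin(angle)             # left/right speaker placement
    volume = 1.0 - dist / max_range   # nearer objects sound louder
    return pan, volume
```

In a real game engine, the pan and volume would drive a spatialized audio source per object; the sketch only shows the geometric mapping that all four techniques must solve in some form.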
This research was funded by National Science Foundation Awards 2051053 and 2051060.
The opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.