NSF Research Experiences for Undergraduates Program

Making Augmented and Virtual Reality Accessible

Summer 2023 | New York City, NY

Please note that XR Access will not be hosting an REU in 2024. Sign up for our newsletter or join our Slack to hear about future REU updates.

About the Program

This REU Site brings 8 students per year to the campuses of Cornell Tech and Columbia University in NYC to participate in projects that make XR more accessible to people with disabilities.

Augmented and virtual reality (XR) technologies have become popular consumer products. But the devices, development and authoring tools, and content that make up the XR ecosystem pose barriers for people with disabilities. How can you present a 3D virtual environment to a person with a visual impairment? How should you provide captions in 3D to someone who is hard of hearing? How can a person with a motor control disability navigate a 3D world?

This REU site is led by Shiri Azenkot at Cornell Tech and Brian Smith at Columbia University, both experts in accessibility, along with Steven Feiner, an augmented reality and 3D interaction pioneer. They are recruiting students interested in XR accessibility who are looking to gain experience in the research process—from ideation to presentations—and who would like to join a community of scholars and practitioners who will provide career advice and mentorship.

Program Dates: June 5–August 4, 2023.

Program Location: New York City. We are planning to operate in-person, but applicants should be aware that the program may move to a virtual format should conditions require.

Stipend: Participants receive a living stipend and free housing on the campus of Columbia University.

What You’ll Do

Students will work on projects at the intersection of accessibility and AR/VR. Examples of possible projects include:

  • Enhancing the Accessibility of VR for Blind People with Environment Descriptions: VR applications can be made more accessible to blind people with audio feedback from objects in the environment. How can we make this feedback more useful? In this project, students will prototype and test different types of audio description and feedback that might help blind people navigate VR environments.
  • Improving AR Interaction and Control: With head-worn AR devices becoming mainstream, it’s important to understand how these devices can help people interact with everyday virtual content when manual interaction isn’t possible. In this project, students will build on existing work that takes advantage of the precise head position and orientation available from current headsets to create more accessible and useful interaction techniques for AR (see the sketch after this list).
  • Blind-Accessible Video Games: We are currently developing a 3D adventure game in the Unity game engine that can be played using sound alone. Our goal is to let visually impaired players freely explore game worlds through assistive tools that we are designing, and we plan to run a user study with visually impaired players to test the effectiveness of those tools.
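
As a rough, engine-agnostic illustration of the head-based interaction idea in the second project above, here is a minimal Python sketch of dwell-based selection, where a target is “clicked” once the head ray rests on it for a set time. The class, thresholds, and data model are illustrative assumptions for this page, not the project’s actual code (a real implementation would more likely be a Unity C# script driven by the headset’s pose API).

```python
import math
import time

DWELL_SECONDS = 1.0    # how long the head ray must stay on a target to select it
ANGLE_THRESHOLD = 5.0  # max angle (degrees) between head ray and target direction

def angle_between(v1, v2):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    if n1 == 0 or n2 == 0:
        return 180.0  # degenerate direction: treat as "not looking at it"
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

class DwellSelector:
    """Fires a selection when the head ray dwells on one target long enough."""

    def __init__(self):
        self.current_target = None
        self.dwell_start = None

    def update(self, head_position, head_forward, targets):
        """Call once per frame. targets maps names to (x, y, z) world positions.
        Returns a target name once per completed dwell, else None."""
        hit = None
        for name, pos in targets.items():
            to_target = tuple(p - h for p, h in zip(pos, head_position))
            if angle_between(head_forward, to_target) < ANGLE_THRESHOLD:
                hit = name
                break

        if hit != self.current_target:
            # Gaze moved to a new target (or to empty space): restart the timer.
            self.current_target = hit
            self.dwell_start = time.monotonic() if hit else None
            return None

        if hit and time.monotonic() - self.dwell_start >= DWELL_SECONDS:
            self.dwell_start = time.monotonic()  # re-arm for repeat selections
            return hit
        return None
```

The per-target angle test stands in for a proper raycast against scene colliders; on a real device, head position and orientation would come from the headset’s tracking API each frame.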

In addition to working on research projects, students will participate in other informal learning and social activities. The program will begin with an XR development “boot camp,” which will help students get up to speed on useful techniques in XR content creation. Students will also attend weekly seminars and conversations with invited experts in academia and industry. Students will present their work formally at the midpoint and end of the program. In addition, the mentors will organize social events to build camaraderie, taking advantage of the rich culture of New York City.

Eligibility Requirements

To participate in this program, students must:

  • Be undergraduates in good standing (including incoming undergraduates)
  • Be US citizens or permanent residents
  • Complete the 9-week program and all requirements, including project reports, presentations, and seminar attendance

Students with disabilities and members of other under-represented groups, including community or 2-year college students, are especially encouraged to apply. We ask that applicants submit a cover letter describing their research interests (including past experience and what they hope to gain from research), career goals, and which of the listed projects they are interested in.

How to Apply

Application materials should be submitted via NSF ETAP.

The application will request the following information about you:

  • Basic information
  • Demographic information
  • Information about your education and university enrollment
  • A current resume or CV
  • A personal statement describing your interests, experience, career goals, why you’d like to do research, what you’d like to learn, and which of the listed projects you’re interested in
  • Name and contact information for one person who can provide a letter of recommendation

We are no longer accepting applications.

Explore Past Projects

Acoustic Minimaps

Video games have yet to offer visually impaired players an equivalent experience to what sighted players enjoy. Sighted players use minimaps within games to learn how their surrounding environment is laid out, but there is no effective analogue to the minimap for visually impaired players. A major accessibility challenge is to create a generalized, acoustic (non-visual) version of the minimap for visually impaired players.

To address this challenge, we develop and investigate four acoustic minimap techniques that represent a breadth of ideas for how an acoustic minimap might work: a companion smartphone app, echolocation, a directional scanner, and a simple menu. Each technique is designed to communicate information about the area around the player within a game world, providing functionality analogous to a visual minimap but in acoustic form.

Depictions of our four acoustic minimap techniques: a companion smartphone app, echolocation, a directional scanner, and a simple menu of points-of-interest.
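
To make the directional scanner concrete, below is a minimal Python sketch of how a sweeping beam might turn nearby points of interest into stereo audio cues, with pan encoding direction within the beam and volume encoding distance. The data model, thresholds, and cue mapping are illustrative assumptions for this page, not the implementation used in the study.

```python
import math

BEAM_WIDTH = 15.0  # degrees on either side of the scanner's current bearing
MAX_RANGE = 30.0   # world units; points of interest farther away stay silent

def bearing_to(player_pos, poi_pos):
    """Compass-style bearing (degrees) from player to POI on the ground plane,
    with 0 = +Z ("north") and 90 = +X ("east")."""
    dx = poi_pos[0] - player_pos[0]
    dz = poi_pos[1] - player_pos[1]
    return math.degrees(math.atan2(dx, dz)) % 360.0

def scan_step(scan_bearing, player_pos, pois):
    """One step of the sweep. pois maps names to (x, z) positions.
    Returns (name, pan in [-1, 1], volume in [0, 1]) for POIs inside the beam."""
    cues = []
    for name, pos in pois.items():
        # Signed angular offset of the POI from the beam's center, in (-180, 180].
        offset = (bearing_to(player_pos, pos) - scan_bearing + 180.0) % 360.0 - 180.0
        dist = math.dist(player_pos, pos)
        if abs(offset) <= BEAM_WIDTH and dist <= MAX_RANGE:
            pan = offset / BEAM_WIDTH        # left/right placement within the beam
            volume = 1.0 - dist / MAX_RANGE  # nearer POIs sound louder
            cues.append((name, pan, volume))
    return cues

# Example: sweep the beam through a full circle in 10-degree steps.
pois = {"door": (5.0, 5.0), "enemy": (-8.0, 2.0)}
for bearing in range(0, 360, 10):
    for name, pan, volume in scan_step(float(bearing), (0.0, 0.0), pois):
        print(f"{bearing:3d} deg: {name} (pan={pan:+.2f}, vol={volume:.2f})")
```

In a game engine, each cue would drive a spatialized audio source rather than a print statement, and the sweep rate and beam width would be tuned with visually impaired playtesters.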

This research was funded by National Science Foundation Awards 2051053 and 2051060.

The opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.