There are many unsolved challenges when it comes to XR accessibility. Here are a few we sourced from our community and partners like the W3C Immersive Captions Community Group. You can find a current list of accessibility resources at our XR Access GitHub; make sure you’re not just reinventing the wheel!
A note on co-design and inclusion: While it’s great to start tinkering as soon as you’ve got an idea, it’s always best to include disabled people in your process as early as possible. Co-designers will correct mistaken assumptions and give valuable feedback at every step of the project, so don’t wait until your project is done to reach out! You can find a community of people ready and willing to help on the XR Access Slack.
People with visual accessibility needs may benefit from features that make things easier to see or that replace visual cues with audio or haptic feedback. These features also benefit people on devices with a limited field of view, and they open the door for AI and machine learning applications.
- Screen readers, like VoiceOver or NVDA, allow blind people to perceive a website as text. How can an XR space be presented as text?
- How can we assign alt text to 3D objects, as we do for 2D images? How should alt text be read out to users?
- How can blind people easily tell who is around them in social VR? How can they find their friends in a crowd, or tell when people are looking at them?
- How can we make it easier to spot specific objects, or let users access objects from a list (e.g., a browser for selecting any object in a scene)?
- How can important visual information be represented with audio?
- How can we make text easier to read in XR?
- How can we mix ambient audio to let users understand where they are in a space from sound alone?
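One way to start prototyping the alt-text and "scene as text" questions above is to attach an author-supplied description to each 3D object and let a screen-reader-style traversal read the scene out nearest-first, with a clock-face direction for each object. A minimal sketch, with all names and conventions invented for illustration (this is not any engine's real API):

```python
from dataclasses import dataclass
import math

@dataclass
class SceneObject:
    name: str
    alt_text: str   # author-supplied description, like alt text on a 2D image
    x: float = 0.0
    z: float = 0.0

def describe_scene(objects, user_x=0.0, user_z=0.0):
    """Present the scene as text, nearest objects first, with a rough
    clock-face direction (12 o'clock = straight ahead of the user)."""
    lines = []
    for obj in sorted(objects, key=lambda o: math.hypot(o.x - user_x, o.z - user_z)):
        dist = math.hypot(obj.x - user_x, obj.z - user_z)
        angle = math.degrees(math.atan2(obj.x - user_x, obj.z - user_z)) % 360
        hour = round(angle / 30) % 12 or 12   # 30 degrees per clock hour
        lines.append(f"{obj.name}, {dist:.1f} meters at {hour} o'clock: {obj.alt_text}")
    return lines

scene = [
    SceneObject("Door", "A wooden exit door", x=0.0, z=2.0),
    SceneObject("Avatar: Sam", "Sam's avatar, waving", x=3.0, z=0.0),
]
for line in describe_scene(scene):
    print(line)
```

The same traversal could back an object browser: the sorted list is already a selectable menu of everything in the scene.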
People with auditory accessibility needs benefit from features like captions that can turn audio into other forms of feedback, or that make select audio easier to hear. They also benefit people playing in loud environments and those who are not familiar with the primary language of a space.
- How should captions work in a social VR environment?
- How can users differentiate between what’s being said to them vs background conversation?
- How should users be informed of people behind them talking?
- How can conversations in VR be kept private? How do users know how far their words can reach, and who can hear them?
- How can users respond to conversation in ways other than speaking? How can they direct this to one person, a group, or globally?
- How could D/deaf users be notified of a fire alarm going off while in a headset? What about other outside stimuli?
- How can we make it easier for sign language users to sign to others, either inside or outside the XR space?
- How can music and sound effects be represented via vision or haptics?
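As one possible starting point for the caption questions above, each line of speech could be tagged with the speaker's direction relative to the user and styled as background chatter unless the speaker is nearby and facing them. A rough sketch; the threshold, field names, and arrow convention are all assumptions for illustration:

```python
import math

def format_caption(speaker, text, dx, dz, facing_user, direct_radius=2.5):
    """Return a caption string with a direction arrow; mark speech as
    background chatter unless the speaker is close and facing the user."""
    dist = math.hypot(dx, dz)
    angle = math.degrees(math.atan2(dx, dz)) % 360
    # Pick an arrow for 8 compass-style directions (up = in front of the user).
    arrows = ["\u2191", "\u2197", "\u2192", "\u2198", "\u2193", "\u2199", "\u2190", "\u2196"]
    arrow = arrows[round(angle / 45) % 8]
    is_direct = facing_user and dist <= direct_radius
    style = "" if is_direct else "(background) "
    return f"{arrow} {speaker}: {style}{text}"

# Someone one meter in front of the user, speaking to them:
print(format_caption("Sam", "Hi there!", dx=0.0, dz=1.0, facing_user=True))
# Someone behind and to the left, talking to somebody else:
print(format_caption("Crowd member", "...the game last night...",
                     dx=-4.0, dz=-4.0, facing_user=False))
```

The direction arrows double as an answer to "people behind them talking": a caption prefixed with a downward arrow signals speech from behind even when the speaker is out of view.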
People with mobility needs benefit from features that enable them to use XR without the need for big movements, fine motor control, one or two hands, or a specific playing pose (like standing up). These also benefit people with a small playing space or dead batteries in their controllers.
- How can users play with only one controller? No controllers?
- How can users quickly select options using only their eyes?
- How can motion controls be remapped onto buttons? How can buttons be remapped to other means of input?
- How can switch controls like mouth sticks and blow tubes be used in XR?
- How can brain-computer interfaces be leveraged for XR?
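One established pattern from 2D accessibility that could carry over to the switch-control question is scanning: the system cycles focus through the available options on a timer, and a single switch (mouth stick, blow tube, or one button) activates whichever option is highlighted. A minimal sketch of the core loop, with hypothetical names:

```python
class SwitchScanner:
    """Cycle focus through a list of targets on a timer tick;
    a single switch press activates the currently focused target."""

    def __init__(self, targets):
        self.targets = targets
        self.index = 0

    def tick(self):
        """Advance focus to the next target (called on a fixed interval)."""
        self.index = (self.index + 1) % len(self.targets)
        return self.targets[self.index]

    def press(self):
        """Activate whichever target is currently focused."""
        return f"activated: {self.targets[self.index]}"

scanner = SwitchScanner(["Teleport", "Grab", "Menu"])
scanner.tick()            # focus advances from "Teleport" to "Grab"
print(scanner.press())    # activated: Grab
```

The same loop works for eye-gaze dwell selection if `press()` fires after the focus has rested on a target for a set time instead of on a physical switch.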
People with cognitive accessibility needs benefit from features that remove complexity and extra stimulation, and make it easy to get help when required. New users and multitaskers benefit from these features as well.
- How can we improve the copilot experience for XR, where a second person helps guide the user through an experience?
- How can users get help contextually, instead of just having to remember from a tutorial or figure it out on their own?
- How can users get more time to make time-sensitive decisions?
- How can developers create versions of their world with minimal stimuli, like noise and animated objects?
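The last two questions could be prototyped as a per-user settings object that a world checks when spawning content or starting timers. A sketch under assumed conventions; the setting names and multiplier scheme are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class ComfortSettings:
    """Per-user cognitive accessibility settings a world could honor."""
    reduce_ambient_noise: bool = False        # mute non-essential soundscapes
    freeze_decorative_animations: bool = False
    time_limit_multiplier: float = 1.0        # stretch time-sensitive decisions

def effective_time_limit(base_seconds, settings):
    """Scale a timed prompt by the user's multiplier (never shorter than base)."""
    return base_seconds * max(1.0, settings.time_limit_multiplier)

calm = ComfortSettings(reduce_ambient_noise=True,
                       freeze_decorative_animations=True,
                       time_limit_multiplier=3.0)
print(effective_time_limit(10, calm))   # a 10-second prompt becomes 30 seconds
```

Keeping these as user-level settings rather than per-world options means a player sets them once and every world that honors the object adapts automatically.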
Do you have a suggestion for a project or resources developers can use to create prototypes? Have you created something you want to share? Let us know at email@example.com or share it in the XR Access Slack!