Prototype for the People

Supporting the development of open-source accessible XR code.

What is Prototype for the People?

“Prototype for the People” aims to address the lack of open-source accessibility solutions for XR through a grassroots campaign that accelerates open-source development and prototyping. Developers should not have to reinvent the wheel to create accessible XR applications!

Goals:

  1. Encourage developers to create open-source prototypes showcasing how virtual, augmented, and mixed reality (XR) software can be more accessible for people with disabilities.
  2. Create a library of open-source code that developers can draw from to facilitate making new XR projects more accessible.
  3. Give platform owners feedback and suggestions on how their platforms can be made more accessible, up to and including contributing new code.

How Does It Work?

  1. Call – XR Access will put out a call for accessible prototypes with our partners in industry, academia, and elsewhere.
  2. Development – Developers will create prototypes, facilitated by feedback from XR Access’ Accessible Development of XR (ADXR) workstream.
  3. Propagation – Successful prototypes are added to the XR Access GitHub for other creators to reference and are shared with platform owners. XR Access may spotlight prototypes and feature their creators in our Speaker Series.

Who Can Participate?

Anyone is welcome to participate! That said, our initial efforts will focus on developers who don’t already have firm projects in mind: namely, groups like bootcamp and university students, hackathon participants, and hobbyists.

In addition to developers, we also need involvement from people with disabilities and others who are willing to test out prototypes and give feedback to developers.

If you’re interested, you can join the ADXR Workstream or post to the #ad-xr-development channel on the XR Access Slack.

Example Projects

These open-source projects, built in Unity for the Meta Quest 2 in collaboration with Equal Entry and the XROS Fellowship, provide reusable code that makes it easier for developers to implement accessibility.

  • Chirp Captions – Offers a way to display pre-written captions across multiple speakers, with options for font and other presentation settings. Developed by Savio Menifer. (See the caption sketch after this list.)

  • Locomotion Accessibility Toolkit – Features multiple modes of locomotion such as teleportation, smooth locomotion, and snap turning. Developed by Shivam Sharma.

  • Vision Accessibility Toolkit – Enables the adjustment of visual settings like brightness, contrast, and hue. Developed by Yuvraj Kadale.
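
To give a concrete feel for the kind of reusable component these toolkits contain, here is a minimal caption sketch in Unity C#. It is not code from Chirp Captions; the class name, fields, and timing values are illustrative assumptions, and a real system would add per-speaker styling, caption queueing, and placement options.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: a single caption panel that stays in front of the user
// and shows pre-written lines tagged with a speaker name.
public class SimpleCaptionDisplay : MonoBehaviour
{
    [SerializeField] private Text captionText;       // UI Text on a world-space canvas
    [SerializeField] private float distance = 1.5f;  // metres in front of the camera
    [SerializeField] private float secondsPerLine = 3f;

    private void LateUpdate()
    {
        // Keep the panel in front of the headset, oriented so the text reads correctly.
        Transform cam = Camera.main.transform;
        transform.position = cam.position + cam.forward * distance;
        transform.rotation = Quaternion.LookRotation(transform.position - cam.position);
    }

    // Called by a dialogue or voice event with a speaker name and a pre-written line.
    public void ShowLine(string speaker, string line)
    {
        StopAllCoroutines();
        StartCoroutine(ShowLineRoutine($"{speaker}: {line}"));
    }

    private IEnumerator ShowLineRoutine(string formatted)
    {
        captionText.text = formatted;
        yield return new WaitForSeconds(secondsPerLine);
        captionText.text = string.Empty;  // clear after the line has had time to be read
    }
}
```

A dialogue or voice system would call ShowLine("Ada", "Ready when you are!") whenever a line is delivered.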

Project Suggestions & Resources

There are many unsolved challenges when it comes to XR accessibility. Here are a few we sourced from our community and partners like the W3C Immersive Captions Community Group. You can find a current list of accessibility resources at our XR Access GitHub; check it first to make sure you’re not reinventing the wheel!

A note on co-design and inclusion: While it’s great to start tinkering as soon as you’ve got an idea, it’s always best to include disabled people as early as possible in your process. Co-designers will help correct mistaken assumptions and give valuable feedback at every step of the project, so don’t wait until your project is done to reach out! You can find a community of people ready and willing to help at the XR Access Slack.

Visual Accessibility

People with visual accessibility needs may benefit from features that make things easier to see or that replace visual cues with audio or haptic feedback. Such features also benefit people on devices with a limited field of view and open the door to AI and machine learning applications.

  • Screen readers, like VoiceOver or NVDA, allow blind people to perceive a website as text. How can an XR space be presented as text?
  • How can we assign alt text to 3D objects, like we do to 2D images? How should alt text be read out to users? (See the gaze-reader sketch after this list.)
  • How can blind people easily tell who is around them in social VR? How can they find their friends in a crowd, or tell when people are looking at them?
  • How can we make it easier to spot specific objects, or let users access objects from a list? (e.g., a browser for selecting any object in a scene)
  • How can important visual information be represented with audio?
  • How can we make text easier to read in XR?
  • How can we mix ambient audio to let users understand where they are in a space from sound alone?
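
One way to start prototyping the alt-text questions above is to treat descriptions as data on the object and keep the announcement channel pluggable. The Unity C# sketch below is a hypothetical illustration, not an existing API: an AltText component holds a short description, and a gaze raycaster raises it through an event that could be wired to text-to-speech, a screen-reader bridge, or captions.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical sketch: attach AltText to any 3D object to give it a short
// description, analogous to alt text on a 2D image.
// (In a Unity project each MonoBehaviour would live in its own file.)
public class AltText : MonoBehaviour
{
    [TextArea] public string description = "An undescribed object.";
}

// Casts a ray from the user's gaze and announces the description of whatever
// AltText object it lands on. Output goes through an event so the same
// component can feed text-to-speech, captions, or a braille display.
public class GazeAltTextReader : MonoBehaviour
{
    [SerializeField] private float maxDistance = 10f;
    [SerializeField] private UnityEvent<string> onDescription;  // wire up TTS or captions in the Inspector

    private AltText lastTarget;

    private void Update()
    {
        Transform cam = Camera.main.transform;

        if (Physics.Raycast(cam.position, cam.forward, out RaycastHit hit, maxDistance)
            && hit.collider.TryGetComponent(out AltText target))
        {
            if (target != lastTarget)  // announce only when gaze moves to a new object
            {
                lastTarget = target;
                onDescription.Invoke(target.description);
            }
        }
        else
        {
            lastTarget = null;
        }
    }
}
```

Keeping the output as an event means the same component can serve spoken, captioned, or braille output without changes.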


Auditory Accessibility

People with auditory accessibility needs benefit from features, like captions, that turn audio into other forms of feedback or that make selected audio easier to hear. Such features also benefit people playing in loud environments and those who are not familiar with the primary language of a space.

  • How should captions work in a social VR environment?
    • How can users differentiate between what’s being said to them vs background conversation?
    • How should users be informed of people behind them talking?
    • How can conversations in VR be kept private? How do users know how far their words can reach, and who can hear them?
    • How can users respond to conversation in ways other than speaking? How can they direct this to one person, a group, or globally?
  • How could D/deaf users be notified of a fire alarm going off while in a headset? What about other outside stimuli?
  • How can we make it easier for sign language users to sign to others, either inside or outside the XR space?
  • How can music and sound effects be represented via vision or haptics?
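
For the caption questions above, one small building block is telling users where a voice is coming from, so speech from behind or to the side is not lost. The sketch below is a hypothetical Unity C# helper, not part of any existing toolkit; the angle thresholds and labels are assumptions that co-designers would refine.

```csharp
using UnityEngine;

// Hypothetical sketch: label a caption line with where the speaker is relative
// to the listener, so users who cannot rely on spatial audio still know who is
// talking and from which direction.
public static class CaptionDirection
{
    public static string DirectionLabel(Transform listener, Transform speaker)
    {
        // Compare the direction to the speaker against where the listener is facing,
        // ignoring height differences.
        Vector3 toSpeaker = speaker.position - listener.position;
        toSpeaker.y = 0f;
        Vector3 facing = Vector3.ProjectOnPlane(listener.forward, Vector3.up);

        float angle = Vector3.SignedAngle(facing, toSpeaker, Vector3.up);

        if (Mathf.Abs(angle) <= 45f) return "ahead";
        if (angle > 45f && angle < 135f) return "to your right";
        if (angle < -45f && angle > -135f) return "to your left";
        return "behind you";
    }

    // Example output: "Alex (behind you): Are you two coming?"
    public static string Format(string speakerName, string line, Transform listener, Transform speaker)
    {
        return $"{speakerName} ({DirectionLabel(listener, speaker)}): {line}";
    }
}
```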


Mobility Accessibility

People with mobility needs benefit from features that enable them to use XR without the need for large movements, fine motor control, one or two hands, or a specific playing pose (like standing up). Such features also benefit people with a small playing space or dead batteries in their controllers.

  • How can users play with only one controller? No controllers?
  • How can users quickly select options using only their eyes? (See the dwell-selection sketch after this list.)
  • How can motion controls be remapped onto buttons? How can buttons be remapped to other means of input?
  • How can switch controls like mouth sticks and blow tubes be used in XR?
  • How can brain-computer interfaces be leveraged for XR?
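
One concrete starting point for eye-only or head-only selection is dwell activation: looking at a target for a set time triggers it, with no buttons, hands, or large movements required. The Unity C# sketch below is a hypothetical illustration; the dwell time, distance, and class names are assumptions.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical sketch: anything with a DwellTarget can be activated by looking
// at it for long enough, with no controller input required.
public class DwellTarget : MonoBehaviour
{
    public UnityEvent onSelected;  // the action this target triggers, set in the Inspector
}

// Tracks what the user is looking at and fires the target once the dwell time elapses.
public class DwellSelector : MonoBehaviour
{
    [SerializeField] private float dwellSeconds = 1.5f;
    [SerializeField] private float maxDistance = 10f;

    private DwellTarget current;
    private float dwellTimer;

    private void Update()
    {
        Transform cam = Camera.main.transform;
        DwellTarget hitTarget = null;

        if (Physics.Raycast(cam.position, cam.forward, out RaycastHit hit, maxDistance))
            hit.collider.TryGetComponent(out hitTarget);

        if (hitTarget != current)  // gaze moved to something else: restart the timer
        {
            current = hitTarget;
            dwellTimer = 0f;
            return;
        }

        if (current == null) return;

        dwellTimer += Time.deltaTime;
        if (dwellTimer >= dwellSeconds)
        {
            current.onSelected.Invoke();
            dwellTimer = 0f;  // reset so the user can dwell again to repeat the action
        }
    }
}
```

Swapping the timer for a single switch press would adapt the same pattern to switch controls like mouth sticks and blow tubes.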


Cognitive Accessibility

People with cognitive accessibility needs benefit from features that remove complexity and extra stimulation, and make it easy to get help when required. New users and multitaskers benefit from these features as well.

  • How can we improve the copilot experience for XR?
  • How can users get help contextually, instead of just having to remember from a tutorial or figure it out on their own?
  • How can users get more time to make time-sensitive decisions?
  • How can developers create versions of their world with minimal stimuli, like noise and animated objects?
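
A minimal-stimuli version of a world could start with a single toggle like the hypothetical Unity C# sketch below, which pauses animations, stops particle effects, and quiets ambient audio. Which elements count as extra stimulation, and how far to reduce them, are assumptions best settled with co-designers.

```csharp
using UnityEngine;

// Hypothetical sketch: a "calm mode" that reduces extra stimulation by pausing
// animated objects, stopping particle effects, and lowering ambient audio.
public class CalmModeToggle : MonoBehaviour
{
    [SerializeField, Range(0f, 1f)] private float calmAmbientVolume = 0.2f;
    [SerializeField] private AudioSource[] ambientSources;  // background loops, music, crowd noise

    public void SetCalmMode(bool enabled)
    {
        foreach (Animator animator in FindObjectsOfType<Animator>())
            animator.speed = enabled ? 0f : 1f;  // freeze or resume animations

        foreach (ParticleSystem particles in FindObjectsOfType<ParticleSystem>())
        {
            if (enabled) particles.Pause();
            else particles.Play();
        }

        foreach (AudioSource source in ambientSources)
            source.volume = enabled ? calmAmbientVolume : 1f;
    }
}
```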


Do you have a suggestion for a project or resources developers can use to create prototypes? Have you created something you want to share? Let us know at info@xraccess.org or share it in the XR Access Slack!