VR games are becoming common, but people with visual impairments are frequently left out of the fun. Some prior work has explored adding audio cues to visual VR experiences, and developers of accessible games have created audio-only versions of popular games for people with visual impairments. However, there's very little academic research on accessible VR games. That gap points to an opportunity: exploring how to convey information about moving objects, and how to teach players who may never have played a non-virtual version of a game to interact with virtual objects.

Dr. Kyle Rector and her research team at the University of Iowa took up this challenge. Building on their earlier work designing accessible exercise experiences for people with low vision, they created a 3D audio version of the game Showdown, a tabletop sport invented by visually impaired athletes in which players compete to hit a ball into their opponent's goal while wearing protective eyewear that blocks all vision. They implemented a proof-of-concept "Virtual Showdown" using a Microsoft Kinect to track the position of the player's body, a virtual table and ball built in Unity 3D, a Nintendo Switch Joy-Con controller to provide haptic feedback, and headphones to deliver spatial audio to the player.
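The paper's audio code isn't reproduced here, and the actual system relied on Unity 3D's audio engine, but the core idea behind a spatial audio cue — making the ball sound like it's coming from the correct direction — can be sketched with simple stereo panning. The following is a minimal illustration in Python; the function name and coordinate conventions are hypothetical, not taken from the researchers' implementation:

```python
import math

def constant_power_pan(ball_x, ball_z, player_x, player_z):
    """Return (left_gain, right_gain) for a sound source at the ball's
    position relative to the player, using a constant-power panning law.
    Illustrative only: real spatial audio engines also model distance
    attenuation and head-related transfer functions (HRTFs)."""
    # Azimuth of the ball relative to the player, in radians:
    # 0 = straight ahead, -pi/2 = hard left, +pi/2 = hard right.
    azimuth = math.atan2(ball_x - player_x, ball_z - player_z)
    # Clamp to the frontal hemisphere, then map to a 0..1 pan position.
    azimuth = max(-math.pi / 2, min(math.pi / 2, azimuth))
    pan = (azimuth + math.pi / 2) / math.pi  # 0 = full left, 1 = full right
    # The constant-power law keeps perceived loudness steady as the pan moves.
    left_gain = math.cos(pan * math.pi / 2)
    right_gain = math.sin(pan * math.pi / 2)
    return left_gain, right_gain

# A ball directly ahead of the player sounds equally loud in both ears;
# a ball off to the player's right is louder in the right ear.
center = constant_power_pan(0.0, 1.0, 0.0, 0.0)
right_side = constant_power_pan(5.0, 1.0, 0.0, 0.0)
```

Applying gains like these to the ball's audio each frame, as its position updates, gives the player a continuously moving sound source to track — the kind of cue the verbal scaffolds then teach players to act on.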

To test their setup, the researchers had participants – all youth with visual impairments – attempt to hit a virtual ball into their opponent's goal, earning points for goals and effective hits. The team also tested different types of scaffolding support: instructions that help players build their Virtual Showdown skills little by little without being overwhelmed by a new environment. Participants were given scaffolds before, during, and after the ball was deployed toward them, telling them the location and trajectory of the ball. Each participant tested both types of instruction: a verbal scaffold in the form of a spatial audio instruction, and a verbal scaffold combined with vibration from the Joy-Con controller. The researchers collected data on how successful the participants were at hitting the virtual ball, and analyzed video footage of participants' body movement strategies to determine the effectiveness of each type of scaffold.

Results showed that players, especially those who had played real-world Showdown before, scored higher with just the verbal scaffold support than they did with the combined verbal and vibration scaffold. While players reported enjoying the game overall, there was no significant difference in which type of scaffold they preferred. These results demonstrate the importance of effective instruction for virtual interaction: participants were able to hit the ball more effectively when the virtual experience was closest to the “real life” experience. For instance, some players stated that the vibrating controller was confusing, possibly because it didn’t match the real-life experience of playing Showdown.

So what does Virtual Showdown tell us about how to design more accessible virtual games? Dr. Rector recommends that designers and developers keep learning in mind when creating games and virtual environments. People who are entirely new to a game don't always know how to move their bodies effectively, and even those who are familiar with the real-life version of a game need time to build up their virtual interaction skills. Virtual interaction techniques that mimic real-life interactions can help users with more expertise, while learning methods such as scaffolds can help less-experienced users get up to speed. Design techniques like these don't just make virtual gameplay accessible to players with disabilities; they make games more fun for everyone.

Want to read more about this research? You can find the paper at this link.


Pictured: People playing the (non-virtual) game Showdown.