Nov 12th, 2025
Ever wondered how blind gamers wave hello, give a thumbs up, or show they’re excited in online games? Join us as entrepreneur, researcher, and designer Brandon Biggs shares his research on how techniques from top audio game developers can be applied to make VR more accessible for blind and low-vision users and better for everyone.
Event Details
Date: Wed Nov 12th
Time: 10am PT | 1pm ET
Location: Zoom
This talk will present a framework, described in the paper “Creating Non-Visual Non-Verbal Social Interactions in Virtual Reality,” for making Virtual Reality (VR) fully accessible to blind and low-vision people. Using a Delphi method with top audio game developers, the research translates commercially tested, non-visual conventions for 29 non-verbal social interactions, spanning categories such as movement, emotes, and appearance, into practical design patterns for VR. The core techniques include spatial audio (HRTF) for proximity and location, dedicated auditory cues for movement and collisions, and screen reader integration to convey rich emotional and appearance information. Together, these patterns give developers an immediate, tested baseline for accessibility, moving past the visual-centric limitations of current VR toward a genuinely inclusive social VR experience.
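To give a concrete sense of the spatial-audio pattern, here is a minimal sketch using the Web Audio API’s built-in HRTF panning. It is our own illustration, not code from the paper or the talk, and names such as `playSpatialEmote` and `wave.mp3` are hypothetical.

```typescript
// Minimal sketch: playing an avatar's emote sound from its position in
// the scene using HRTF panning, so a blind listener can hear where a
// wave or thumbs-up came from.

const audioCtx = new AudioContext();

// Fetch and decode a short emote sound (URL is illustrative).
async function loadBuffer(url: string): Promise<AudioBuffer> {
  const response = await fetch(url);
  const data = await response.arrayBuffer();
  return audioCtx.decodeAudioData(data);
}

function playSpatialEmote(
  buffer: AudioBuffer,
  x: number,
  y: number,
  z: number
): void {
  // PannerNode with the "HRTF" model produces binaural cues for
  // direction; the "inverse" distance model makes the cue quieter as
  // the avatar moves away, conveying proximity non-visually.
  const panner = new PannerNode(audioCtx, {
    panningModel: "HRTF",
    distanceModel: "inverse",
    positionX: x,
    positionY: y,
    positionZ: z,
  });

  const source = new AudioBufferSourceNode(audioCtx, { buffer });
  source.connect(panner).connect(audioCtx.destination);
  source.start();
}

// Usage: play a "wave" cue from an avatar two meters to the listener's left.
// loadBuffer("wave.mp3").then((buf) => playSpatialEmote(buf, -2, 0, 0));
```

The same node graph extends naturally to the other cues the framework covers, for example triggering a distinct collision sound when two avatars touch, while emotional and appearance details too rich for audio alone are routed to the screen reader as text.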
If you require accessibility accommodations such as American Sign Language interpretation, please email info@xraccess.org at least 72 hours before the event.
About the Speaker
Brandon Biggs
Georgia Institute of Technology PhD Candidate and XR Navigation CEO
Brandon Biggs is an entrepreneur, researcher, and inclusive designer. He is CEO of XR Navigation, an engineer at the Smith-Kettlewell Eye Research Institute, and a PhD candidate at Georgia Tech. Nearly blind from Leber’s Congenital Amaurosis, he develops human-centered tools tackling challenges in the blindness field.
