August 9, 2022

Completed as part of the MIT 2022 Reality Hack, “Inclusive User Testing in VR” is an accessible in-game tool that allows general audiences to give feedback on VR projects. The tool supports a variety of input methods, including voice-to-text and VR keyboards, and includes screen reader support for low-vision users. This functionality allows researchers, developers, and designers to receive immediate, asynchronous feedback on their VR projects via a web-based dashboard. In this talk, we will cover not only the value that this tool brings to a diverse range of XR creators, but also how this hackathon project went from an initial concept to a usable tool through many development iterations and direct involvement from the accessibility community. Register for the talk here.

About the Speakers

Hung Nguyen is a VR/AR Software Engineer interested in using technology as a medium to create serious games. Currently, Hung is at a startup designing XR interactions and integrations.

Nhan Tran currently works on robotic perception at Robust.ai. He will soon begin a PhD at Cornell Tech to research XR interfaces for human-robot interaction and other technologies that support people in their daily lives.

Daisy Chen is an XR interaction designer and prototyper based in the Bay Area. Daisy has conducted research on VR live-streaming and on using VR to facilitate collaboration. She is also interested in the intersection of VR, accessibility, and edTech.

Jayden Kim is an AR user experience designer and prototyper. Jayden focuses on digital accessibility and inclusive design, especially for visually impaired users.

Tejas Shroff shifted career paths from business to programming and has since developed XR projects for small startups and Fortune 500 companies. He now works as a software engineer at absurd:joy, developing desktop software that better enables remote work.

Watch the Talk