Exploring different approaches and techniques for designing XR, and conducting and sharing research to inform how the field develops new technologies that are accessible to people with disabilities.
At Cornell Tech, Professor Shiri Azenkot’s research lab is committed to developing technology that empowers people with disabilities. Prof. Azenkot and her students have been developing XR applications that help people with visual impairments perform daily tasks, like navigating stairs and locating products in a supermarket.
Navigating stairs can be a dangerous mobility challenge for people with low vision. Inadequate handrails, poorly marked steps, and other obstacles can reduce mobility and lead to accidents. While past research has proposed audio stair-navigation aids for blind people, no research on people with low vision has yet addressed this challenge. Cornell Tech PhD student Yuhang Zhao, along with Cx faculty member Shiri Azenkot and other researchers from Cornell Tech and Columbia University, built on this research to design several AR navigation systems for people with low vision. They tested both projection-based and optical see-through smart glasses systems to determine the most effective navigation techniques.
In the projection-based AR system, participants carried a handheld projector that projected a variety of animated highlights on the stairs and played auditory feedback. Results showed that participants appreciated highlights that marked the end of a staircase; while some liked moving highlights that attracted attention, others found the animations distracting.

Participants using the optical see-through smart glasses were shown a glow effect on the display indicating their progress on the staircase, along with an animation showing a virtual path. Most participants found the glow animation helpful, and some appreciated that the path graphic gave a clear indication of the direction of the stairs.

Beyond ease of navigation, both types of AR systems reduced participants’ walking time, and participants reported feeling more psychologically secure when navigating with either type of AR. These results advance understanding of the benefits of different types of graphics and devices that can assist people with low vision in navigating stairs. The results can also be applied to other types of navigation, such as moving through unfamiliar settings. This project was presented at ACM UIST 2019 in New Orleans, LA.
Pictured: Our visualizations for (a) projection-based AR and (b) smart glasses to facilitate stair navigation for people with low vision
Interactive 3D Models
Cornell Tech PhD student Lei Shi, faculty member Shiri Azenkot, and their collaborators are studying how to design educational 3D models for students with visual impairments. The researchers interviewed teachers of visually impaired students about their students’ needs, and demonstrated previously designed 3D-printed interactive models and computer vision–based educational tools.
Based on the interview results, they designed an augmented reality (AR) app that pairs with a 3D-printed model. The model’s tactile features help students learn the physical features of a concept, while visual and audio cues from the AR app provide additional information.
The study’s results point to several possible avenues for improved educational technology, including adding effective tactile features, providing both auditory and visual content, and building from existing pedagogical best practices.
Pictured: A visually impaired student uses a mobile application to learn about the Plane model