XR Access Symposium
Connecting the XR Access Community
XR Access 2023: Seizing Advancements for Access
Read our 2023 Symposium Report! We also have a plain-text version of the report available.
Venue
Verizon Executive Education Center at Cornell Tech
2 West Loop Road
New York, NY 10044
The Verizon Executive Education Center at Cornell Tech redefines the landscape of executive event and conference space in New York City. This airy, modern, full-service venue offers convenience and ample space—with breathtaking views—for conferences, executive programs, receptions, seminars, meet-ups and more. Designed by international architecture firm Snøhetta, the Verizon Executive Education Center blends high design and human-centered technology to bring you an advanced meeting space suited for visionary thinkers.
Speakers
Schedule
June 15th
9 am – 12 pm: Main Stage Presentations
- 9:00 | Introduction
- 9:10 | The Guide Has Your Back: Exploring How Sighted Guides Can Enhance Accessibility in Social Virtual Reality for Blind and Low Vision People
- 9:50 | Empowering the Workforce through Accessible XR
- 10:30 | Coffee Break
- 10:40 | The Virtual Experience Research Accelerator (VERA)
- 11:20 | Loud and Clear: Improving Accessibility for Low Vision Players in Cosmonious High
12 pm – 2 pm: Lunch & Demos
2 pm – 5 pm: Breakout Sessions
- 2:00 | Introduction to Breakouts
- 2:20 | Breakout A
- 3:10 | Break
- 3:30 | Breakout B
- 4:20 | Reporting & Discussion
June 16th
9 am – 1 pm: Main Stage Presentations
- 9:00 | Introduction
- 9:10 | SocialSense XR: Making the Invisible Visible
- 9:50 | Building an Accessible and Inclusive Metaverse
- 10:30 | Annotated Lip Reading for Augmented Educational Systems
- 11:10 | Coffee Break
- 11:20 | Multidimensional Computing Accessibility in the age of XR and AI
- 12:00 | Vision Accessibility with AR+AI Tools
- 12:40 | Closing Statements
Speaking Sessions
These will be presented on the main stage and broadcast online.
June 15th
The Guide Has Your Back: Exploring How Sighted Guides Can Enhance Accessibility in Social Virtual Reality for Blind and Low Vision People
Jazmin Collins and Crescentia Jung
An overview of a research study that explored the potential of sighted guides in virtual reality as a versatile assistive technology for blind and low vision users. We will present early findings from this study, including the broad design space for virtual guides that it opens.
Empowering the Workforce through Accessible XR
Liz Hyman (moderator), Joel Ward, Corinne Weible, Christian Vogler
Join us for an engaging discussion on XR accessibility in the workforce, where experts from various fields will explore the importance of creating inclusive immersive experiences that enhance professional settings and promote equity. The panelists will delve into inclusive design principles, discuss why accessibility matters in XR, and share case studies of how organizations are implementing accessible XR solutions in the workforce. Together we will explore the transformative potential of XR technologies in creating a more accessible and inclusive workplace for all.
The Virtual Experience Research Accelerator (VERA)
Greg Welch (moderator), Christine Hemphill, Tabitha Peck, and Jonathan Beever
Supported by five collaborative grants from the National Science Foundation, the Virtual Experience Research Accelerator (VERA) project will develop and operate CISE community research infrastructure in the form of a human-machine system for carrying out human subjects research related to extended reality (XR), including Virtual Reality and Augmented Reality. This panel primarily seeks to gather wisdom, perspectives, and concrete ideas from the XR Access community related to multiple components of VERA, to help us make all aspects as accessible as we can, from the very beginning.
Loud and Clear: Improving Accessibility for Low Vision Players in Cosmonious High
Jazmin Cano, Peter Galbraith
In this session, Owlchemy Labs’ Accessibility Product Manager Jazmin Cano and Senior Accessibility Engineer Peter Galbraith will describe the approach they took to adding blind and low vision accessibility to Owlchemy Labs’ most recently released game, Cosmonious High. They will share what they learned throughout this process, from early research and development to the feedback received after release. Audience members will gain an understanding of what is required to create helpful accessibility features for complex and immersive virtual environments.
June 16th
SocialSense XR: Making the Invisible Visible
Dylan Fox, Shiri Azenkot, Andrea Won, Charles La Pierre, Sean Dougherty
Interpersonal communication is not accessible in extended reality (XR) for blind and low vision people. In this NSF-funded project, Cornell Tech, Benetech, and Lighthouse for the Blind have teamed up to design a solution. SocialSense XR makes nonverbal communications and documents perceivable using audio and haptic cues, focusing on virtual reality workplace scenarios.
Building An Accessible & Inclusive Metaverse
Jamie Bykov-Brett
Web 1 and 2 products and services have largely been designed around the people already most able to participate. In doing so, we have created inaccessibility and inequality, because not enough has been done to anticipate the attitudinal, financial, social, enablement, or policy barriers that marginalized groups face. This presentation aims to give you useful frameworks, models, and perspectives that should be considered throughout the development of the metaverse and Web3 in the pursuit of a kinder, more ethical, more inclusive, digitally enabled future.
Annotated Lip Reading for Augmented Educational Systems
Hrishikesh (Hrishi) Dilip Mulay
This project focuses on emerging technologies (augmented and virtual reality, AR/VR) to identify the impact of an automated lip reading system within XR educational systems. It considers embedding a 3D avatar within an augmented educational system, with the capacity for semi-supervised learning, to generate and manipulate fabricated data from the real input data. The aim is to identify how AR or VR (as an alternative) can assist with the collection of data through an immersive environment.
Multidimensional computing accessibility in the age of XR and AI
Liv Erickson
The interplay between XR technologies and AI is challenging us to think about new ways to engage with, understand, and design systems that utilize massive amounts of information across many dimensions. New interfaces and spatial computing paradigms will require us to broaden our tool sets for developing new applications and presenting information in flexible, dynamic ways. In this talk, Liv will present several case studies that highlight the dynamic nature of application development today to encourage discussion and generate possible solutions to working with many dimensions and forms of information across increasingly complex interfaces.
Vision Accessibility with AR+AI Tools
Sean Dougherty, Jeffrey Colon
Explore the leading AR and AI-based tools that support blind and low vision accessibility on mobile devices, and better understand how the features and use cases for these tools can help users increase access to visual information.
Breakout Sessions
These breakout sessions will enable conference attendees to participate directly in conversations about how to improve XR accessibility. They will not be broadcast due to technological limitations of the event space; however, summaries will be made available in the Symposium Report.
Each breakout slot will be 50 minutes. Attendees can choose one breakout from session A and one from session B to attend. Breakouts are on June 15th.
Breakout Session A
- 360° Video Descriptions – Lucy Jiang
- Captions – Michael Cooper
- Spatial Computing & Data – Liv Erickson
- Policy – Elizabeth Hyman
- Customization – Jamie Bykov-Brett
Breakout Session B
- User Research – Greg Welch
- Standards – Will Schell
- Exploring Accessible VR for Blind Users – Sean Dougherty, Jeffrey Colon
- Inclusive Avatars – Ria Gualano
- Accessible Career Development – Mark Steelman
FAQs
How do I get to the venue?
Public transportation by subway, ferry, or tram is recommended. See this transportation guide for details.
Will the main stage presentations be recorded?
Yes, they will be streamed live via Zoom and added to the XR Access YouTube channel after the conference.
Will the breakout sessions be recorded?
No, unfortunately our recording equipment is not suited to capturing multiple small groups. However, the takeaways will be included in the Symposium Report.
Why does it cost money to attend in person?
Unfortunately, events in the physical world are expensive; the Symposium will cost tens of thousands of dollars. However, if the expense is a hardship for you, please apply for a scholarship using the link in the description. Note that registering for Zoom and watching online is still free.
Sponsors
XR Access Symposia can’t happen without the support of our generous sponsors.
To learn more about sponsorship opportunities for the 2023 Symposium and other XR Access events, please contact us at info@xraccess.org.