Explore Accessible Spatial Interactions

By Cristian Díaz

Elevator Pitch

Every major Apple platform has been driven by an innovative input model. Vision Pro's remarkably intuitive gaze-and-pinch interaction may make the device seem inaccessible to people with disabilities. My talk argues the opposite: the AVP could become one of the most advanced assistive technologies ever made.

Description

In approximately thirty minutes, I'll contextualize and weigh Apple's claim that Apple Vision Pro offers "the largest list of accessibility features we've ever included in the first generation of a product." This is followed by a brief discussion of some of the accessibility features available on the Vision Pro operating system (visionOS). I will examine the variety of input possibilities available and the innovative way in which the device can switch between different interaction modes, interweaving this with a look at Apple's active, decade-long effort to make its platforms accessible by building accessibility into layer after layer of the stack. Building on that, I will show a new set of patterns emerging for making spatial interactions accessible, followed by recommendations for ideation frameworks and a live on-device demo that adheres to those principles. Finally, I'll conclude with a call to action and a discussion of how to start incorporating accessibility into current pipelines.

Notes

I've been iterating on this talk for more than two years, with a considerably more practical focus since the AVP's release. My only requirement is the ability to output sound, so that the audience can experience VoiceOver firsthand; ideally, the venue would also support captions. Aside from that, you can find some early versions of this talk on my website.