Daniel Braunstein

Audio Programming | Spatial Audio Research

On Accessibility in VR

One of the most fun parts of digging into VR recently has been the “passive gameplay research”, or, better phrased, playing a bunch of VR games. Beat Saber, in particular, has been a ton of fun for me (as a DDR/Guitar Hero kid and general rhythm-game enjoyer) and has been a nice exercise-adjacent activity during these mostly stagnant times. It feels like one of the best realizations of what VR can do, and I have never been so profoundly aware of the accessibility barrier created by how much movement is required to even experience it.

In their op-ed for Scientific American, “Virtual Reality Has an Accessibility Problem,” Kaitlin Ugolik Phillips highlights the disparity between the marketing-savvy ‘Virtual Reality is an Empathy Machine’ tagline and the reality that the medium is still inaccessible to many users with any form of limited mobility. The range of motion required of the hands and/or head can be limiting enough on its own, but pair it with a preponderance of two-handed controls, simultaneous button presses, and mobility-demanding room-scale experiences, and it quickly becomes too great an obstacle to overcome. Some games, however, are designed in a less demanding way: “cockpit-style” experiences, which rely only on finger controls as you operate a vehicle of sorts, are fantastic stationary experiences that don’t require any extraneous limb movement. I also enjoy these experiences because sometimes I just don’t want to have to move or turn around all the time in order to truly experience the medium. But for many potential users, this isn’t a want; it’s a need.

In my own work, which is largely concerned with binaural sound reproduction and with making “immersive sound” more accurate, plausible, and enjoyable, I will admit that this has been a major, consistent oversight of mine. Intrinsically, binaural sound requires two equally and properly functioning ears in order to work; any significant hearing loss in one or both ears will render the effect inert. From a psychoacoustic perspective, there are a number of monaural cues that help inform our brains of the location of a sound source, but those are interpreted in tandem with the binaural cues, not independently of them. Similarly, anyone with loss of vision in one or both eyes is shut out of the “full experience” of VR.
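As an aside for the curious: the binaural cue most directly wiped out by single-sided hearing loss is the interaural time difference (ITD), the tiny gap between a sound’s arrival at each ear. Here is a toy sketch of the classic Woodworth spherical-head approximation of that cue (the head radius and speed of sound are textbook average values, not figures from my own research):

```python
import math

HEAD_RADIUS = 0.0875     # average adult head radius in meters (assumed textbook value)
SPEED_OF_SOUND = 343.0   # m/s in air at roughly room temperature

def woodworth_itd(azimuth_deg: float) -> float:
    """Interaural time difference (seconds) for a far-field source.

    Woodworth's spherical-head approximation: ITD = (a / c) * (theta + sin(theta)),
    where a is head radius, c is the speed of sound, and theta is the source
    azimuth in radians. Positive azimuth = source to the listener's right;
    positive ITD = the sound reaches the right ear first.
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# The point of the sketch: this cue is, by definition, a *difference*
# between two ears. A source straight ahead (0 degrees) yields zero ITD,
# while a source at 90 degrees yields roughly 0.66 ms -- and with hearing
# in only one ear, that difference never exists for the brain to read.
```

For example, `woodworth_itd(90)` comes out to about 0.00066 seconds, which is in the right ballpark for the largest ITDs humans experience.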

One potential solution to the auditory issue might be a visual representation of “relevant” sound sources, but in a medium where screen real estate is already at a premium, the visual clutter introduced by this feature would require a ground-up redesign of any user interface. Maybe that’s the point, though, and I’ll leave on this note to force myself to consider it a little longer: if accessibility concerns require starting from the ground up in order to accommodate them, maybe we need to start from the ground up in the first place. After all, handrails, elevators, and ramps are now included in the blueprints for newly constructed buildings. Maybe it’s time to include these concerns at the beginning of the design process, rather than as an afterthought.