As a highly visual medium, VR poses a challenge to developers committed to creating inclusive, accessible content, and it is a challenge Microsoft has embraced. The company recently released a suite of emerging technologies that could provide VR access to people with visual impairments, including those who are completely blind.
SeeingVR expands VR access to people with limited vision
A Microsoft “Ability” research team presented the SeeingVR set of tools at the CHI 2019 conference in May. These 14 tools “enhance a VR application for people with low vision by providing visual and audio augmentation,” according to the team’s paper.
Nine of the SeeingVR tools can be overlaid on any Unity-based VR application, providing improved access to entertainment, eLearning, and other immersive experiences. The remaining five tools require developer integration via a Unity toolkit, which is available on GitHub.
The goal was “to improve the accessibility of mainstream VR for people with low vision instead of creating specialized VR experiences,” the team wrote, citing lower cost, greater availability, and improved social acceptability of conventional VR apps.
The team first tested four VR experiences with a user group whose members had a broad variety of visual limitations. People with low vision have significant vision loss that cannot be corrected with glasses; the impairment varies widely and can include issues with color, contrast, peripheral vision, distance, and more. No single adaptation can meet all low-vision users’ needs, so the team developed a flexible, user-controlled set of tools.
Add audio and visual feedback to VR environment
SeeingVR includes nine ways for users to adapt content with visual and/or audio feedback. These tools do not require any built-in support or changes in code:
- Magnification lens: Users can magnify the central part of the screen’s visual field by up to 10x.
- Bifocal lens: Users can magnify part of the screen, such as text, while retaining the virtual scene.
- Brightness lens: Users can adjust the brightness of a virtual scene.
- Contrast lens: When activated, the lens enhances the difference in brightness between objects in the scene.
- Edge enhancement: When activated, this tool increases contrast by adding bright edges to objects.
- Peripheral remapping: When activated, this tool projects an overlay into the user’s visual field, showing the contours of the entire scene. This provides information that is outside of the visual field of a person with peripheral vision loss.
- Text augmentation: Users can change text color and contrast with background, increase text size, and make text bold. This tool also converts all text to an easy-to-read font.
- Text to speech: Users can point a virtual laser to text to have it read aloud.
- Depth measurement tool: Users can gauge the distance of an object in the virtual scene using a virtual laser with a ball at the end; the length of the laser trail indicates the distance. The laser trail’s color is adjustable to suit the background color and the user’s preferences.
These nine tools will be available to users as a plugin that can augment any Unity-based VR app.
Unity toolkit enables built-in usability enhancement
Five additional tools work only if the developer has included support, via the open-source Unity toolkit Microsoft developed:
- Object recognition: Much like alt text on images, if the developer has included a description, the tool reads it aloud when the user points to the object.
- Highlight: If the developer has identified key objects in a scene, the highlight tool adds colored contours around them to attract the user’s attention.
- Guideline: If an identified important object is outside the user’s field of vision, the guideline tool draws a line from the center of the vision field to the item, including curving behind the user to point to an object that is behind them; the user can follow the line to find the item.
- Recoloring: If the developer has identified significant objects, the recoloring tool ensures that key items close to each other are colored with contrasting colors; it also simplifies complex textures.
- Use with assistive apps: Facsimiles of two popular assistive apps, SeeingAI and VizWiz, were tested as SeeingVR tools; the SeeingVR tool sends a screen capture of the virtual scene to the app. SeeingAI verbally describes the scene, while VizWiz sends the capture, along with the user’s question, to a human who responds in real time.
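As a rough illustration of the developer-integration model, the object-recognition tool amounts to a lookup of developer-supplied descriptions at the object the user points to. The classes and method names below are hypothetical, not the actual Unity toolkit API (which is C#):

```python
from dataclasses import dataclass, field

@dataclass
class AccessibleObject:
    """A scene object the developer has annotated with a spoken
    description. Names here are illustrative, not SeeingVR's API."""
    name: str
    description: str

@dataclass
class Scene:
    objects: dict = field(default_factory=dict)

    def annotate(self, name, description):
        # Developer opt-in: only annotated objects get audio descriptions,
        # which is why these five tools require toolkit integration.
        self.objects[name] = AccessibleObject(name, description)

    def describe(self, pointed_at):
        """Return the description for the object the user's virtual
        laser is pointing at, or a fallback if none was provided."""
        obj = self.objects.get(pointed_at)
        return obj.description if obj else "No description available."

scene = Scene()
scene.annotate("door_01", "A wooden door leading to the hallway.")
scene.describe("door_01")  # the developer's description
scene.describe("lamp_03")  # fallback: object was never annotated
```

The same registry pattern could back the highlight, guideline, and recoloring tools, since each depends on the developer marking which objects matter.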
Building in support for accessibility tools makes software, whether eLearning or entertainment focused, more usable for many users. “People zoom in on things all the time on their phones—they aren’t blind, but they need to do it if they want to learn something. If we take that control away from the user, or don’t design for it, we’re taking away the opportunity to learn,” Brian Dusablon told Jane Bozarth in a recent Guild research report, Creating Accessible eLearning: Practitioner Perspectives.
Users control all of the tools via voice or menu commands, and can select colors, font sizes, lens positions, and more to customize their VR experience. A short demo of the tools is available in this YouTube video.
In testing, users with varied visual limitations found that the tools improved their experience and let them perform common VR tasks, such as shooting at targets, picking up items, and choosing menu items, more quickly and easily. Some reported that the tools made the VR environment friendlier and more enjoyable, though others noted that certain tools might alter the aesthetics enough to reduce the sense of immersion.
Explore VR
Virtual reality is becoming a feasible platform for more training and performance support uses. Explore the use of VR for training and higher education at 2019 Realities360 Conference & Expo, June 25–27, in San Jose, California. This eLearning Guild event offers plenty of hands-on experience and opportunities to talk with people who are building and using augmented and virtual reality in their L&D work.