Cydalion exemplifies innovation: The new augmented reality app from Float departs from the common understanding of augmented reality, embodies a new model of mobile eLearning-based performance support, and makes navigating an unfamiliar environment a whole lot easier for millions of people who are blind or have low vision.
The pioneering app uses sound and vibration to augment users’ understanding of their environment and inform them of obstacles. Cydalion takes its name from Cedalion, a character in Greek mythology who stood on the shoulders of the blind hunter Orion to guide him.
The term augmented reality, or AR, often refers to a visual overlay of information on top of an individual’s view of the world; many people first heard about AR when the mobile game Pokémon Go exploded into popularity in July 2016. But augmented reality does not have to be visual.
Guild Master Chad Udell, managing director at Float and a member of the team that designed and developed Cydalion, offers an alternative definition of augmented reality: “All it needs to do is take in information about the real world and then add an overlay and display it in a different and new way.” Cydalion does exactly that. “The display is actually an audio overlay,” he said.
About 285 million people worldwide—including 6.8 million Americans—have a vision-related disability, according to 2014 data from the US Census and the World Health Organization. Less than 2 percent of American adults who are blind use guide dogs—Guiding Eyes for the Blind, in Yorktown Heights, New York, estimates that there are about 10,000 guide dog teams working in the US—meaning that millions of people in the US alone navigate using a cane or other assistive device. One drawback of a cane is that it doesn’t detect overhead obstacles, like a tree branch or a low-hanging light fixture, putting the user at risk of bumping into these obstacles. Cydalion aids users in detecting overhead obstacles as well as items that they might trip over or crash into, enabling them to move more freely through the world. When paired with bone-conduction or standard headphones that allow users to simultaneously hear Cydalion’s feedback and monitor environmental sound, Cydalion can unobtrusively increase users’ safety and confidence as they traverse crowds, traffic, and other constantly changing environments.
Cydalion runs on a Tango-compatible device, ideally worn or held at chest level. (Currently, the only Tango-compatible consumer device is the Lenovo Phab 2 Pro smartphone. A Tango developer kit is also available. Cydalion is available for purchase in the Google Play Store.)
Navigating by ear
Cydalion works with Google’s Tango technology, which uses a combination of hardware and software to map and “visualize” the user’s environment. Tango devices use multiple cameras and sensors, including a wide-angle “fisheye” camera and a depth camera, to get accurate images of three-dimensional objects and distinguish items from their backgrounds.
“[Tango has] additional software hooks to tap into new or different sensors that are on board these devices: some additional cameras; sensor technologies including infrared, depth perception; a really nice wide-angle fisheye lens; and a new and improved inertial measurement unit (IMU) chip, so it has really good, precise position data,” Udell said. “So the tablets, smartphones, etc., that are enabled with Tango sensors are much more aware of the environment that they operate in than the traditional devices.”
The sensors and cameras gather data points from around any object they detect, using a process called computer vision. Computer vision uses algorithms to outline objects in an image and separate them from one another, say, distinguishing a box on the floor from the backdrop of the room’s walls or from other objects on the floor. It does not rely on GPS or any external signal to determine the device’s position relative to other objects in the environment; instead, Tango uses cameras and sensors to bounce infrared beams off of objects and produce three-dimensional images of an object or environment. Computer vision is used in autonomous vehicles and medical imaging, and now as a navigational aid for low-vision or blind pedestrians.
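To make the depth-sensing step concrete, here is a minimal Kotlin sketch of how a depth image can be back-projected into a three-dimensional point cloud. It is not Cydalion’s or Tango’s actual code; the DepthCamera class, the camera intrinsics, and the assumption that depth arrives as meters per pixel are all illustrative.

```kotlin
// Minimal sketch: back-projecting a depth image into a 3-D point cloud.
// The intrinsics (fx, fy, cx, cy) and depth units are assumptions; a Tango
// device supplies its own calibrated values and point-cloud API.

data class Point3(val x: Float, val y: Float, val z: Float)

class DepthCamera(
    private val fx: Float, private val fy: Float,  // focal lengths in pixels
    private val cx: Float, private val cy: Float   // principal point in pixels
) {
    /**
     * Converts a row-major depth image (one depth value, in meters, per pixel)
     * of size width x height into 3-D points in the camera's coordinate frame.
     * Pixels with no depth reading (zero or negative) are skipped.
     */
    fun toPointCloud(depth: FloatArray, width: Int, height: Int): List<Point3> {
        val cloud = ArrayList<Point3>(depth.size)
        for (v in 0 until height) {
            for (u in 0 until width) {
                val z = depth[v * width + u]
                if (z <= 0f) continue              // no return from the sensor
                val x = (u - cx) * z / fx          // standard pinhole back-projection
                val y = (v - cy) * z / fy
                cloud.add(Point3(x, y, z))
            }
        }
        return cloud
    }
}
```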
“We take that data and turn it into a ‘point cloud’ of the object and translate that point cloud into a nonvisual interface; we are using audio as a sensory substitution. Sight becomes tones, essentially,” Udell said, comparing navigation using this sound information to using echolocation or sonar.
Cydalion “has a library of different sounds for objects that are on your left or in front of you or on your right,” he said. Based on the position, height, and proximity of the object, the app plays a combination of sounds and provides haptic cues, letting the user know where potential obstacles are located. Haptic feedback is perceived using the sense of touch; Cydalion causes the phone to vibrate if the haptic option is turned on. Vibration or tone intensity and speed vary according to the proximity of the objects. Users can configure the vibration and tones. The user interface is customizable, too; users with low vision or colorblindness can choose a configuration that is easier for them to read.
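As a rough illustration of that sensory substitution, the Kotlin sketch below (reusing the Point3 type from the earlier sketch) groups the nearest obstacle into a direction and height zone and scales vibration strength by proximity. The zone boundaries, sound identifiers, thresholds, and axis conventions are invented for illustration; they are not Cydalion’s actual values.

```kotlin
// Rough sketch of sensory substitution: pick the nearest obstacle point,
// classify it into a direction and height zone, and scale vibration by
// proximity. Assumes a frame with +x to the user's right, +y up, and +z
// forward from the chest-worn device; real camera frames may differ.

enum class Direction { LEFT, CENTER, RIGHT }
enum class Height { LOW, TORSO, OVERHEAD }

data class Cue(val soundId: String, val vibrationStrength: Float)

fun cueFor(points: List<Point3>, maxRange: Float = 3.0f): Cue? {
    // Consider only points within range, and react to the nearest one.
    val nearest = points.filter { it.z in 0.1f..maxRange }
        .minByOrNull { it.z } ?: return null

    val direction = when {
        nearest.x < -0.3f -> Direction.LEFT
        nearest.x > 0.3f  -> Direction.RIGHT
        else              -> Direction.CENTER
    }
    val height = when {
        nearest.y > 0.4f  -> Height.OVERHEAD   // e.g., a branch or light fixture
        nearest.y < -0.6f -> Height.LOW        // e.g., a box on the floor
        else              -> Height.TORSO
    }
    // Closer obstacles produce stronger vibration (0..1) and a more urgent tone.
    val strength = (1f - nearest.z / maxRange).coerceIn(0f, 1f)
    return Cue(
        soundId = "${direction.name}_${height.name}".lowercase(),
        vibrationStrength = strength
    )
}
```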
In addition to computer vision, Cydalion uses elements of machine learning, a type of artificial intelligence. The software can “understand” what it detects in the environment, including depth and distance, enabling Cydalion to respond appropriately.
Tango “solved” a problem common to many augmented reality apps: In Pokémon Go, for example, the Pokémon characters can “drift” as a player moves toward them, and the player has to keep finding them again. “Tango nullifies that drift so objects that are placed into a spot stay anchored in that spot,” Udell said. “It’s a technology known as area learning, and that’s what we actually use in Cydalion to establish some level of object permanence.” Thus Cydalion gives users an accurate sense of where they are in relation to those objects, even as the users move.
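The anchoring idea Udell describes can be pictured as storing obstacles in a fixed “area” frame rather than in the moving device’s frame. The simplified Kotlin sketch below assumes the device’s pose (a rotation and translation) is already known; it illustrates anchoring in general, not the Tango area-learning API, and reuses the Point3 type from the earlier sketch.

```kotlin
// Simplified sketch of "anchoring" obstacles in a fixed area frame. The device
// pose is assumed to be supplied by area learning; Tango's actual API is not shown.

class Pose(val rotation: Array<FloatArray>, val translation: FloatArray) // 3x3 R, 3-vector t

fun deviceToArea(p: Point3, pose: Pose): Point3 {
    val r = pose.rotation
    val t = pose.translation
    // p_area = R * p_device + t
    return Point3(
        r[0][0] * p.x + r[0][1] * p.y + r[0][2] * p.z + t[0],
        r[1][0] * p.x + r[1][1] * p.y + r[1][2] * p.z + t[1],
        r[2][0] * p.x + r[2][1] * p.y + r[2][2] * p.z + t[2]
    )
}

// Obstacles are converted to area coordinates once, at detection time, so they
// stay put as the device moves instead of drifting with it.
class ObstacleAnchors {
    private val anchored = mutableListOf<Point3>()

    fun anchor(detectedInDeviceFrame: Point3, currentPose: Pose) {
        anchored.add(deviceToArea(detectedInDeviceFrame, currentPose))
    }

    fun all(): List<Point3> = anchored
}
```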
Machine learning offers future directions for Cydalion
Cydalion fits into Float’s overarching goal of seeking new directions in eLearning, Udell said. While it obviously doesn’t fit the typical model of an eLearning “course,” it is a form of performance support: “You’re providing just-in-time information to somebody,” he said. “They can use it; it affects their behavior; and the outcome is more successful on the other end of it all.”
Outside-the-box thinking is essential to creating innovative eLearning. “Float’s heritage is building mobile learning and, by extension, performance support applications,” Udell said. “We’ve had a significantly different view of what eLearning is and what it can be, what the possibilities of all these types of cool technologies are, and what can happen with them when they’re used in different and new ways.”
The company’s research led the team to examine augmented reality. Lots of people look at AR and think of ways to use it in gaming and entertainment, Udell said, but the “level of utility and usefulness in this space, especially in terms of solving real-world problems, serious problems, seemed to be somewhat lacking.”
When studying the AR platforms available to them as application developers, Udell said, they started to wonder: “If these devices can see so much about the world and understand so much about the spaces that are around us, why couldn’t we maybe try to translate that into something that people could benefit from, people that live with blindness or low vision?”
That was the genesis of the Cydalion concept, but, Udell said, the initial conversations weren’t grandiose: “Comically, one of the things that we first thought about was using AR in the Tango platform to assist people with sending text messages while they walked down the street. But if we could use it for something as frivolous and goofy as that, why couldn’t we try and apply it to something real and meaningful and useful?”
Tango and Cydalion are self-contained; the sensors do not require any web or data connection. Future features might rely on data libraries that users would download, though, which could require at least a temporary connection. Udell said that future versions of Cydalion might add “wayfinding tools” that would help users locate doors, for example, or provide navigation inside buildings: directions to a specific shop in a mall, a courtroom, or an office, for example. This service, called micronavigation, essentially picks up where GPS leaves off. Cydalion could use stored information from the user’s previous visits to a place, or data that other users have uploaded to a Cydalion data library. If a user flies into Chicago’s O’Hare International Airport, for example, the app could recognize the location and offer the user directions to a gate or a specific restaurant.
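None of this wayfinding functionality exists yet, but the idea can be sketched: a downloaded data library maps a recognized area to named points of interest, and the app describes where a destination lies relative to the user. Everything in this Kotlin sketch, from the library format to the area IDs and point-of-interest names, is a hypothetical illustration (again reusing Point3).

```kotlin
// Hypothetical sketch of the "micronavigation" idea: once a learned area is
// recognized, look up points of interest from a downloaded library and describe
// where a destination lies relative to the user's current position.

data class PointOfInterest(val name: String, val position: Point3)

class AreaLibrary(private val poisByArea: Map<String, List<PointOfInterest>>) {

    fun describeRoute(areaId: String, destination: String, userPosition: Point3): String? {
        val poi = poisByArea[areaId]
            ?.firstOrNull { it.name.equals(destination, ignoreCase = true) }
            ?: return null                       // area or destination unknown

        val dx = poi.position.x - userPosition.x
        val dz = poi.position.z - userPosition.z
        val meters = kotlin.math.sqrt(dx * dx + dz * dz)
        val side = if (dx < 0) "to your left" else "to your right"
        return "${poi.name} is about ${"%.0f".format(meters)} meters away, $side."
    }
}
```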
Another possible future feature would use technology that Float is already working on: recognizing faces or objects. Using machine learning, it’s possible to “train” an application to recognize specific individuals from photos; if Cydalion implemented this feature, the app could alert a user when those individuals were nearby. It’s also possible that the app could be “taught” to recognize categories of items, like chairs, by performing what is essentially high-powered pattern detection. With this information, Cydalion could help users find doors, trash cans, chairs, elevators—anything it had been taught to recognize. Those are long-term goals, according to Udell. “We do have the basics of those types of technology building blocks inside this application,” and the company is exploring the possibilities for future enhancements, Udell said. “The possibilities are amazing.”
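A simplified sketch of how such category recognition might plug in: an assumed, pre-trained recognizer labels what the camera sees, and the app surfaces only the categories the user has asked for within a few meters. The Recognizer interface, labels, and thresholds are assumptions, no particular machine-learning framework is implied, and Point3 is reused from the earlier sketch.

```kotlin
// Sketch of a possible recognition feature: a pre-trained recognizer (assumed)
// returns labeled detections, and the app filters them to the categories the
// user cares about that are within range.

data class Detection(val label: String, val confidence: Float, val position: Point3)

interface Recognizer {
    fun recognize(imageBytes: ByteArray): List<Detection>
}

fun findNearby(
    detections: List<Detection>,
    wanted: Set<String>,            // e.g., setOf("chair", "door", "elevator")
    maxDistanceMeters: Float = 5f,
    minConfidence: Float = 0.6f
): List<Detection> =
    detections.filter {
        it.label in wanted &&
        it.confidence >= minConfidence &&
        it.position.z in 0f..maxDistanceMeters
    }
```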