IBM Research and Carnegie Mellon University scientists are making strides in assistive technology that could affect the lives of blind people everywhere. They have designed an open platform to support the creation of smartphone apps that enable blind people to navigate their immediate environment.
IBM and CMU used the platform as a springboard to create NavCog, a pilot app that draws on existing sensors and cognitive technologies to inform blind people on the CMU campus about their surroundings by “whispering” into their ears through earbuds or by creating vibrations on smartphones.
The app analyzes signals from Bluetooth beacons located along walkways and from smartphone sensors to enable users to move without human assistance, whether inside campus buildings or outdoors. Researchers are exploring additional capabilities for future versions of the app, such as detecting who is approaching and what their mood is. NavCog is now available online and will soon be available at no cost on the App Store.
The first set of cognitive assistance tools for developers is available via the cloud through IBM Bluemix. The open toolkit consists of an app for navigation, a map editing tool and localization algorithms that can help the blind identify in real time where they are, which direction they are facing and additional surrounding environmental information. The computer vision navigation application tool turns smartphone images of the surrounding environment into a 3-D space model to help improve localization and navigation for the visually impaired.
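The beacon-based localization the article describes can be sketched in miniature: signal strength from nearby Bluetooth beacons is converted to estimated distances, and the user is placed near the closest beacon. This is a minimal illustration only; the beacon IDs, constants, and function names below are invented, not part of the actual NavCog toolkit.

```python
# Hypothetical sketch of RSSI-based beacon localization. Distances are
# estimated from Bluetooth signal strength with a log-distance path-loss
# model, then the user is placed at the nearest beacon's known position.
# All names and constants here are illustrative assumptions.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate distance in metres from a received signal strength (dBm)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def locate(readings, beacon_positions):
    """Return the position of the beacon with the shortest estimated distance.

    readings: list of (beacon_id, rssi_dbm) tuples
    beacon_positions: dict mapping beacon_id to an (x, y) position
    """
    nearest_id, _ = min(readings, key=lambda r: rssi_to_distance(r[1]))
    return beacon_positions[nearest_id]

beacon_positions = {"hall-01": (0.0, 0.0), "hall-02": (5.0, 0.0)}
readings = [("hall-01", -72), ("hall-02", -60)]  # stronger signal: hall-02
print(locate(readings, beacon_positions))  # → (5.0, 0.0)
```

A production system would refine this with trilateration across several beacons and smoothing over time, but the nearest-beacon rule conveys the basic idea of turning radio signals into a position.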
“While visually impaired people like myself have become independent online, we are still challenged in the real world. To gain further independence and help improve the quality of life, ubiquitous connectivity across indoor and outdoor environments is necessary,” said IBM Fellow Chieko Asakawa, a visiting faculty member at Carnegie Mellon.
Asakawa believes this open platform will accelerate the advancement of cognitive assistance research by giving developers opportunities to build various accessibility applications and test non-traditional technologies, such as ultrasonic and advanced inertial sensors, to assist navigation.
The combination of these multiple technologies is known as “cognitive assistance,” an accessibility research field dedicated to helping blind people gain information by augmenting missing or weakened abilities. Scientists will add various localization technologies, including sensor fusion, which integrates data from multiple environmental sensors for highly sophisticated cognitive functioning, such as facial recognition in public places. Scientists also are exploring the use of computer vision to characterize the activities of people in the vicinity and ultrasound technology to help identify locations more accurately.
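The sensor fusion mentioned above can be illustrated with one of its simplest forms, a complementary filter: a gyroscope's integrated heading is smooth but drifts over time, while a compass heading is noisy but drift-free, so blending the two yields a more reliable estimate. The weights and sensor values below are invented for illustration and do not come from the NavCog platform.

```python
# Hypothetical sketch of sensor fusion via a complementary filter.
# The gyroscope term tracks short-term rotation; the compass term
# corrects long-term drift. alpha controls how much each is trusted.
# All values are illustrative assumptions.

def fuse_heading(prev_heading, gyro_rate, dt, compass_heading, alpha=0.95):
    """Blend integrated gyro heading with a compass heading (degrees)."""
    gyro_heading = prev_heading + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_heading + (1 - alpha) * compass_heading

heading = 90.0
# One update: turning at 10 deg/s for 0.1 s while the compass reads 92 deg.
heading = fuse_heading(heading, gyro_rate=10.0, dt=0.1, compass_heading=92.0)
print(round(heading, 2))  # → 91.05
```

Real systems typically use more sophisticated fusion, such as Kalman filtering over many sensors, but the principle of combining complementary strengths is the same.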
“With our long history of developing technologies for humans and robots that will complement humans’ missing abilities to sense the surrounding world, this open platform will help expand the horizon for global collaboration to open up the new real-world accessibility era for the blind in the near future,” said Martial Hebert, director of the Robotics Institute at Carnegie Mellon.
IBM has been committed to technology innovation and accessibility for people with disabilities for more than 100 years, helping to ensure that employees, customers and citizens have equal access to the information they need for work and life. Some of its products include a Braille printer, a talking typewriter and the first commercially viable screen reader.