Could signglasses really change the world for deaf people? It is possible: a team of scientists has developed a system that projects sign language narration onto several types of glasses, including Google Glass.
Where can signglasses technology be useful to deaf students? Think about a planetarium. Ordinarily, deaf students are left in the dark when they visit a planetarium. With the lights off, they can’t see the ASL interpreter who narrates their tour of outer space. With the lights on, they can’t see the constellations of stars projected overhead.
To solve this problem, a group at Brigham Young University under the leadership of Professor Mike Jones launched the “Signglasses” project. The project is personal for Tyler Foulger and other student researchers because they were born deaf. In experiments, deaf children in the planetarium put on the glasses and watch the show while an interpreter appears on the glasses’ display.
“Having a group of students who are fluent in sign language here at the university has been huge,” Jones said. “We got connected into that community of fluent sign language students and that opened a lot of doors for us.”
The BYU team tests the system during field trip visits by high school students from the Jean Massieu School for the Deaf. The tests revealed that the signer should be displayed in the center of one lens. That surprised the researchers, who had assumed there would be a preference for video displayed at the top, the way Google Glass normally does it.
The potential for this technology goes beyond planetarium shows. The team is also working with researchers at Georgia Tech to explore signglasses as a literacy tool. How can it work as a literacy tool?
“One idea is when you’re reading a book and come across a word that you don’t understand, you point at it, push a button to take a picture, some software figures out what word you’re pointing at and then sends the word to a dictionary and the dictionary sends a video definition back,” Jones said.
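The lookup pipeline Jones describes can be sketched in a few lines of Python. Everything below is a hypothetical illustration, not the BYU or Georgia Tech team's actual software: the function names (`ocr_word_at`, `lookup_video_definition`) are invented, and the OCR step and video dictionary are replaced with toy stand-ins.

```python
def ocr_word_at(page_image, point):
    """Stand-in for the OCR step: figure out which word the reader is
    pointing at in the captured picture. Here the "image" is faked as a
    dict mapping (row, col) positions to words."""
    return page_image.get(point)

def lookup_video_definition(word, video_dictionary):
    """Stand-in for the dictionary service: return the URL of a sign
    language video defining the word, if one exists."""
    return video_dictionary.get(word.lower())

def handle_button_press(page_image, point, video_dictionary):
    """Full pipeline: snapshot -> identify word -> fetch video definition.
    Returns None if no word or no definition is found."""
    word = ocr_word_at(page_image, point)
    if word is None:
        return None
    return lookup_video_definition(word, video_dictionary)

# Toy data standing in for a captured book page and an ASL video dictionary.
page = {(0, 0): "The", (0, 1): "nebula", (0, 2): "glows"}
asl_videos = {"nebula": "https://example.org/asl/nebula.mp4"}

print(handle_button_press(page, (0, 1), asl_videos))
```

In a real system, `ocr_word_at` would run optical character recognition on the camera frame and `lookup_video_definition` would query a remote dictionary, but the control flow between the glasses, the camera, and the dictionary would follow this same shape.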
The results will be presented at the Interaction Design and Children (IDC) conference at Aarhus University in Denmark, June 17–20.