Signglasses Project Uses Modified Google Glass to Project Sign Language

Wednesday, May 28, 2014

A group of hearing-impaired university students and their professor have developed a system that displays sign language narration of planetarium shows on glasses worn by deaf students.




A planetarium show poses a particular problem for deaf people: when the lights go down, they can no longer see a sign language interpreter. That's why a group at Brigham Young University launched the "Signglasses" project. Professor Mike Jones and his students have developed a system that projects sign language narration onto several types of glasses – including Google Glass.

The project is personal for Tyler Foulger and a few other student researchers because they were born deaf.




“My favorite part of the project is conducting experiments with deaf children in the planetarium,” Tyler wrote. “They get to try on the glasses and watch a movie with an interpreter on the screen of the glasses. They're always thrilled and intrigued with what they've experienced. It makes me feel like what we are doing is worthwhile.”

By sheer coincidence, the only two deaf students ever to take Professor Jones' computer science class – Kei Ikeda and David Hampton – signed up just as the National Science Foundation funded Jones' Signglasses research. Soon after, the Sorenson Impact Foundation provided funding to expand the scope of the project.

“Having a group of students who are fluent in sign language here at the university has been huge,” Jones said. “We got connected into that community of fluent sign language students and that opened a lot of doors for us.”

The BYU team tests the system during field trips by high school students from the Jean Massieu School for the Deaf. One finding from the tests is that the signer should be displayed in the center of one lens. That surprised the researchers, who had assumed participants would prefer the video at the top of the lens, where Google Glass normally displays it. Instead, deaf participants preferred to look straight through the signer when they returned their focus to the planetarium show.

The potential for this technology goes beyond planetarium shows. The team is also working with researchers at Georgia Tech to explore Signglasses as a literacy tool. The glasses may help deaf children learn how to read.

“One idea is when you’re reading a book and come across a word that you don’t understand, you point at it, push a button to take a picture, some software figures out what word you’re pointing at and then sends the word to a dictionary and the dictionary sends a video definition back,” Jones said.
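The article doesn't describe an implementation, but the lookup flow Jones sketches – photograph the word, recognize it, query a dictionary, return a video definition – could be prototyped roughly as below. The function names and the dictionary entries are invented for illustration; the OCR step is stubbed out, and none of these names come from the Signglasses project itself.

```python
# Hypothetical sketch of the point-and-lookup flow: capture a word,
# recognize it, and fetch a sign language video definition for it.

# Toy dictionary mapping words to (invented) sign language video paths.
VIDEO_DICTIONARY = {
    "planetarium": "videos/asl/planetarium.mp4",
    "telescope": "videos/asl/telescope.mp4",
}

def recognize_word(captured_text):
    """Stand-in for an OCR step.

    A real system would run the captured photo through a text
    recognizer and pick out the word under the reader's finger;
    here we just normalize a string.
    """
    return captured_text.strip().lower()

def lookup_video_definition(captured_text):
    """Return (word, video_path); video_path is None if the word is unknown."""
    word = recognize_word(captured_text)
    return word, VIDEO_DICTIONARY.get(word)

word, video = lookup_video_definition("Planetarium")
print(word, video)  # planetarium videos/asl/planetarium.mp4
```

In a real deployment the dictionary lookup would likely be a network request to a video dictionary service rather than an in-memory table, but the shape of the pipeline is the same.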

Jones will publish the full results of the team's research in June at Interaction Design and Children. But his favorite part of the project happens after the test shows end, when the high school students just get to talk with his BYU students.

“They see deaf university students succeeding and doing cool stuff,” Jones said. “It’s really rewarding.”




SOURCE  Brigham Young University

By 33rd Square
