Dive Brief:
- Researchers at Brigham Young University are developing “Signglasses,” a system that projects sign language narration onto eyeglass lenses using Google Glass and similar wearable technology.
- The system was initially developed for deaf students attending planetarium classes, who could not see the American Sign Language interpreter when the lights were off and could not see the projected star constellations when the lights were on.
- The BYU team is also working with Georgia Institute of Technology researchers on other potential uses of Signglasses. One example: Students could use the tool as a video dictionary, projecting sign language definitions of words they encounter while reading a book.
Dive Insight:
The BYU team, led by Michael Jones, assistant professor of computer science, has tested the glasses with high school students from the Jean Massieu School for the Deaf. For the planetarium lectures, students preferred having the sign language interpreter projected in the center of one lens rather than at the top, so they could look through the interpreter’s image at the show behind it. Funding for the research has come from the National Science Foundation and the Sorenson Impact Foundation. Research results will be published in June at the Interaction Design and Children conference.