Posted by Neil Bennett and Michael Kan 01 November 2013
Microsoft's Kinect-based sign language translator could help deaf and hearing people communicate more easily
Earlier this week, Microsoft's research arm showed off a project in Asia that could have a positive impact on the lives of deaf people – and those with any degree of hearing loss – who use sign language. The project uses data captured by the Kinect motion control camera, usually attached to Microsoft's Xbox 360 console, and translates signed gestures into words – which can be read or spoken to assist communication with others. It could even let sign language users issue commands to computers or other devices by signing.
The reverse is also possible: words spoken by a hearing person can be translated into sign language, performed by a digital avatar on a screen (or as a holographic projection).
Watch Microsoft's video above to see the tech in action.
We asked Kevin Taylor, technology development manager at disability charity Action on Hearing Loss (formerly known as the RNID), whether the technology being developed by the project would be useful to people who use sign language.
"Action on Hearing Loss welcomes the development of any new technology that can improve communication between deaf and hearing people," he says.
"The Kinect-based sign language translation, which enables sign language to be translated into spoken text, has the potential to succeed, and we are delighted that there is investment in new technologies to help bridge the communications gap between the hearing and deaf world."
Kevin says he could see sign-language translation technology helping with information kiosks, reception areas and meetings in the near future. Looking beyond that, he says that it could provide automated live-signing for TV and events.
"It is also another fine example of how technology used in one application, in this case gaming, can be developed for other uses," he says.
Microsoft Research began collaborating on the project with the Chinese Academy of Sciences and Beijing Union University in February 2012. Microsoft Research program manager Wu Guobin had tried using video cameras and digital gloves before settling on the Kinect, but both turned out to be a lot more expensive. After about 18 months of development, the Kinect translator can now recognize 370 of the most popular words in Chinese Sign Language and American Sign Language, and British Sign Language shouldn't prove much more of a challenge than these.
The research team hopes to collaborate with more experts in the field and is also surveying deaf people to find the best use cases for the Kinect translator.
It's thought that the system could help deaf users make presentations to audiences who don't know sign language. A deaf person working at an information kiosk could also communicate more easily with visitors who need help.
Wu said it is not known when the technology will come to market. Microsoft is still working on improving the recognition technology, and needs to expand the sign language vocabulary the system recognizes.
"I think it's been great. In a year and a half, we have already developed the system prototype," Wu said. "The results have been published in key conferences, and other researchers have said the results are very good."