Robotica debuts sign language AI
UK start-up Robotica has debuted its sign language AI.
The AI auto-translates to British Sign Language (BSL) and Makaton. The sign language can be created live, although this is currently just for straightforward applications such as weather and railway announcements. Anything more complex has to be created, rendered and applied in post-production.
The eventual aim for Robotica is to be able to provide live sign language for a wider range of content, beginning with children’s programming.
For now, the customer delivers the AV content and transcription to Robotica, which uses machine learning and human input to generate the sign language translation.
The avatar is then rendered out and the output QC’ed with registered sign language specialists to ensure its accuracy. It is then re-rendered if necessary. It currently takes around 4-8 hours to create the BSL avatar for a one-hour production.
Robotica was established two years ago and has developed avatars for British Sign Language and Makaton. It is now developing AI to sign American, Italian and other sign languages, as well as visual signing systems such as Cued Speech.
Robotica has motion captured the sign language actions of its AI avatars, recording around 8,000 English words, which equates to around 2,000 BSL words and concepts.
The BSL avatar is stylised to look quite human-like, whereas the Makaton avatar is designed to appeal to children. The avatars are on version 7, with the enhancements mostly focusing on providing clearer hand signs and facial expressions.
Robotica says it’s the first company globally to use broadcast-standard AI signing avatars for television programming. It describes its BSL avatars as “ultra-realistic, human-like digital signers”.
The technology enables the company to translate content into sign language at scale.
Sign languages are especially complex to create as they don’t share grammar or concepts with their spoken-language equivalents.
For many deaf people, reading English can be difficult or impossible, and subtitles and audio description may be of no help. “For children in particular, subtitles just don’t work,” says Catherine Cooper, a deaf culture consultant. “We need sign language on TV as that’s the language we think and speak.”
Robotica says the broadcast sign language industry is expected to be worth £366 million within five years.
Robotica CEO Adrian Pickering said: “There’s a global shortage of sign language translators and interpreters. It’s a tough job and takes years to learn. Even if there were a hundred times as many translators, there still wouldn’t be near enough to meet the demands of a content-hungry digital world. Last year, the BBC released 28,000 hours of new content. Every single hour, tens of thousands of new web pages are crafted, 30,000 hours of new videos are uploaded to YouTube. The only way that sign language users can gain equality of access to information and entertainment is with machine translation.”
Robotica co-founder Michael Davey added: “Human translations will always be first choice. Anything that can be signed by human interpreters should be signed by human interpreters. You don’t want a computer giving your diagnosis or reporting a disaster. There will always be a need for empathy, for the personal touch. We’ll just translate everything else.”