Antwerp students design robotic interpreter for the deaf
By designing a robotic arm, three students from Antwerp University have created something that could one day grow into a fully fledged interpreter.

Figure of speech

For their dissertation, three Antwerp students have designed and built a robotic arm that has mastered sign language. Attach the arm to a lifelike robot, and you have a fully robotic interpreter for the deaf.

Stijn Huys, Jasper Slaets and Guy Fierens created the device as part of their Master’s studies in electromechanical engineering. It was Huys (pictured, second from right) who came up with the idea. “For my thesis I really wanted to create something that helps people,” he said. “Through a friend I learned that it’s really difficult to enrol in sign language courses in Flanders as there aren’t many available places. So I decided to sink my teeth into that.”

The robotic arm, which they named Aslan, can express every letter of the Flemish sign language alphabet and can sign every number from zero to nine. Official Flemish Sign Language is used by about 6,000 people in the region.

So how does Aslan do the job? “Aslan knows the complete Flemish manual alphabet,” explains Huys. “That means it expresses words and sentences by spelling out every letter. Of course, using this method, it would take ages to express just one sentence. That’s why the manual alphabet is only a small part of sign language. Deaf people use it to express proper names or words that have no separate sign. In sign language, whole words have separate signs.”
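The fingerspelling approach Huys describes can be pictured as a simple lookup: each letter or digit maps to a predefined hand pose, and a word is expressed one pose at a time. The sketch below is purely illustrative; the pose names and the structure are assumptions, not Aslan’s actual software.

```python
# Illustrative sketch of fingerspelling: each letter of the manual
# alphabet and each digit 0-9 maps to one predefined hand pose.
# Pose names here are hypothetical placeholders.
ALPHABET_POSES = {letter: f"pose_{letter}" for letter in "abcdefghijklmnopqrstuvwxyz"}
DIGIT_POSES = {str(d): f"pose_digit_{d}" for d in range(10)}

def fingerspell(text):
    """Return the sequence of hand poses needed to spell out `text`."""
    poses = []
    for ch in text.lower():
        if ch in ALPHABET_POSES:
            poses.append(ALPHABET_POSES[ch])
        elif ch in DIGIT_POSES:
            poses.append(DIGIT_POSES[ch])
        # Characters with no sign (spaces, punctuation) are skipped.
    return poses

print(fingerspell("Aslan"))
# One pose per letter: pose_a, pose_s, pose_l, pose_a, pose_n
```

This also makes the drawback Huys mentions concrete: a sentence of a few dozen characters needs a few dozen sequential arm movements, which is why fingerspelling is reserved for proper names and words without their own sign.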

Aslan is still a prototype, a first step towards a full robotic interpreter. “In the coming years, we want to add another arm so that Aslan can extend his current manual alphabet with a real vocabulary,” he explains. “In the end, we even want to add facial expressions, maybe by attaching a monitor. Facial expression is very important in sign language, as it acts as the intonation that we put in our sentences to give them context and feeling.”

Ideal scenarios

The team also wants to develop a new interface for the interpreter. “Now we key in every letter separately on an attached computer,” says Huys. “Later we want to put in whole sentences and let the computer split them up into words. We also want to add software that transforms the grammar of spoken Flemish into that of Flemish Sign Language.”
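The planned interface could be sketched as a small pipeline: accept a whole sentence, split it into words, and hand each word on for signing. The grammar transformation Huys mentions is represented here only by a placeholder comment; everything in this sketch is an assumption about how such software might be shaped, not the team’s actual design.

```python
# Hypothetical sketch of the planned interface: a sentence is split
# into words, which would then be signed one by one. Real software
# would also reorder and adapt the words to Flemish Sign Language
# grammar; that step is left as a placeholder here.

def to_sign_sequence(sentence):
    """Split a spoken-language sentence into words in signing order."""
    words = sentence.lower().split()
    # Placeholder: apply Flemish Sign Language grammar rules here,
    # rather than keeping spoken-Flemish word order.
    return words

print(to_sign_sequence("The robot spells every word"))
```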

We want to add another arm so Aslan can extend his manual alphabet with a real vocabulary

- Stijn Huys

In an ideal scenario, a robotic interpreter would be equipped with speech recognition so that it can really be used as an interpreter, alongside a speaker. “I don’t see any difficulties that can prevent us from doing this,” says Huys. “The software we’re using now is already fast enough to keep up with spoken language, and Aslan is quick enough to move its arm with the rhythm of the speaker.”

The team has a number of ideas for how Aslan would be put to work. “When you have a class of schoolchildren with only one or two children who are deaf or hearing impaired, the robot could help diminish the gap between them and the rest of the pupils by simultaneously translating the teacher’s words,” he says.

It’s also a lesson in mechanics. “It would be nice for the children to put the robot arm together themselves. We’ve developed it as a collection of pieces that are 3D printable and that click together.”

Hospitals could also benefit from engaging a robotic interpreter, says Huys. “Doctors or nurses who work in a department where patients with hearing disorders are treated often don’t have the time to learn sign language themselves. So a robot could facilitate the communication with patients.

“A lot of people think that deaf people can read and understand doctors through written text. But that’s often not the case. We learn to read by coupling sounds to letters, and obviously, for people who are deaf from birth, that’s not possible.”