Till a few months ago, college for them was just a ‘four-year grind’ that would, at the end of it, land them a plush job.
But something changed the day engineering student Subramanian Muthukrishnan and his classmates at Anna University saw their juniors shout at a waiter in the college canteen. “We were upset to see the students shouting at the poor waiter who is hearing and speech impaired,” says Muthukrishnan.
The waiter had apparently been slow in getting their food, resulting in an argument with the students. “The students were angry and told him they were leaving as it was getting late. He, on the other hand, was trying to tell them that the idlis they wanted were over and so, they should order something else,” says Muthukrishnan.
“We intervened, and since we were familiar with the waiter, having interacted with him over the years, we were able to explain the situation to both parties.”
This incident prompted him and his friends, Srithar Ramadoss, Sivalingam Ramasamy and Vinoth Kumar Aribalan, to develop a translator that would interpret the expressions and gestures of a person with hearing and speech disability.
It was the sheer thrill of doing something that would not just earn them accolades, but also make a positive change to the lives of a few people, that egged them on, the students say.
“When I began college, I was a little lost. But it made me realise the importance of communication,” says Muthukrishnan, who hails from Tiruchi.
On hearing the idea, their mentors and project guides were excited and told them that no such project had been discussed in the past 20 years of research. But it wasn't easy and days of struggle bore little fruit. They tasted success when their project was selected by Microsoft at Imagine Cup 2012, a student technology competition that focuses on finding solutions to daily problems.
The translator already has a database of over 500 expressions used in sign language. “Since there is no standard sign language and there are several local variants, we are trying to include as many words as possible,” says Srithar. The ready-to-deploy tool is already being used by five employees who work as helpers on the college campus. “Their feedback determines its usability,” he says.
Handling a project like this is not easy, as it requires huge investment. High-end tools, like the ones they used at Microsoft, are extremely expensive, says Srithar.
“Unless there is extensive funding, it is difficult to attempt such a project,” he says. The challenge now is to work on accurately interpreting facial expressions, Srithar says. “Our aim is to make the tool useful and accurate. If it is not accurate, it will only lead to more misunderstanding,” says Muthukrishnan.