Automatic sign language tool can translate gestures into 'readable language'
For years scientists have worked to find a way to make it easier for deaf and hearing impaired people to communicate.
And now it is hoped that a new intelligent system could be about to transform their lives.
Researchers have used image recognition to translate sign language into 'readable language' and while it is early days, the tool could one day be used on smartphones.
Scientists from Malaysia and New Zealand came up with the Automatic Sign Language Translator (ASLT), which can capture, interpret and translate sign language.
It has been tested on gestures and signs representing both isolated words and continuous sentences in Malaysian sign language, with what they claim is a high degree of recognition accuracy and speed.
Its creators say that it has the potential for use in multiple languages.
The tool uses image processing and pattern recognition to translate actions into words.
Life could get a little easier for visually impaired people too, as scientists from Georgia Tech have created vibrating gloves (pictured) which help people learn to read Braille more easily
AND VIBRATING GLOVES MAKE LEARNING BRAILLE EASIER
Vibrating gloves have been designed that help people learn to read Braille more easily.
Scientists at Georgia Tech placed vibrating motors at the knuckle of each finger in the gloves.
When one of the motors vibrates, the wearer presses a corresponding key and the system tells them which letter they are typing.
The gloves were tested on people who had never learned Braille before.
They were then distracted by playing a game for 30 minutes, during which half of the participants' gloves kept buzzing, so those wearers continued to learn passively.
People wearing the gloves that kept buzzing made 30 per cent fewer errors than those who did not benefit from the passive haptic training.
They could read 70 per cent of a Braille phrase when tested, compared to those whose training stopped during the game, who could only read 22 per cent.
'At the heart of the ASLT are real-time image processing and computational intelligence methods,' said researcher Professor Rini Akmeliawati, of Malaysia's IIUM University.
'We developed a novel approach, leading to efficient detection and tracking of face, hands and upper body trajectories of a signer.
'By combining it with our tools for artificial intelligence-based matching between these sign trajectories and elements of a large database of images and video recordings of native signers, we have achieved a fast and flexible automatic sign language translation system.
'The system's potential lies in its technologically advanced algorithms and structure, which can be adapted to a multitude of the world's sign languages.'
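The matching step Professor Akmeliawati describes, comparing a signer's tracked trajectories against a database of recordings, can be illustrated with a minimal sketch. This is not the researchers' actual code: it assumes a hand trajectory has already been extracted as a list of (x, y) points, and uses dynamic time warping (DTW), a standard technique for comparing gesture sequences of different speeds, with a hypothetical two-word database.

```python
# Minimal sketch (assumed approach, not the ASLT's actual algorithm):
# compare an observed hand trajectory against labelled reference
# trajectories using dynamic time warping, and return the closest match.
from math import dist

def dtw_distance(a, b):
    """DTW distance between two trajectories (lists of (x, y) points)."""
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = dist(a[i - 1], b[j - 1])  # Euclidean distance between points
            cost[i][j] = d + min(cost[i - 1][j],      # skip a point in a
                                 cost[i][j - 1],      # skip a point in b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

def recognise(trajectory, database):
    """Return the label whose reference trajectory is closest to the input."""
    return min(database, key=lambda label: dtw_distance(trajectory, database[label]))

# Hypothetical reference trajectories for two signs.
database = {
    "hello": [(0, 0), (1, 1), (2, 2), (3, 3)],
    "thanks": [(3, 0), (2, 1), (1, 2), (0, 3)],
}
observed = [(0, 0), (1, 1), (2, 1.9), (3, 3.1)]
print(recognise(observed, database))  # → hello
```

A real system would of course match against thousands of recordings of native signers and combine hand, face and upper-body trajectories, but the principle, finding the nearest stored gesture, is the same.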
Everyday communication is a major challenge for a great many hearing-impaired people, as well as those unable to speak, all around the world.
Until now, systems devised to remove these barriers to communication have had limited capabilities in terms of target languages or ease of use.
According to the study in the Institution of Engineering and Technology's (IET) The Journal of Engineering, the early stage technology could be economical enough for mass production and for use on mobile devices.
The scientists believe that their creation will result in a portable, efficient and affordable ASLT for a wide variety of sign and written languages.