Now, a computer to lip-read and decode emotions
September 11th, 2012 - 5:59 pm ICT by IANS
Kuala Lumpur, Sep 11 (IANS) A computer is being taught to interpret human emotions through lip-reading, a development that could improve our interaction with machines and perhaps allow disabled people to use voice synthesizers more effectively and efficiently.
Karthigayan Muthukaruppan of Manipal International University in Selangor, Malaysia, and co-workers have developed a system that uses a genetic algorithm, one which improves with each use, to fit irregular-ellipse equations to the shapes of human mouths displaying different emotions.
They used photos of individuals from South-East Asia and Japan to train a computer to recognize the six commonly accepted human emotions - happiness, sadness, fear, anger, disgust and surprise - as well as a neutral expression.
The algorithm analyzes the upper and lower lips as two separate ellipses, the International Journal of Artificial Intelligence and Soft Computing reported.
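The article does not publish the team's actual method, so purely as an illustration, here is a minimal sketch of the general idea: a genetic algorithm that evolves ellipse parameters (centre and axes) to fit a set of lip-contour points. Every function name, parameter and value below is our own assumption, not the researchers' implementation.

```python
import math
import random

def ellipse_error(params, points):
    """Mean deviation of the points from the ellipse given by params."""
    cx, cy, a, b = params
    return sum(abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1)
               for x, y in points) / len(points)

def fit_ellipse_ga(points, pop_size=40, generations=60, seed=0):
    """Evolve ellipse parameters (cx, cy, a, b) to fit the given points."""
    rng = random.Random(seed)
    # Random initial population of candidate ellipses.
    pop = [[rng.uniform(-1, 1), rng.uniform(-1, 1),
            rng.uniform(0.5, 3), rng.uniform(0.2, 2)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: ellipse_error(p, points))
        survivors = pop[:pop_size // 2]          # keep the fitter half
        children = []
        for _ in range(pop_size - len(survivors)):
            mom, dad = rng.sample(survivors, 2)
            # Averaging crossover plus small Gaussian mutation.
            child = [(m + d) / 2 + rng.gauss(0, 0.05)
                     for m, d in zip(mom, dad)]
            child[2] = max(child[2], 0.1)        # keep axes positive
            child[3] = max(child[3], 0.1)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: ellipse_error(p, points))

# Synthetic "lip" contour: points sampled from an ellipse at the origin.
target = [(2.0 * math.cos(t), 0.8 * math.sin(t))
          for t in [i * math.pi / 10 for i in range(20)]]
best = fit_ellipse_ga(target)
```

In the reported system two such fits would be run per image, one per lip, and the resulting parameters would feed an emotion classifier; the sketch above only shows the fitting step.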
“In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers, especially in the area of human emotion recognition by observing facial expression,” the team explained, according to a Manipal University statement.
Earlier researchers had developed an understanding that allows emotion to be recreated by manipulating a representation of the human face on a computer screen.
However, the lips remain a crucial part of the outward expression of emotion. The team’s algorithm can successfully classify the six emotions and the neutral expression described.
The researchers suggest that an initial application of such an emotion detector might be to help patients who lack speech interact more effectively with computer-based communication devices.