Now, ‘Clone’ faces in real time to better understand human behaviour
June 1st, 2009 - 12:40 pm ICT by ANI
Washington, June 1 (ANI): Computer scientists at the University of East Anglia (UEA) have come up with a new technique to clone facial expressions in real time during live conversations, which they say may be helpful in understanding what influences people’s behaviour while they communicate with others.
Writing about their work in the journal Language and Speech, the researchers have revealed that their technique tracks in real time facial expressions and head movements during a video conference, and maps these movements to produce a ‘cloned’ face.
The researchers strongly believe that such facial expressions and head movements can be manipulated live to alter the apparent expressiveness, identity, race, or even gender of a talker.
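The mapping the researchers describe can be illustrated with a toy sketch: landmarks tracked on the source talker's face are expressed as offsets from that person's neutral pose, and the same offsets are re-applied to the target face's neutral pose to produce the 'cloned' expression. This is purely illustrative; the actual UEA system uses real-time statistical face models, not this simple point arithmetic, and all names and coordinates below are hypothetical.

```python
def expression_offsets(frame, neutral):
    """Offsets of each tracked (x, y) landmark from the neutral pose."""
    return [(x - nx, y - ny) for (x, y), (nx, ny) in zip(frame, neutral)]

def apply_offsets(target_neutral, offsets):
    """Re-apply the source talker's offsets to the target's neutral face."""
    return [(x + dx, y + dy) for (x, y), (dx, dy) in zip(target_neutral, offsets)]

# Source talker smiles: mouth-corner landmarks move outward and up.
source_neutral = [(40.0, 60.0), (60.0, 60.0)]   # mouth corners, neutral
source_frame   = [(38.0, 58.0), (62.0, 58.0)]   # mouth corners, smiling

# Target has a different (narrower) neutral face shape.
target_neutral = [(45.0, 65.0), (55.0, 65.0)]

cloned = apply_offsets(target_neutral,
                       expression_offsets(source_frame, source_neutral))
print(cloned)  # [(43.0, 63.0), (57.0, 63.0)]
```

Because only the offsets (the movement) are transferred while the target's own neutral shape is kept, the expression changes but the apparent identity does not, which is what lets the researchers vary expressiveness, identity, race, or gender independently.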
Dr Barry-John Theobald of UEA’s School of Computing Sciences, who developed the new facial expression cloning technique in collaboration with Dr Iain Matthews (Disney Research), Prof Steven Boker (University of Virginia) and Prof Jeffrey Cohn (University of Pittsburgh), says it is already being trialled by psychologists in the US to challenge pre-conceived assumptions about how humans behave during conversations.
Theobald has revealed that the new software has shown that people generally move their heads differently when speaking to women than when speaking to men, and that this is driven not by the conversational partner’s appearance but by the way he or she moves.
If a person appears to be a woman but moves like a man, others will respond with movements similar to those made when speaking to a man.
The team also reckons that their software may turn out to be beneficial for the entertainment industry where life-like animated characters might be required.
“Spoken words are supplemented with non-verbal visual cues to enhance the meaning of what we are saying, signify our emotional state, or provide feedback during a face-to-face conversation. Being able to manipulate these properties in a controlled manner allows us to measure precisely their effects on behaviour during conversation,” said Dr. Theobald, lead author of the new paper.
“This exciting new technology allows us to manipulate faces in this way for the first time. Many of these effects would otherwise be impossible to achieve, even using highly-skilled actors,” added Dr. Theobald. (ANI)