Humans can read robotic body language
March 23rd, 2009 - 2:29 pm ICT by ANI
London, Mar 23 (ANI): A robot's eyes may not be the windows to its soul, but they can certainly help humans guess the machine's intentions, according to a new study.
Researchers led by Bilge Mutlu at Carnegie Mellon University, Pittsburgh, have demonstrated that robots “leak” non-verbal information through eye movements when interacting with humans.
Mutlu said that humans constantly give off non-verbal cues and interpret the signals of others, without realising it at a conscious level.
With robots, however, these interpretive skills become useless: a machine that leaks no such information is virtually impossible to read.
The researchers tested strategies for improving robots' body language using a guessing game played between a human and a humanoid robot.
The robot is programmed to choose one object from around a dozen resting on a table, without making a move to actually pick it up.
The human must work out which object the robot has mentally selected through a series of yes-or-no questions.
The researchers enrolled 26 participants in the study, who took on average 5.5 questions to work out the correct object when the robot simply sat motionless across the table and answered verbally.
In the second trial, it answered in exactly the same way, but also swivelled its eyes to glance at its chosen object in the brief pause before answering two of the first three questions.
The participants needed fewer questions to identify the correct object (an average of just 5.0, a statistically significant improvement) when they came face to face with a robot "leaking" information.
When the experiment was repeated with the lifelike Geminoid with realistic rubbery skin, around three-quarters of participants said that they hadn't noticed the short glances.
Mutlu said that the improvement in scores suggested that they subconsciously detected the signals, reports New Scientist magazine.
According to him, the study suggests that people make attributions of mental states to robots just like they do to humans, although apparently only as long as the robot appears to be lifelike.
Experts have said that simply giving robots the ability to turn towards a user or nod during a conversation is important for improving the efficiency and quality of human-robot interactions.
Mutlu presented his work at the Human Robot Interaction 2009 conference in La Jolla, California, last week. (ANI)