Exposure to robots in movies affects our ability to empathise with them

January 23rd, 2010 - 2:12 pm ICT by ANI  

London, Jan 23 (ANI): A new study has shown that the level of exposure to robots is likely to affect a person’s ability to empathise with synthetic beings.

The researchers believe that the brain regions thought to underlie our ability to relate to one another may also drive our capacity to empathise with robots.

In humans and monkeys, the mirror neuron system (MNS) - a collection of neurons in various parts of the brain, including the premotor cortex and the primary motor cortex - fires both when you perform an action and when you watch someone else perform a similar action.

The MNS is thought to play a key role in creating empathy.

It allows people to transform the actions that they see others performing into internal representations.

Many researchers have investigated whether the MNS is activated when humans observe robots.

However the results have been inconclusive, with some experiments suggesting that the MNS reacts to robots and others concluding the opposite.

“This difference may be due to [differences in] the robot stimuli used in the experiments,” New Scientist quoted Sotaro Shimada of Meiji University in Tokyo, Japan as saying.

In the study, the researchers tested separately the effects on the MNS of the appearance of robots and of the way they moved (their kinematics).

They recruited twelve adults, aged 23, who were asked to watch computer-generated videos of either a humanoid robot or a person reaching out with the right hand to grasp an object.

The hand movements in each case were either smooth (human-like) or jerky and mechanical (robot-like).

The findings showed that the MNS was activated when the robot performed actions - but only when the actions were robotic, not when the robot’s motion was smooth and human-like.

For the virtual human, exactly the opposite was true - the MNS was activated when the movements were human-like, but not when they were robotic - and the contrast between these two scenarios was even greater.

According to the researchers, the MNS seems to respond only when robots move like robots and humans move like humans, not otherwise.

“The human brain detects the inconsistency between the kinematics and the appearance,” said Shimada.

He also said that whether the MNS responds to a robot might depend on the level of exposure a human has had to robots.

“I think experience of the observer may be crucial for the activation of mirror neurons,” he added.

The study appears in the journal Brain and Cognition. (ANI)
