We recognise faces from area around eyes: Indian American scientist
March 19th, 2009 - 4:10 pm ICT by IANS
Washington, March 19 (IANS) Recognising faces comes to us as easily as breathing, but how this is done has been a source of abiding mystery in neuroscience and psychology. Now an MIT study led by an Indian American says we recognise faces because a person’s eyes appear darker than the forehead and cheeks.
The study looks at a particularly striking instance of failure: our impaired ability to recognise faces in photographic negatives. It suggests that a large part of the answer might lie in the brain’s reliance on a certain kind of image feature.
The work could also inform the design of computer vision systems for settings as diverse as industrial quality control and object and face detection.
The results and methodologies could help researchers probe face-perception skills in children with autism, who are often reported to experience difficulties analysing facial information.
Anyone who remembers the days before digital photography has probably noticed that it’s much harder to identify people in photographic negatives than in normal photographs.
“You have not taken away any information, but somehow these faces are much harder to recognise,” said Pawan Sinha, an associate professor of brain and cognitive sciences at MIT and senior author of the study.
Sinha has previously studied light and dark relationships between different parts of the face, and found that in nearly every normal lighting condition, a person’s eyes appear darker than the forehead and cheeks.
He theorised that photo negatives are hard to recognise because they disrupt these very strong regularities around the eyes.
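The regularity Sinha describes is ordinal: the eye region's luminance is lower than that of the forehead and cheeks, and negation flips that ordering. A minimal sketch of such a check, using a synthetic grayscale "face" and hypothetical region coordinates (the function name and boxes are illustrative, not from the study):

```python
import numpy as np

def eyes_darker(img, eye_box, forehead_box, cheek_box):
    """Return True if the eye region's mean luminance is lower than both
    the forehead's and the cheeks' -- the ordinal regularity described
    in the article. img: 2-D grayscale array; boxes: (top, bottom, left, right)."""
    def mean_lum(box):
        t, b, l, r = box
        return img[t:b, l:r].mean()
    eye = mean_lum(eye_box)
    return bool(eye < mean_lum(forehead_box) and eye < mean_lum(cheek_box))

# Synthetic 100x100 "face": bright forehead and cheeks, dark eye band.
face = np.full((100, 100), 180.0)
face[30:45, 20:80] = 60.0  # darker eye region

print(eyes_darker(face, (30, 45, 20, 80), (5, 25, 20, 80), (50, 80, 20, 80)))   # True
print(eyes_darker(255 - face, (30, 45, 20, 80), (5, 25, 20, 80), (50, 80, 20, 80)))  # False
```

In the photographic negative (`255 - face`) the ordering inverts, which is exactly the disruption Sinha's theory points to.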
To test this idea, Sinha and his colleagues asked subjects to identify photographs of famous people in not only positive and negative images, but also in a third type of image in which the celebrities’ eyes were restored to their original levels of luminance, while the rest of the photo remained in negative.
Subjects had a much easier time recognising these “contrast chimera” images. According to Sinha, that’s because the light/dark relationships between the eyes and surrounding areas are the same as they would be in a normal image.
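A "contrast chimera" of this kind can be sketched as a full photographic negative with the eye region copied back from the original image. This is an illustrative reconstruction under the assumption of an 8-bit grayscale image and a hypothetical rectangular eye region, not the stimulus-generation code from the study:

```python
import numpy as np

def contrast_chimera(img, eye_box):
    """Photographic negative of an 8-bit grayscale image, with the eye
    region restored to its original luminance values."""
    chimera = 255.0 - img              # invert the whole image
    t, b, l, r = eye_box
    chimera[t:b, l:r] = img[t:b, l:r]  # restore the original eye region
    return chimera

# Synthetic 100x100 "face" with a darker eye band.
face = np.full((100, 100), 180.0)
face[30:45, 20:80] = 60.0

chim = contrast_chimera(face, (30, 45, 20, 80))
print(chim[35, 50])  # 60.0  -- eye region keeps its original (dark) value
print(chim[10, 50])  # 75.0  -- forehead is inverted (255 - 180)
```

The result preserves the normal light/dark relationship between the eyes and their surroundings while leaving the rest of the image in negative.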
Similar contrast relationships can be found in other parts of the face, primarily the mouth, but those relationships are not as consistent. “The relationships around the eyes seem to be particularly significant,” said Sinha.
Other studies have shown that people with autism tend to focus on the mouths of people they are looking at, rather than the eyes, so the new findings could help explain why autistic people have such difficulty recognising faces, said Sinha, according to an MIT release.
These findings appeared in the Proceedings of the National Academy of Sciences.