New computational model may reveal how brain makes sense of natural scenes
November 20th, 2008 - 2:15 pm ICT by ANI
London, November 20 (ANI): Researchers at Carnegie Mellon University say that they have developed a computational model that could help explain how the brain makes sense of the natural scenes that surround us.
Michael S. Lewicki, associate professor in Carnegie Mellon's Computer Science Department and the Center for the Neural Basis of Cognition, highlights the fact that a type of visual neuron called simple cells can detect lines and edges, but that the computation these cells perform is insufficient to make sense of natural scenes.
Given that variations in the foreground and background surfaces within a scene generally obscure edges, the researcher insists that more sophisticated processing is necessary to understand the complete picture.
However, little is known about how the visual system accomplishes this feat, according to the researcher.
Lewicki and his graduate student Yan Karklin report that their new computational model of this visual processing employs an algorithm that analyses the myriad patterns composing natural scenes and statistically characterizes them to determine which patterns are most likely to be associated with each other.
The researchers say that the response of their model neurons to images used in physiological experiments matches well with the response of neurons in higher visual stages.
Though complex cells, so called for their more complex response properties, have been studied extensively, the role they play in visual processing has remained elusive.
“We were astonished that the model reproduced so many of the properties of these cells just as a result of solving this computational problem,” Nature magazine quoted Lewicki as saying.
The researchers are of the opinion that gaining a deeper understanding of how the brain perceives the world may help improve computer vision systems. (ANI)