Neuroscience shows blind people can “see” a physical space by listening
When individuals lose one of their senses, their other senses often compensate. LiveScience reported on a study in which people who were born blind or lost their sight before the age of three "had heightened senses of hearing, smell, and touch compared to the people in the study who were not blind." Similarly, "people deaf from birth may develop a form of 'super-vision' to compensate for their lost hearing," according to The Telegraph.
One example of sensory compensation is when blind individuals master the skill of echolocation – the ability to "see" their environment by listening to the echoes of clicking sounds they make with their mouths. Using echolocation, these individuals can navigate many different environments, even while riding a bike or hiking in the woods.
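The underlying geometry is simple: the round-trip delay of each echo encodes the distance to a reflecting surface. As a hedged illustration (not part of the study, and with example values chosen purely for demonstration), here is that arithmetic in MATLAB:

```matlab
% Basic echolocation geometry: round-trip echo delay encodes distance.
c = 343;                   % speed of sound in air, m/s (at ~20 degrees C)
delay = 0.010;             % example: an echo returning 10 ms after the click
distance = c * delay / 2   % one-way distance to the reflector, ~1.7 m
```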
Last week's blog post focused on the sending side of the signal, discussing how these expert echolocators' clicking sounds share acoustic properties that differ from typical human speech. This week, the focus is on what happens when the sound waves return to the person. How do their brains process the received signal?
How the brain “sees” the clicks
New research from LMU Munich found that blind people not only hone their compensatory senses but actually use different parts of their brains to process the input. The researchers studied how sounds used for echolocation were processed in both blind and sighted people. They found that the blind participants' brains were activated in the region that processes sight: the sounds stimulated the visual cortex.
The study, published in The Journal of Neuroscience, monitored which areas of the brain were activated when human echolocators tried to determine the size of a virtual space. For their research, the team created a sound model of a room using MATLAB.
“In effect, we took an acoustic photograph of a chapel, and we were then able to computationally alter the scale of this sound image, which allowed us to compress it or expand the size of the virtual space at will,” stated Lutz Wiegrebe, a professor in the Department of Biology at LMU and lead author of the paper.
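The paper does not include the code, but the scaling operation Wiegrebe describes can be sketched in a few lines of MATLAB. In this hedged sketch, the file name chapel_ir.wav and all parameter values are hypothetical, not the researchers' actual data: time-stretching a measured room impulse response shifts every reflection delay proportionally, which is acoustically equivalent to scaling the room's dimensions.

```matlab
% Illustrative sketch: compress or expand a virtual room by rescaling
% a measured impulse response, then audition a click in that room.
[ir, fs] = audioread('chapel_ir.wav');   % measured impulse response (hypothetical file)
ir = ir(:,1);                            % use first channel only
scale = 0.5;                             % <1 compresses the virtual space

% Time-stretching the impulse response shifts all reflection delays
% proportionally, i.e., it rescales the room's dimensions.
irScaled = resample(ir, round(scale*1000), 1000);

% Synthesize a short mouth-click-like excitation: a decaying noise burst.
clickDur = 0.003;                        % ~3 ms burst
t = (0:1/fs:clickDur)';
click = randn(size(t)) .* exp(-t/0.0005);
click = click / max(abs(click));

% The echo the listener hears is the click convolved with the scaled IR.
echoSignal = conv(click, irScaled);
soundsc(echoSignal, fs);                 % play back for inspection
```

Playing the same click through impulse responses rescaled by different factors yields the family of "compressed" and "expanded" sound images the participants learned to discriminate.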
Participants were trained to listen for the differences in sound that corresponded to changes in room size. They made the mouth clicks while in an MRI machine, and the reverberations were played back to them through earphones. Functional MRI (fMRI) recorded which areas of their brains were active during the experiment. Image processing and data analysis were performed using SPM8.
While all subjects were able to use echolocation successfully to some degree to determine the size of the room, the region of the brain that reacted to the reflected sound differed between sighted and blind subjects. The sounds activated the motor cortex in the sighted individuals, prompting them to make more mouth clicks. In the blind subjects, the visual cortex was activated – they were “seeing” the changes in the room size.
“That the primary visual cortex can execute auditory tasks is a remarkable testimony to the plasticity of the human brain,” stated Wiegrebe.
The LMU researchers are using the information gained in this study to develop a training program that will help blind people learn to use tongue clicks for echolocation.