Behind the Headlines

MATLAB and Simulink behind today’s news and trends

Neuroscience shows blind people can “see” a physical space by listening

When individuals lose one of their senses, their other senses often compensate. LiveScience reported on a study showing that people who were born blind or lost their sight before the age of three “had heightened senses of hearing, smell, and touch compared to the people in the study who were not blind.” Similarly, “people deaf from birth may develop a form of ‘super-vision’ to compensate for their lost hearing,” according to The Telegraph.

One example of sensory compensation is when blind individuals master the skill of echolocation – the ability to “see” their environment by listening to the echoes of clicking sounds they make with their mouths. These individuals can navigate many different environments, even while riding a bike or hiking in the woods.

Last week’s blog post covered the sending of the signals, discussing how these expert echolocators’ clicking sounds have acoustic properties that differ from typical human speech. This week, the focus is on what happens when the sound waves return to the person: how do their brains process the received signal?

How the brain “sees” the clicks

New research from LMU Munich found that blind people not only hone their compensatory senses, they actually use different parts of their brains to process the input. The researchers studied how sounds used for echolocation were processed in both blind and sighted people. They found that the blind participants’ brains were activated in the region that processes sight: the sounds stimulated the visual cortex.

The study, published in The Journal of Neuroscience, monitored which areas of the brain were activated when human echolocators tried to determine the size of a virtual space. For their research, a sound model of a room was created using MATLAB.

“In effect, we took an acoustic photograph of a chapel, and we were then able to computationally alter the scale of this sound image, which allowed us to compress it or expand the size of the virtual space at will,” stated Lutz Wiegrebe, a professor in the Department of Biology at LMU and lead author of the paper.
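
The paper does not include the underlying code, but the rescaling step can be sketched in MATLAB. The idea: shrinking a room by a factor k shortens every echo path by k, so the room impulse response (RIR) can be compressed in time by resampling. The file name, variable names, and scale factor below are illustrative assumptions, not details from the study.

% Minimal sketch, assuming a measured binaural RIR is available on disk.
[rir, fs] = audioread('chapel_rir.wav');   % 2-channel (left/right ear) RIR

k = 0.5;                                   % scale factor: k < 1 shrinks the room

% In a room scaled by k, every reflection path is k times shorter, so
% echoes arrive k times sooner. Resampling the RIR to k times as many
% samples, then playing it back at the original rate, approximates this.
[p, q] = rat(k, 1e-6);                     % rational approximation of k
rirSmall = resample(rir, p, q);            % time-compressed impulse response

Note that this simple time-scaling also shifts the room’s resonances up in frequency, which is consistent with the physics of a smaller space.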

Sound model of a real enclosed space. A, Photograph of the physical room (Old St. Stephanus, Gräfelfing, Germany) with the head-and-torso simulator. B, Spectrograms of the left and right sound models. C, Changes in the size of the virtual room after it is compressed by factors of 0.7, 0.5, and 0.2, respectively. Bottom, Spectrograms of the left-ear room impulse response for the three compression factors. The color scale is identical to that of the second row. Image Credit: The Journal of Neuroscience, Wiegrebe, et al.
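
To get a feel for the spectrogram panels above, one could plot the left-ear impulse response at each of the paper’s compression factors. This continues the hypothetical sketch above and reuses its rir and fs variables.

% Rough illustration: spectrograms of the left-ear RIR at the original
% scale and at the three compression factors used in the study.
factors = [1 0.7 0.5 0.2];
for i = 1:numel(factors)
    [p, q] = rat(factors(i), 1e-6);
    h = resample(rir(:,1), p, q);          % left-ear RIR, compressed
    subplot(1, numel(factors), i)
    spectrogram(h, hann(256), 200, 512, fs, 'yaxis')
    title(sprintf('compression = %.1f', factors(i)))
end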

Participants were trained to listen for the differences in sound that corresponded to changes in room size. While lying in an MRI machine, they made the mouth clicks, and the reverberations were played back to them through earphones. A functional MRI (fMRI) recorded which areas of their brains were active during the experiment. Image processing and data analysis were performed using SPM8, a MATLAB-based software package.
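
The playback stage can be approximated by convolving a recorded mouth click with the compressed binaural RIR, so the listener hears their own click as if it were made in the smaller virtual room. Again a hedged sketch, reusing rirSmall from above; 'mouth_click.wav' is an assumed placeholder recording, not a file from the study.

% Auralization sketch: filter the click through each ear's compressed RIR.
[click, fsc] = audioread('mouth_click.wav');
click = resample(click(:,1), fs, fsc);     % mono click, matched to the RIR rate
wet = [conv(click, rirSmall(:,1)), ...     % left-ear signal
       conv(click, rirSmall(:,2))];        % right-ear signal
soundsc(wet, fs);                          % scaled playback through earphones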

Active versus passive echolocation. A, Behavior: subjects’ rating of the perceived room size, in both active (blue) and passive (red) echolocation, for the four room sizes. B, Neuroimaging: differential activations between active and passive echolocation show stronger motor activity during active echolocation. Image Credit: The Journal of Neuroscience, Wiegrebe, et al.

While all subjects were able to use echolocation with some success to determine the size of the room, the region of the brain that reacted to the reflected sound differed between sighted and blind subjects. In the sighted individuals, the sounds activated the motor cortex, prompting them to make more mouth clicks. In the blind subjects, the visual cortex was activated – they were “seeing” the changes in room size.

“That the primary visual cortex can execute auditory tasks is a remarkable testimony to the plasticity of the human brain,” stated Wiegrebe.

The LMU researchers are using the information gained in this study to develop a training program that will help blind people learn to use tongue clicks for echolocation.
