Researchers from Ontario have this week identified the parts of the brain involved in human echolocation. Traditionally associated with bats, whales and sonar, echolocation is a technique for locating objects by emitting sounds and listening for the echoes that bounce back off nearby surfaces.
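The basic principle can be illustrated with a little arithmetic (this calculation is purely illustrative and is not part of the study): a click travels to a surface and back, so the round-trip delay of the echo encodes the distance.

```python
# Illustrative sketch only: distance from echo delay.
# The click travels to the surface and back, so
# distance = speed_of_sound * delay / 2.

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def distance_from_echo(delay_seconds: float) -> float:
    """Distance in metres to a surface, given the click-to-echo delay."""
    return SPEED_OF_SOUND * delay_seconds / 2

# An echo arriving 10 ms after the click implies a surface about 1.7 m away.
print(distance_from_echo(0.010))
```

Delays this short are well below conscious perception of separate sounds, which is part of what makes human echolocation so remarkable.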
The study, performed by Lore Thaler and colleagues at The University of Western Ontario and the Rotman Research Institute, compared brain activity in two subjects: one blind since the age of thirteen months, the other blind since adolescence.
Both individuals use clicking sounds, made with their mouths, to glean information about their surroundings. Each could tell whether a panel placed before them was flat or concave, and whether it was angled 20 degrees to the left or right. Outdoors, they could tell whether they were standing in front of a car, a tree or a lamppost. The researchers first had to overcome a practical problem: echolocating is impossible inside an fMRI scanner, where it is noisy and the subject cannot move. Their solution was to record each subject's clicks and echoes with microphones placed at the ears, and then play the recordings back inside the machine.
Publishing in PLoS ONE, they found that in both blind subjects the calcarine cortex, an area of the brain normally dedicated to processing visual information in sighted people, showed greater activity when the subjects listened to recordings containing echoes, implying that they could extract something like visual information from the echoes. By contrast, activity in the part of the brain typically used for processing auditory information remained fairly constant whether the recordings contained echoes or not, so the subjects do appear to be seeing with sound rather than merely hearing it.