Some sounds, such as a speeding car or footsteps in a dark alley, actually improve our eyesight even before we are aware that we can hear them, according to research published in the journal Current Biology. This gives us cause to rethink the idea that hearing and vision are handled separately in the brain at the input stage.
Gregor Thut at the University of Glasgow, together with colleagues in Glasgow and Lausanne, performed a series of experiments to measure the excitability of the low-level visual cortex and see whether it was altered by hearing looming sounds. To do this, they used a technique called transcranial magnetic stimulation (TMS), which uses rapidly changing magnetic fields to induce small currents in neurons. This stimulation leads to the perception of flashes of light, a bit like those you see when you rub your eyes, in a process called phosphene induction.
In the presence of looming sounds, compared with control sounds, the perception of phosphenes was greatly and selectively enhanced, showing that these sounds do indeed alter the excitability of the visual cortex. The researchers did see some increase in excitability when volunteers listened to stationary (constant-volume) sounds, but looming sounds doubled the baseline phosphene perception. Strikingly, this increase in excitability occurred around 35 milliseconds before the volunteers were able to consciously discriminate the sound at all.
A follow-up experiment tested whether this effect was simply due to increasing intensity (the sound getting louder) or to the perception of an approaching sound source. Previous studies have shown that for a sound to feel like it is getting closer, rather than just louder, it needs to be structured rather than broadband. This means that if you generate white noise that gets increasingly loud, it won't feel like it is coming closer. Consistent with this, looming white noise increased visual cortex excitability, but only as much as the constant-volume sounds did.
This shows that visual perception can be boosted by other senses in a preperceptive way, before we consciously realise what we're hearing: the brain acts in a multisensory but stimulus-selective manner. This challenges the current model of the brain, and according to Thut:
“The study shows how models of brain organisation and perception need to be changed to include multisensory interactions as a fundamental component.”