Lip reading
Interview with Olivier Collignon
You’ve probably noticed that it’s much easier to comprehend what someone’s saying when you can watch their lips moving. This suggests that there are connections between the brain’s visual system and the auditory pathways that decode speech. So what happens to these connections in people who are born blind? The answer is that they’re enhanced. And if you track the activity in what would normally be the brain’s visual areas in these patients you can see flurries of nerve cell firing that follow the patterns of speech sounds when a person listens. Speaking with Chris Smith, Olivier Collignon...
Olivier - So one of the main objectives of this research is to try to understand how brain regions develop in the case of sensory deprivation. And the model we use is to work with early blind people, and more particularly to try to understand how the regions that are typically dedicated to vision, like the occipital cortex in sighted people, develop in early blind people.
Chris - So when a newborn baby first pops out, how well developed is the visual part of the brain in terms of its ability to see?
Olivier - So that is an excellent question. Actually, what we see is that the visual system is already highly organised at day one. As soon as the baby is born, it is born with an architecture in the occipital cortex - the back of the brain - that is dedicated to processing visual input. And this highly imprinted organisation led to the assumption that there was a genetic masterplan creating an occipital cortex dedicated to vision. But does that mean that this region stays like this in the case of sensory deprivation, for example? This is not the case, and this is what we observe. It's a phenomenon that we call cross-modal plasticity. When the regions in the occipital cortex do not receive their typical input - for example, visual input - they reorganise themselves to process non-visual inputs, like auditory or tactile inputs.
Chris - How do you know that they're processing those non-visual inputs?
Olivier - So if you work with animals, for example, you can implant microelectrodes in these regions and directly record brain activity, actually even single-neuron activity. And what you will see is that these neurons, which typically fire to flashes of light in sighted animals, will start to fire to auditory or tactile inputs in visually-deprived animals. When we do that in humans, we use neuroimaging techniques. In our study, we focused on a technique which we call "magnetoencephalography", a technique that records the very faint magnetic fields that are produced when neurons process information. And this technique has the very beautiful advantage of combining high spatial and high temporal resolution, so with this technique we can basically address how the brain processes information in space and time.
Chris - So talk us through the design of the experiments you did then. Who did you look at and how did you study them?
Olivier - So we involved early blind people and a group of sighted controls. They listened to short stories while we recorded their brain activity with magnetoencephalography. We correlated the changes in brain activity with what we call the envelope of speech, which is the change in auditory energy across time. And when these two things correlate, it's basically proof that these regions participate in the sensory processing of the language input.
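To make that analysis concrete: speech-envelope tracking of this kind typically extracts the slow amplitude fluctuations of the audio and correlates them with the recorded neural signal. The sketch below is not the study's actual pipeline - it's a minimal illustration using a Hilbert-transform envelope and a Pearson correlation on synthetic data, and all signal names, sampling rates, and parameters here are assumptions made for the example.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def speech_envelope(audio, fs, cutoff_hz=8.0):
    """Amplitude envelope of a speech-like signal: magnitude of the
    analytic signal, low-pass filtered to keep slow (syllable-rate)
    fluctuations. cutoff_hz is an assumed, illustrative value."""
    env = np.abs(hilbert(audio))
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, env)

# --- toy example with synthetic data (not real MEG) ---
fs = 200                          # assumed shared sampling rate, Hz
t = np.arange(0, 10, 1 / fs)      # 10 seconds of "recording"
# Noise carrier modulated at ~3 Hz, a crude stand-in for speech rhythm
audio = np.random.randn(t.size) * (1 + np.sin(2 * np.pi * 3 * t))
env = speech_envelope(audio, fs)

# A hypothetical occipital sensor that partly tracks the envelope
meg = 0.5 * env + np.random.randn(t.size)

# Envelope-tracking strength as a simple Pearson correlation
r = np.corrcoef(env, meg)[0, 1]
print(f"envelope-MEG correlation: r = {r:.2f}")
```

In practice, analyses of this kind also account for the lag between sound and brain response and test the correlation against chance, but the core idea - an audio envelope correlated with a neural time series - is as above.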
Chris - So this means that when you record from the "seeing" parts of the brain, you can show that there is also sympathetic firing there which follows, or is entrained to, the rhythmic pattern of the speech they're listening to. And is that present in both sighted and blind people, or exclusively in the early blind people?
Olivier - So what we see - it's an excellent question - is that it's enhanced in the blind. It's not that it's not present at all in sighted people, but it's stronger in the blind.
Chris - And did the sighted controls have their eyes closed when you did this study to make sure they weren't being distracted by visual stimuli which might mask the speech effect?
Olivier - Right. So both groups were actually blindfolded during the experiment. We do that with the blind people too, to put them in exactly the same conditions as the control group.
Chris - And your observation suggests that there's some kind of projection from the regions of the brain which decode specifically speech and the pattern of speech, and those projections relay that activity onto the visual areas. Why do you think they would want to do that?
Olivier - This is an excellent question. Actually, previous studies, notably from our lab but also from different groups of researchers, have already shown that the occipital cortex participates in non-visual processing in early blind people. I'll give you a simple example: in the occipital cortex of sighted people you have a region called V5. It participates in visual motion processing. What we see is that in blind people this region becomes specifically dedicated to auditory or tactile motion processing - as if the region maintained its function but switched inputs. But in the case of speech or language it is actually, at first, more puzzling why language would remap to the occipital cortex, right? What is the link between vision and speech processing? And we believe the link is that when we process speech information, sighted people actually constantly integrate visual and auditory information - even if we don't realise it, we lip read. Therefore we believe that there is some kind of intrinsic or privileged connection between the visual and auditory cortex to integrate the visual and auditory parts of speech processing.
Chris - Do you think that what you found here is what underpins phenomena like the McGurk Effect, where you play someone one sound and show them someone saying a different sound and they hear a third sound? Could this be why?
Olivier - So this is a very good point. Actually, the McGurk Effect very beautifully illustrates that our speech, or language, is a multi-sensory percept. And, obviously, for this phenomenon to arise in the brain it means that we need to connect auditory and visual inputs for this integration. And what we believe is that it is exactly these connections between auditory and visual brain centres that are recycled in blind people to basically expand the response properties of the occipital cortex toward sound processing and, in this case in particular, speech processing.