Using cochlear implants to cure deafness
Hearing and understanding speech is something many of us take for granted, but hearing loss affects huge numbers of people: it’s estimated that 1 in 6 people in the UK have some form of hearing loss or deafness. One cause is damage to the inner ear, or cochlea, where sound waves are converted into nerve impulses that the brain can then understand. For some people a device called a cochlear implant - or, as its Australian inventors dubbed it, the “bionic ear” - can be used to do the same job, but wearers often initially struggle to understand speech. Scientists have been looking into why, and they’ve come up with some strategies to help, as Tom O’Hanlon has been hearing, first from Bob Carlyon at the MRC Cognition and Brain Sciences Unit in Cambridge.
Bob - It looks a bit like a hearing aid in the sense that you have a thing behind your ear which has got a microphone in it. This microphone sends the sounds to a little processor worn behind the ear which breaks sounds down into individual frequency bands, and it then transmits that information across the skin, using a little radio frequency transmitter, to a receiver stimulator underneath the skin. That then sends that pattern of electrical stimulation to an array of electrodes inserted inside the inner ear, and those electrodes stimulate the auditory nerve directly, bypassing the damaged receptor cells which have caused the patient to become deaf in the first place.
Tom - When I’m speaking now, my voice is actually composed of many tiny vibrations happening at different rates or frequencies. You can seamlessly decode these with the helpful duo of the inner ear and the brain. So what’s it like when suddenly an implant does the job of the inner ear? Mel Jewett, ambassador for the National Cochlear Implant Users Association, is one of half a million people worldwide to have this implant and she took me through her experience…
Mel - When I try and describe what it’s like to hear with a cochlear implant, one way that makes sense to me is that the natural hearing that we have is like an acoustic guitar, but hearing with an implant is like an electric guitar. It does take a lot of getting used to but, over time, it does become natural. My dad’s voice now sounds like how I remember my dad’s voice. My mum’s like my mum’s.
Tom - So while we have the luxury of an acoustic guitar world, rich in sound and meaning, for cochlear implant users there’s a tricky transition - an electric guitar world in which speech is much harder to understand. To get an idea of what this might be like, have a listen to this…
Tom - Could you make sense of that? I certainly struggled.
Matt - The speech is hard to understand, but if you give people hints and clues about what they’re hearing it starts to sound a lot clearer.
Tom - This is Matt Davies, also at the Cognition and Brain Sciences Unit. His research looks at how our brains understand challenging speech…
Matt - So, if I tell you that the sentence was “the man read the newspaper at lunchtime” and then play it again…
Tom - I got most of that that time, actually.
Matt - It sounds strikingly clearer! This is an illustration of something that’s long been known about speech perception, and perception in general. When you’re perceiving something, you’re not only processing the external sensory signals - the sounds, in this case. You’re also using your prior knowledge of the world, and of the messages and information that you’re expecting, to change the way in which you perceive something. So that’s a very striking effect here: when you know what's being said, the speech sounds a lot clearer.
Tom - Do we know what’s going on in people’s brains when they’re unpicking this challenging speech?
Matt - That’s something that we’ve been studying a lot in the last year or two. We’ve been using two different forms of brain imaging to look inside the brain and see what activity is going on when someone listens to degraded speech, just like the examples that I’ve played to you. The theory that we’ve been developing to explain our observations is based on an idea called “predictive coding.” You’ve probably encountered “predictive texting” on your mobile phone; in a crude kind of way, that’s a model for what the brain is doing. The brain is continuously trying to predict the sensory signals that it’s receiving and, when it processes sounds, it does so in a way that’s guided and informed by those predictions. When you know what’s about to be said you have very accurate predictions, and that’s what makes perceiving the degraded speech easier: your predictions become more accurate; they’re closer to the sounds that you’re hearing.
Tom - Matt and colleagues saw a reduction in brain activity when people in the study knew what they should be hearing compared to hearing degraded speech without subtitles. The same kind of brain response was seen with longer term learning and adapting…
Matt - What we found is really very interesting. It shows that, once again, it’s the brain predicting the sounds it’s going to hear that seems to be involved in that tuning-in process. So, people who start off an experiment finding degraded speech very difficult to understand get better and better at understanding that speech with minutes or hours of training, and part of what’s making it better is that they’ve improved their predictions of what that degraded speech will sound like. And that’s a very useful thing to learn, because it allows you not only to understand a particular sentence but also to understand other sentences and other speech sounds that you might never have heard in that degraded form. I think it’s very similar to what’s going on for someone with a cochlear implant.
Tom - So by listening to degraded or challenging speech with subtitles, you then get better at understanding degraded speech more generally?
Matt - That’s absolutely right. I was reminded of this when I watched the American TV series “The Wire.” The characters all have a very strong Baltimore accent which I found, initially, very hard to understand. Switching on the subtitles helped me understand what the characters were saying, but it also helped me tune into that unfamiliar accent. So by the time I’d watched two or three programmes I could actually do much better without the subtitles on. Having that extra support that you get from subtitles doesn’t just help you with immediate understanding, it also helps with learning.
Tom - Matt hopes that using these techniques may help implant users adapt to their new hearing more quickly.
Matt - What our research suggests is that, during that period of adjusting and adapting, it will be helpful for those listeners to watch the TV with the subtitles on, or to listen to talking books whilst reading the text of the book. That extra support doesn’t only help them in their immediate understanding, but will also help them to tune in, to adjust to the sensations of sound that they receive through their implant.