Science Interviews

Sun, 10th Jan 2010

Language in the Deaf Brain

Dr Mairead McSweeney, Institute of Cognitive Science, UCL

From the show: The Science of Sound and Hearing

Chris -   This week, we're talking all about the science of hearing and sound, but hearing isn't just down to your ears.  The brain plays a crucial role, so now we're joined by Dr Mairead McSweeney, who's from the Institute of Cognitive Science at University College London.  That's where she works on how a deaf person's brain deals with language, whether that's sign language, lip reading or reading text, and she's with us now.  Hello, Mairead.

Mairead -   Hello.

Chris -   Tell us a little bit about the deaf person's brain.  How does it differ, or not, from that of someone who is normally hearing?

" alt="The alphabet of BSL fingerspelling." />Mairead -   Well to date, only a few studies have actually been done to look at the structure of deaf peopleís brains and perhaps surprisingly, these studies suggest very few, if any, anatomical differences between deaf and hearing brains.  So for example, we might wonder what happens to the part of the brain that processes sound in you and I, the auditory cortex, and might predict that this may be atrophied in people who were born deaf.  But that doesnít seem to be the case.  So, in people born deaf, their size of auditory cortex seems to be the same as in hearing people.  What itís actually doing is a different question I can perhaps tell you more about that if you're interested later.  And when we think about function and how the brain processes language, very similar neural systems seem to be recruited by deaf and hearing people.  So in hearing people processing spoken language and deaf people processing - in our case British Sign Language, which is a real language and is fully independent of spoken English  - then the same neural systems seem to be recruited.  And these are predominantly located in the left hemisphere of the brain.

Chris -   So the same bits of the brain that would be decoding language if you were listening to it are being used to decode language arising through other means of communication?

Mairead -   That's exactly right, yes.  So, we are comparing languages coming in in very different modalities - we've got auditory verbal speech, and then we've got visuo-spatial sign language - and the same systems seem to be recruited.  And of course, if we think about it, we as hearing people also deal with visual language.  You mentioned lip reading - we see people's faces when they speak - or we might be reading text, but these are all based on spoken language.  We have that auditory system that is involved in processing spoken language, and these visual derivatives are then built upon that system, whereas with sign language, of course, we're looking at something that doesn't have any auditory component.  So the fact that the same systems are used for spoken language and sign language is very interesting.  It tells us that what the brain is doing is saying there's something important about language that's recruiting these regions in the left hemisphere.

Chris -   I was just going to say, how does the brain know that this is language, that it has to present this information to the other bit of the brain whose job it is to decode language, and then parcel it out to the other bits of the brain that do other aspects of linguistic processing - working out what the verbs mean, what the nouns mean, what colours mean, and so on?

Mairead -   Well, that's the big question that we're working on really!  So it's one thing for us to say that this system in the left hemisphere - involving certain parts of the brain that have been identified for a long time as being involved in language processing, Broca's and Wernicke's areas - is also involved in sign language, but the next step for our research is really: what is it about language and the structure of language that is important for these regions?  What is it that is critical, and what is it that these regions can do in terms of symbolic processing, or whatever it might be, that is important for language processing?  So that's the next step for our research.

[Image: Approximate location of Broca's and Wernicke's areas highlighted in gray.]

Chris -   And you're doing this with brain scanning, so you take people who have had a hearing impairment for presumably different lengths of time - people who've been born deaf versus people who have acquired forms of deafness - who have learned alternative means of communication, and you look at how their brains respond to different stimuli?

Mairead -   Yeah.  We're using brain scanning, as you say.  So we use something called functional Magnetic Resonance Imaging (fMRI), where we can get an indirect measure of blood flow which tells us which parts of the brain are being used when we show people different stimuli, whether that be sign language or visual speech or written text.  But actually, we haven't yet compared people who were born deaf with people who become deaf later in life; most of our work is concerned just with people who are born profoundly deaf.  But looking at all of these different groups can address very important questions.  So looking at people who have become deaf later in life will be something we'll do in the future, because it all tells us about our critical question, which is how experience shapes the brain and how plastic the brain is in responding to changes in its environment.

Chris -   And what about people in whom the opposite side of the brain is the dominant one?  The majority of us are right handed, which means the left side of the brain is the dominant hemisphere and that's usually where language is.  If you look at people in whom that is reversed, do you also see the sign language and so on being shifted across as well?  Is there always this association between the language bits of the brain and the interpretation of things like sign language?

Mairead -   Well, that's a good question actually, and it's something we have just put in an application to get money to look at!  So actually, looking at sign language processing in deaf people in this way, there are maybe 20 studies that have been published in this area.  All have focused on people who are right handed, so that we have consistency across the people that we're looking at.  So in fact, there are no studies looking at deaf people who are left handed and the regions that they use in processing language, but that is something that we plan to do in the future.

Chris -   Brilliant.  Well, good luck with it, Mairead, and do join us when you discover how it is that the brain manages to puzzle out these different bits of information, knows that they're all about communication and, therefore, puts them into the right brain area.

Mairead -   Will do.

Chris -   Great to have you on the programme.  That's Mairead McSweeney, who is from UCL, University College London, explaining how a deaf person's brain can process sign language in a very similar way to how a hearing person's brain processes spoken language.

Comments


Your speculations raise a larger question: Can you think without language? Answer: Nope, at least not at the level humans are accustomed to. That's why deafness can have far more serious consequences than blindness, developmentally speaking. The blind suffer many hardships, not the least of which is the inability to read in the usual manner. But even those sightless from birth acquire language by ear without difficulty in infancy, and having done so lead relatively ordinary lives. A congenitally deaf child isn't so lucky: unless someone realizes very early that he's not talking because he can't hear, his grasp of communication may never progress beyond the rudiments.

The language of the deaf is a vast topic that has filled lots of books--one of the best is Seeing Voices: A Journey Into the World of the Deaf by Oliver Sacks (1989). All I can do in this venue is sketch out a few basic propositions:

The folks at issue here are both (a) profoundly and (b) prelingually deaf. If you don't become totally deaf until after you've acquired language, your problems are . . . well, not minor, but manageable. You think in whatever spoken language you've learned. Given some commonsense accommodation during schooling, you'll progress normally intellectually. Depending on circumstances you may be able to speak and lip-read.

About one child in a thousand, however, is born with no ability to hear whatsoever. Years ago such people were called deaf-mutes. Often they were considered retarded, and in a sense they were: they'd never learned language, a process that primes the pump for much later development. The critical age range seems to be 21 to 36 months. During this period children pick up the basics of language easily, and in so doing establish essential cognitive infrastructure. Later on it's far more difficult. If the congenitally deaf aren't diagnosed before they start school, they may face severe learning problems for the rest of their lives, even if in other respects their intelligence is normal.

The profoundly, prelingually deaf can and do acquire language; it's just gestural rather than verbal. The sign language most commonly used in the U.S. is American Sign Language, sometimes called Ameslan or just Sign. Those not conversant in Sign may suppose that it's an invented form of communication like Esperanto or Morse code. It's not. It's an independent natural language, evolved by ordinary people and transmitted culturally from one generation to the next. It bears no relationship to English and in some ways is more similar to Chinese--a single highly inflected gesture can convey an entire word or phrase. (Signed English, in which you'll sometimes see words spelled out one letter at a time, is a completely different animal.) Sign can be acquired effortlessly in early childhood--and by anyone, not just the deaf (e.g., hearing children of deaf parents). Those who do so use it as fluently as most Americans speak English. Sign equips native users with the ability to manipulate symbols, grasp abstractions, and actively acquire and process knowledge--in short, to think, in the full human sense of the term. Nonetheless, "oralists" have long insisted that the best way to educate the deaf is to teach them spoken language, sometimes going so far as to suppress signing. Sacks and many deaf folk think this has been a disaster for deaf people.

The answer to your question is now obvious. In what language do the profoundly deaf think? Why, in Sign (or the local equivalent), assuming they were fortunate enough to have learned it in infancy. The hearing can have only a general idea what this is like--the gulf between spoken and visual language is far greater than that between, say, English and Russian. Research suggests that the brain of a native deaf signer is organized differently from that of a hearing person. Still, sometimes we can get a glimpse. Sacks writes of a visit to the island of Martha's Vineyard, where hereditary deafness was endemic for more than 250 years and a community of signers, most of whom hear normally, still flourishes. He met a woman in her 90s who would sometimes slip into a reverie, her hands moving constantly. According to her daughter, she was thinking in Sign. "Even in sleep, I was further informed, the old lady might sketch fragmentary signs on the counterpane," Sacks writes. "She was dreaming in Sign."

BioWizard, Wed, 13th Apr 2011

I am deaf myself, and left-handed/ambidextrous. Both halves of the brain are involved in language. However, with sign language being less concise than spoken language, I would think that the right brain is a bit more involved. The left brain picks up the "text" of the language, spoken or otherwise. The right brain processes the context, which involves expressions, body language, intonations, etc. Bottom line, I think Deaf people are just a bit more right-brained than their hearing counterparts, but both sides are definitely involved. As for whether the lack of auditory input affects the location of the language centers, I don't think it makes much difference: if we lack working ears, our eyes simply take over the job, and the brain simply redirects more of that information to the language centers on both halves of the brain.

By the way, about the mention of "oralists": I was raised by one. My mother did manage to teach me spoken language, but discouraged sign language. I guess most of the time it's all right - at least until my hearing-aid breaks, or I meet a non-verbal peer and lose the chance to get to know each other just because of language barriers. When my hearing-aid breaks or I get ear infections, I curse my oralist upbringing because I have no fallback to work with. My lip-reading might be good, but it's not good enough. The end result is me in a box of silence and language barriers. My mum made a grave mistake that haunts me now in spells and might become a permanent problem, should I lose the rest of my hearing altogether. (While I am profoundly deaf in both ears, I still have enough in my left ear to function somewhat with a hearing-aid.)

Kharism, Tue, 18th Oct 2011
