One of the great challenges of science is to better understand the complex functioning of the brain. The latest technologies for visualizing brain activity are making it possible to study how the brain processes Sign Languages, and the results are revealing insights that some researchers already see as driving a third revolution in neuroscience.
Your Brain Doesn't Care If You Speak or Sign
Scientists long believed that the brain areas responsible for speech were primarily Broca's area, for speech production, and Wernicke's area, for speech comprehension. However, recent research with Sign Language is showing that these areas are activated by both speech and Sign Language. That is, according to the findings of Karen Emmorey, a researcher at San Diego State University, these areas appear to be related to human communication in general, regardless of whether it is spoken or signed.
Similar results had already been obtained years earlier, in 2000, by Laura Ann Petitto, a researcher currently working at Gallaudet University, using positron emission tomography (PET), an imaging technology from nuclear medicine.
At the same time, Sign Language also engages the brain's visual processing areas and the motor areas that control the hands. Moreover, when Sign Language is used to express spatial relationships, areas related to spatial processing and body awareness are activated, something that does not happen with spoken language.
Sign Language Is Not Pantomime
In case there was any doubt about the linguistic nature of Sign Language, a 2011 study, also led by Karen Emmorey, examined the brain areas activated by pantomime. The researchers showed objects such as a broom or a hammer to Deaf and hearing people and asked them to mime how they would use them. They then asked the Deaf participants to express in Sign Language the verb associated with each object. The result was that, curiously, both Deaf and hearing people activated the same brain area when miming: the superior parietal cortex, associated with grasping, and not the language areas.
The most surprising finding, however, was that Deaf participants activated one brain area when they mimed and a different one when they produced Sign Language verbs, even when the two forms of expression could look nearly identical. Many Deaf signers are known to have strong pantomime skills, but if Sign Language were pantomime, miming would have activated the same brain area as signing, and it did not. The conclusion: as a linguistic system of communication, Sign Language has no relation to pantomime.
The Sign Language verbs in images A and B activate different brain areas than the mime of eating with a fork in image C (photos: Emmorey et al., 2011).
How Sign Language Helps to Understand the Brain
This research is important for two reasons: first, because it tells us that Broca's and Wernicke's areas do not merely process speech and hearing but handle language processing in general, that is, the human ability to communicate, whether in spoken or signed languages. And, secondly, because it confirms that Sign Languages are as complex as spoken languages, which strengthens their human value and their usefulness for anyone, not just deaf people.

The impact of this research has been considerable in general linguistics and neurolinguistics. It is no coincidence that the Spanish linguist Ángel Herrero, author of the first and only didactic grammar of Spanish Sign Language, recalls this anecdote:
Three years ago, at a congress held in Rome in which the aim was to compare Sign Languages and spoken languages from a linguistic point of view, the great Italian linguist Raffaele Simone, commenting on the intellectual impact that knowledge of Sign Languages had produced in him, declared with a sense of humour that until then he had considered himself a general linguist, and that he now understood that he was only a "half general linguist". This comment reflects exactly the feeling of linguists and the importance for general linguistics of the "discovery" of Sign Languages as an object of study (Ángel Herrero, 2007)
In What Language Does a Deaf Person Think?
When a deaf person thinks, do they think in Sign Language or in a spoken language? And in which language does a bilingual Deaf person, fluent in both Sign Language and one or more spoken languages, think? It is difficult to say, but there is some research on profoundly Deaf people who never learned Sign Language and know only one spoken language: their brains never develop an "inner voice" to help them process information, and as a result they have difficulty with complex and abstract tasks.
Sources:
- Emmorey, K., McCullough, S., Mehta, S., Ponto, L. L., & Grabowski, T. J. (2011). Sign language and pantomime production differentially engage frontal and parietal cortices. Language and cognitive processes, 26(7), 878-901.
- Emmorey, K., Mehta, S., & Grabowski, T. J. (2007). The neural correlates of sign versus word production. Neuroimage, 36(1), 202-208.
- EurekAlert! (2016, November 7). How human brains do language: 1 system, 2 channels. Retrieved from https://www.eurekalert.org/pub_releases/2016-11/nuco-hhb110616.php
- Hickok, G., Bellugi, U., & Klima, E. S. (2001). Sign language in the brain. Scientific American, 284(6), 58-65.
- Hiskey, D. (2010, July 20). How deaf people think. In Today I Found Out. Retrieved from http://www.todayifoundout.com/index.php/2010/07/how-deaf-people-think/
- Li, Q., & Xia, S. (2009, October). An fMRI study of Chinese sign language in functional cortex of prelingual deaf signers. In 2009 2nd International Congress on Image and Signal Processing (pp. 1-6). IEEE.
- MacSweeney, M., Woll, B., Campbell, R., McGuire, P. K., David, A. S., Williams, S. C., ... & Brammer, M. J. (2002). Neural systems underlying British Sign Language and audio‐visual English processing in native users. Brain, 125(7), 1583-1593. Retrieved from https://academic.oup.com/brain/article/125/7/1583/409327
- Moskowitz, C. (2010, February 26). Same brain spots handle sign language and speaking. In Live Science. Retrieved from http://www.livescience.com/10628-brain-spots-handle-sign-language-speaking.html
- Petitto, L. A., Zatorre, R. J., Gauna, K., Nikelski, E. J., Dostie, D., & Evans, A. C. (2000). Speech-like cerebral activity in profoundly deaf people processing signed languages: implications for the neural basis of human language. Proceedings of the National Academy of Sciences, 97(25), 13961-13966. Retrieved from https://www.pnas.org/content/97/25/13961.full
- Petitto, L. A. (2000). The acquisition of natural signed languages: Lessons in the nature of human language and its biological foundations. Language acquisition by eye, 41-50.
- Suri, S. (2014, July 26). What sign language teaches us about the brain. In The Epoch Times. Retrieved from https://www.theepochtimes.com/what-sign-language-teaches-us-about-the-brain_816273.html
- Szwed, M., Bola, Ł., & Zimmermann, M. (2017). Whether the hearing brain hears it or the deaf brain sees it, it’s just the same. Proceedings of the National Academy of Sciences, 114(31), 8135-8137. Retrieved from https://www.pnas.org/content/114/31/8135