Sign Language And The Brain Psychology Essay
Sign language is a natural human language that employs visual and spatial gestures to express meaning and communicate information. Like spoken languages, which rely on aural-oral processing, signed languages are complex linguistic systems with fully developed grammatical components, including organisation at the levels of phonology, morphology, syntax and semantics (Hickok, 1996). Most previous research on language processing has been conducted on aural-oral languages; the study of signed languages can therefore help illuminate the neural underpinnings of language in the brain, independent of the modality of the language itself.
Cerebral activity in sign language
The identification of areas of cerebral activation during the production and comprehension of sign language has been the main focus of recent neurobiological sign language research. Studies involving neuroimaging and brain lesion analyses have highlighted similarities between activations in both spoken and signed languages, including the phenomenon of lateralisation and increased activity in specific language processing centres. On the other hand, brain regions specific to sign language processing have also been identified in several studies.
Left hemisphere lateralisation is generally seen in the spoken language processing of right-hand-dominant individuals, with the right hemisphere more involved in the prosodic aspects of speech and in the processing of visual-spatial information (Springer, 1999). In sign language, left hemisphere dominance is also observed despite the language's more spatial nature. Early brain lesion studies showed that left hemisphere damage to the frontal lobe led to sign production impairments similar to Broca's aphasia, while damage to the left temporal cortex resulted in language comprehension deficits. Right hemisphere lesions, on the other hand, did not produce any aphasic symptoms (Hickok, 1996; Marshall, 2004).
Recent studies using neuroimaging techniques such as fMRI and PET have similarly shown left lateralisation in sign language perception, but also increased right hemisphere activity compared with spoken languages. In a landmark paper, Neville (1999) used fMRI to compare brain activity as signers were shown sentences in American Sign Language and spoken language users were shown written sentences. In both cases, the left hemisphere perisylvian language areas were strongly recruited; however, only in sign language were the corresponding regions of the right hemisphere also activated (Neville, 1999).
To address this discrepancy between the lesion and neuroimaging studies, Capek (2009) proposed that syntactic and semantic processing involve separate neural pathways in both signed and spoken languages. On the basis of an ERP study, he concluded that semantic processing is similar in timing and site of activation in both language modalities. This points to neural systems that support core natural language processing independently of modality. However, the study also found differences in syntactic processing: a larger response occurred in the right hemisphere during signed verb-agreement tasks than during spoken ones. This supports the hypothesis that, in the case of grammar, the modality of a language may influence its organisation in the brain. The use of space to convey grammar in signed languages recruits the right hemisphere for spatial processing, as seen in the neuroimaging studies (Capek, 2009).
Comparison between signed and spoken languages
Cerebral activity in the left inferior frontal gyrus and the posterior portion of the superior temporal gyrus is common to signed and spoken languages, as seen in Figure 1 (Petitto, 2000; MacSweeney, 2002). The left inferior frontal gyrus is involved in the production of both spoken and signed languages, while the superior temporal gyrus, including Wernicke's area, is important for comprehension (MacSweeney, 2002). These areas are therefore thought to support basic language processing, independent of the form the language takes.
Figure 1. fMRI image of cerebral activity of deaf signers and hearing speakers when presented with British Sign Language or spoken language. The left inferior frontal cortex and left superior temporal gyrus show highest activity. (Image from MacSweeney, 2004)
On the other hand, differences between the two modalities of language have also been observed. In particular, involvement of the superior parietal lobe and the supramarginal gyrus has been recorded only in sign language production (Emmorey, 2007). The superior parietal lobe is believed to process proprioceptive input during the motor production of signs. This allows signers to monitor their sign production in the same way that speakers monitor their spoken language through auditory input as they speak (Emmorey, 2007). The supramarginal gyrus, in turn, may be involved in the phonological aspects of sign language production. Whereas phonemes in spoken language are differentiated by different speech sounds, phonological production in sign language involves variations in hand shape and the spatial location of the sign. Stimulation of the left supramarginal gyrus has been shown to increase phonological errors in sign language production, such as the use of incorrect hand shapes (Corina, 1999).
Gestures versus sign language
There is a common misconception that sign language is simply 'miming'. This is untrue given the linguistic complexities of sign languages, and also given recent lesion and neuroimaging studies showing that the processing of gestures and sign languages is overlapping but distinct. In an fMRI study, MacSweeney (2004) compared cerebral activations as participants were asked to observe an individual using 1) British Sign Language (BSL) and 2) TicTac, a set of gestures used by racecourse bookmakers to communicate. The results showed similar activations in the processing of both BSL and TicTac; however, there was increased recruitment of the classical language processing areas during BSL observation. These areas included the inferior frontal lobe, superior temporal lobe and supramarginal gyrus (MacSweeney, 2004).
Development and critical period
The concept of a critical period in the development of certain biological and behavioural processes is held by many researchers and is believed to be correlated with brain plasticity (Mayberry, 2003). In particular, language acquisition is thought to be constrained to a specific time period in which language exposure is required for linguistic learning to occur. Debate over this theory is informed by studies of later-life sign language proficiency in deaf children born to hearing parents, as such individuals often lack complete linguistic input during early childhood (Mayberry, 2003). Research has found that deaf adults without early childhood exposure (non-native signers) performed significantly worse on grammatical tasks than adults who had such exposure (native signers) (Mayberry, 2002). This suggests the existence of a critical developmental period requiring neocortical specialisation and growth, similar to other biological processes such as the development of binocular vision (Mayberry, 2003).
The effect of a lack of early language experience on neural structures is seen at the inferior frontal gyrus (MacSweeney, 2008). This area shows greater activation during phonological tasks in signers who lacked early exposure than in native signers (MacSweeney, 2008).
Keywords: Sign language, lesion studies, inferior frontal cortex, lateralisation, superior temporal gyrus, superior parietal lobe, supramarginal gyrus, critical period, spoken language, fMRI.