Congenital deafness is often compensated for by early sign language use, leading to typical language development with corresponding neural underpinnings. However, deaf individuals are frequently reported to have poorer numerical abilities than hearing individuals, and it is not known whether the underlying neuronal networks differ between groups. In the present study, adult deaf signers and hearing nonsigners performed digit order and letter order tasks during functional magnetic resonance imaging. We found the neuronal networks recruited in the two tasks to be generally similar across groups, with significant activation in the dorsal visual stream for the letter order task, suggesting letter identification and position encoding. For the digit order task, no significant activation was found for either group. Region of interest analyses of parietal numerical processing regions revealed different patterns of activation across groups. Importantly, deaf signers showed significant activation in the right horizontal portion of the intraparietal sulcus for the digit order task, suggesting engagement of magnitude manipulation during numerical order processing in this group.
Numerical processing abilities are closely associated with mathematical success [
The study included 16 deaf adults (
Participant characteristics.
|  | Age |  |  | Sex | Education | Raven |  |  |
|---|---|---|---|---|---|---|---|---|
|  | M | SD | Range | Female/male | University | M | SD | Range |
| Deaf signers | 28.1 | 3.44 | 21–32 | 11/5 | 5 | 52.3 | 5.13 | 44–60 |
| Hearing nonsigners | 28.5 | 4.78 | 22–37 | 12/5 | 5 | 54.7 | 4.04 | 45–59 |
Fifteen of the deaf participants were deaf from birth and one from the age of six months. All reported using Swedish Sign Language (SSL) daily as their primary language. Six were exposed to SSL from birth and the others before the age of two.
The prevalence of congenital deafness is around 1 in 1000 live births, and only 5% of congenitally deaf children are born into signing families. Thus, deaf early signers constitute a very small population. Further, many deaf signers opt for cochlear implantation, which is a contraindication for fMRI. The group of deaf participants in the present study is similar in size to, or larger than, those in many other studies (cf. 11 deaf participants in Emmorey et al.’s study [
All participants gave written informed consent and were compensated for time and travel expenses. Approval was obtained from the regional ethical review board in Linköping, Sweden (Dnr 190/05).
Stimuli were identical across tasks and the control condition. They consisted of sets of three digit/letter pairs, e.g., V2 X5 U7. The pairs included the digits 0–9 and the letters B, D, E, G, H, K, L, M, O, P, Q, T, U, V, X, and Z, as well as the characters Å and Ö, which are listed at the end of the Swedish alphabet. There were 20 unique sets of pairs. Each pair was also reversed within each set, e.g., 2V 5X 7U, giving 40 unique stimuli. It is important to note that the digit/letter order within pairs was never mixed within a stimulus. Further, congruent (the same correct response for both tasks) and noncongruent (different correct responses for the two tasks) trials were balanced. Participants completed six different tasks, of which three are investigated in the present study. Those three tasks were digit order (“are the presented digits in an ascending numerical order?”), letter order (“are the presented letters in alphabetical order?”), and a visual control task (“are there two dots over any of the presented letters?”). Correct responses were 50% “yes” and 50% “no,” distributed orthogonally across conditions. Results from the three remaining tasks (multiplication, subtraction, and phonological similarity) are reported in two articles (Andin et al. [
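The stimulus-set logic described above can be illustrated with a short sketch (the specific letter–digit pairings below are generated at random and are not the actual study stimuli; function and variable names are our own):

```python
import random

# Character pools as described in the text.
LETTERS = list("BDEGHKLMOPQTUVXZ") + ["Å", "Ö"]
DIGITS = list("0123456789")

def make_sets(n_sets=20, seed=0):
    """Build hypothetical sets of three letter-digit pairs, e.g. [('V','2'), ...]."""
    rng = random.Random(seed)
    sets = []
    for _ in range(n_sets):
        letters = rng.sample(LETTERS, 3)
        digits = rng.sample(DIGITS, 3)
        sets.append(list(zip(letters, digits)))
    return sets

def all_stimuli(sets):
    """Each set yields a letter-first and a digit-first stimulus; order is never mixed."""
    stimuli = []
    for s in sets:
        stimuli.append(" ".join(l + d for l, d in s))  # e.g. "V2 X5 U7"
        stimuli.append(" ".join(d + l for l, d in s))  # e.g. "2V 5X 7U"
    return stimuli

stimuli = all_stimuli(make_sets())  # 20 sets x 2 orders = 40 unique stimuli
```

Reversing every pair within each set is what doubles the 20 sets to 40 stimuli while guaranteeing that digit-first and letter-first tokens never co-occur within a stimulus.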
All participants took part in a behavioural testing session at least one month prior to the fMRI session for task familiarization and to ensure compliance during scanning. Before entering the scanner, participants practiced the tasks again and were instructed to respond as accurately and as quickly as possible during the presentation of each trial by pressing one of two buttons with their right thumb and index finger. A professional accredited sign language interpreter provided deaf participants with a verbatim translation of the test instructions and remained on hand to relay questions and answers. Once participants were positioned in the scanner, the instructions were repeated, orally for hearing individuals and as text on the screen for deaf participants.
In the scanner, participants viewed the screen through an angled mirror on top of the head coil. Stimuli were presented using the Presentation software (Presentation version 10.2, Neurobehavioral Systems Inc., Albany, CA) and back projected onto a screen positioned at the feet of the participant. Each trial started with a 1000 ms period during which a cue displayed on the screen indicated which task was to be performed next. The cues were “1 2 3” for digit order, “a b c” for letter order, and “..” for the control task. After the cue, the stimulus was displayed for 4000 ms while the participant responded. Task presentation was blocked, and there were five trials per block. Thus, each block lasted for 25,000 ms. Between blocks, there was a 5000 ms break and a ¤ symbol was presented. Participants were instructed to move as little as possible. In total, there were 4 runs with 12 blocks in each. Of the 12 blocks, six blocks (two per condition) were considered in the present analysis.
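The trial and block arithmetic above can be made explicit in a few lines (a simple sketch; all durations in milliseconds and the constant names are our own):

```python
# Trial and block structure as described in the text.
CUE_MS = 1000            # task cue at the start of each trial
STIM_MS = 4000           # stimulus display / response window
TRIALS_PER_BLOCK = 5
BREAK_MS = 5000          # rest between blocks (a currency symbol was shown)
BLOCKS_PER_RUN = 12
N_RUNS = 4

trial_ms = CUE_MS + STIM_MS             # 5000 ms per trial
block_ms = TRIALS_PER_BLOCK * trial_ms  # 25,000 ms per block, as stated above
```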
Functional gradient-echo EPI images (
Data quality was checked using TSDiffAna (Freiburg Brain Imaging). As a result, the first run was discarded for three deaf participants and one hearing participant who had moved more than 3 mm in at least one direction. The remaining data were preprocessed and analysed using statistical parametric mapping software (SPM8; Wellcome Trust Centre for Neuroimaging, London, UK) running under MatLab r2010a (MathWorks Inc., Natick, MA, USA). Preprocessing included realignment, coregistration, normalization to the MNI152 template, and spatial smoothing using a 10 mm FWHM Gaussian kernel, following standard SPM8 procedures.
Blocks with more than two incorrect answers were discarded from the analysis (two letter order blocks and one digit order block were removed from the hearing group, and two letter order blocks were removed from the deaf group), because the response pattern in some cases suggested nonadherence to the task. Data from one hearing participant were removed due to artefacts probably caused by metallic hair dye. Thus, data from 16 participants in each group were included in the functional analysis. Analysis was conducted by fitting a general linear model (GLM) with regressors representing each of the two experimental conditions of interest here (digit order and letter order) and the visual control, as well as the six motion parameters derived from the realignment procedure. At the first-level analysis, contrast images consisting of digit order versus visual control and letter order versus visual control were defined individually for each participant. To investigate hypothesis 1, that there would be general similarities between groups for both the digit and the letter order task at the whole brain level, the contrast images from the first-level analysis were brought into second-level analyses where sample
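The logic of a block-design GLM with condition regressors and a contrast of betas can be illustrated on a simulated voxel time course (a toy sketch with NumPy, not the actual SPM8 pipeline; the block onsets and beta values below are invented for illustration):

```python
import numpy as np

# Hypothetical block onsets/offsets (in scans) for the three conditions.
n_scans = 120
conditions = {"digit_order": [(10, 20), (70, 80)],
              "letter_order": [(30, 40), (90, 100)],
              "visual_control": [(50, 60), (110, 120)]}

# Design matrix: one boxcar regressor per condition plus a constant term.
X = np.zeros((n_scans, len(conditions) + 1))
for j, blocks in enumerate(conditions.values()):
    for on, off in blocks:
        X[on:off, j] = 1.0
X[:, -1] = 1.0

# Simulate one voxel's signal with known betas plus Gaussian noise.
rng = np.random.default_rng(0)
beta_true = np.array([2.0, 1.0, 0.5, 10.0])
y = X @ beta_true + rng.normal(0.0, 0.1, n_scans)

# Ordinary least squares fit; a contrast is a weighted sum of the betas.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
digit_vs_control = np.array([1.0, 0.0, -1.0, 0.0]) @ beta_hat
```

At the first level, one such contrast value (or image) per participant is what gets carried forward into the second-level group analysis.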
Hypotheses 2–4 were investigated using separate region of interest (ROI) analyses for the left angular gyrus (lAG), the left and right superior parietal lobules (lSPL and rSPL), and the left and right horizontal portions of the intraparietal sulcus (lHIPS and rHIPS), using the MarsBar toolbox (release 0.44). The ROIs were defined in accordance with the probabilistic cytoarchitectonic maps from the SPM anatomy toolbox (version 1.8). To investigate hypothesis 2, that for the digit order task there would be activation in all five ROIs for both groups, group-level contrasts were obtained to determine significant activation within the five ROIs. Further, to investigate the final two hypotheses, that activation for the digit order task would be greater than that for the letter order task in HIPS (hypothesis 3) and that hearing nonsigners would show greater activation in the lAG compared to deaf signers for both digit order and letter order (hypothesis 4), individual contrast values from the HIPS and lAG of the ROI analysis were extracted for further statistical analyses. These analyses were carried out as a
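Extracting a per-participant contrast value from an ROI amounts to averaging the contrast image over the voxels in a mask (a minimal NumPy sketch; the study used MarsBar with cytoarchitectonic ROIs, whereas here a synthetic contrast image and a hypothetical spherical ROI stand in for both):

```python
import numpy as np

# Synthetic contrast image with a cubic patch of "activation".
contrast_img = np.zeros((10, 10, 10))
contrast_img[4:7, 4:7, 4:7] = 1.0

# Hypothetical spherical ROI mask around a chosen voxel coordinate.
center, radius = np.array([5, 5, 5]), 2.0
grid = np.stack(np.meshgrid(*[np.arange(10)] * 3, indexing="ij"), axis=-1)
roi_mask = np.linalg.norm(grid - center, axis=-1) <= radius

# One number per participant and contrast, carried into the group statistics.
roi_mean = contrast_img[roi_mask].mean()
```

These ROI means are the individual contrast values that enter the subsequent ANOVA-style analyses of hypotheses 3 and 4.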
Behavioural data are shown in Table
Behavioural in-scanner data.
|  | Response time (ms) |  |  |  | Accuracy (% correct) |  |  |  |
|---|---|---|---|---|---|---|---|---|
|  | Deaf signers |  | Hearing nonsigners |  | Deaf signers |  | Hearing nonsigners |  |
|  | M | SD | M | SD | M | SD | M | SD |
| Digit order | 1612 | 192 | 1603 | 314 | 96.0 | 6.87 | 98.6 | 1.86 |
| Letter order | 2345 | 327 | 2342 | 336 | 91.6 | 5.13 | 91.2 | 3.63 |
As with the response time data, accuracy data revealed a main effect of task (
The results of the whole brain analysis to test hypothesis 1 are shown in Table
Whole brain analysis. Activation foci for each contrast versus visual control for the two groups separately and combined. All peaks with significant activation are listed (
| Group | Task | Cluster size | Cluster p | Peak value | Peak p | x | y | z | Brain region of the peak |
|---|---|---|---|---|---|---|---|---|---|
| Deaf signers | Letter order | 89 | 0.016 | 5.19 | 0.342 | 23 | −72 | 44 | r. superior occipital gyrus |
| Hearing nonsigners | Letter order | 42 | <0.001 | 9.10 | 0.004 | 16 | −72 | 54 | r. superior parietal lobule |
|  |  |  |  | 8.01 | 0.016 | 30 | −72 | 39 | r. middle occipital gyrus |
|  |  | 18 | <0.001 | 8.90 | 0.005 | 30 | 9 | 59 | r. middle frontal gyrus |
|  |  | 14 | <0.001 | 8.89 | 0.005 | −12 | −75 | 49 | l. precuneus |
|  |  |  |  | 7.30 | 0.043 | −26 | −68 | 54 | l. superior parietal lobule |
|  |  | 6 | 0.001 | 8.37 | 0.010 | −30 | −79 | 34 | l. middle occipital gyrus |
|  |  | 4 | 0.003 | 7.81 | 0.022 | −54 | −5 | 44 | l. postcentral gyrus |
| Both groups combined | Digit order | 5 | 0.006 | 6.01 | 0.010 | 23 | −61 | −36 | r. cerebellum |
| Both groups combined | Letter order | 137 | <0.001 | 8.06 | <0.001 | 23 | −72 | 44 | r. superior occipital gyrus |
|  |  |  |  | 7.87 | <0.001 | 34 | −68 | 34 | r. middle occipital gyrus |
|  |  | 99 | <0.001 | 7.65 | <0.001 | −26 | −72 | 29 | l. middle occipital gyrus |
|  |  |  |  | 7.20 | 0.001 | −23 | −68 | 49 | l. superior parietal lobule |
|  |  | 31 | <0.001 | 7.40 | <0.001 | 27 | 6 | 54 | r. superior frontal gyrus |
|  |  | 77 | <0.001 | 7.14 | 0.001 | −5 | 2 | 59 | l. SMA |
|  |  |  |  | 6.93 | 0.001 | −16 | 2 | 59 | l. superior frontal gyrus |
|  |  |  |  | 6.69 | 0.002 | −5 | 16 | 44 | l. SMA |
|  |  | 18 | <0.001 | 6.28 | 0.006 | 44 | −40 | 44 | r. inferior parietal sulcus |
|  |  | 2 | 0.015 | 5.58 | 0.030 | −44 | 2 | 29 | l. precentral gyrus |
|  |  | 1 | 0.023 | 5.39 | 0.045 | −44 | −47 | 44 | l. inferior parietal sulcus |
Because the groups showed similar activation for both tasks, combining them might reveal additional information otherwise obscured by the small sample sizes. Therefore, the activation patterns for the two tasks were further investigated by collapsing across groups (cf. Mayer et al. [
Activation pattern for digit order (red) and letter order (blue) in (a) left and (b) right hemispheres for both groups combined. Images are thresholded at a FWE-corrected
To investigate our region-specific hypotheses, we analysed the variations in brain activity associated with the digit and letter order tasks in the two groups within the bilateral superior parietal lobule (SPL) and horizontal portion of the intraparietal sulcus (HIPS) as well as the left angular gyrus (lAG). In line with the second hypothesis, the digit order task significantly activated the rHIPS for deaf signers (
Regarding our third hypothesis, we performed a mixed design
Finally, we did not find support for our fourth hypothesis of a significant difference between groups on the digit order task within the lAG (
The main purpose of the present study was to investigate neuronal networks for order processing in deaf and hearing individuals. We predicted general similarities across groups for both the digit and letter order tasks, with some language modality-specific activation. Specifically, we hypothesized (1) generally similar activation across groups at the whole brain level, (2) significant activation for the digit order task in regions of interest in the parietal cortex across groups, (3) significantly greater activation for digit order compared to letter order in HIPS, reflecting magnitude specificity, and (4) significant group differences in the lAG for the digit order task, reflecting differential engagement of linguistic representations across groups. The results showed that there were general similarities across groups in relation to both task and regions but that none of our hypotheses was fully supported. Overall, and in line with previous studies [
We predicted that the digit order task would activate bilateral parietal regions including the superior parietal cortex and the horizontal portion of the intraparietal sulcus in both groups. However, in the whole brain analysis, there was no evidence for either group of a general activation in bilateral parietal regions that has previously been attested for numerical ordering tasks. In fact, the digit order task versus visual control only elicited significant activation in the right cerebellum, when collapsed over groups. A meta-analysis conducted by Arsalidou and Taylor [
In the analysis performed on the five regions of interest based on our hypotheses, the only region to be significantly activated for the digit order task was the rHIPS for deaf signers. For hearing nonsigners, no significant activation was found. Hence, for the hearing group, we found no support for the notion of magnitude manipulation specificity of the horizontal portion of the intraparietal sulcus. Several studies [
In both the whole brain analysis and the ROI analyses, we found significant activation across groups for the letter order task within the visual processing system which not only included occipital regions but also extended into the parietal and frontal regions of the dorsal stream. This is in line with recent work showing more activation for letters than numbers in the left inferior and superior parietal gyri as well as a preferential role of the parietal cortex for letter identity and letter position encoding [
Interestingly, in the ROI analyses, the lSPL was significantly activated in hearing nonsigners, corroborating findings from previous studies suggesting the lSPL to have a central role in letter positioning [
Finally, we predicted that differential engagement of linguistic representations by deaf signers and hearing nonsigners during the digit order task would be reflected in differences in the activation of the lAG. Surprisingly, there was no significant activation in the lAG for either group and no significant differences in activation of this region related to either task or group. This region has been shown to be involved in verbal number processing, such as multiplication and simple subtraction [
The main finding of the present study is that there are similarities in the recruitment of neuronal networks during order processing in deaf signers and hearing nonsigners. The digit order task showed relatively little activation across groups, possibly relating to the simplicity of the task. However, recruitment of the rHIPS in deaf signers only for this task suggests that, compared to hearing nonsigners, this group makes use of qualitatively different processes, such as magnitude manipulation, for number order processing. Extensive activation of the dorsal stream relating to the letter order task indicates a prominent role for letter identification and position encoding. This finding prompts further investigation of the effects of deafness and sign language use on the neural networks underpinning core arithmetic processes.
The data used to support the findings of this study are available from the corresponding author upon request.
There are no conflicts of interest to report.
The work was supported by funding from the Swedish Research Council (grant number 2005-1353). The authors would like to thank Shahram Moradi for technical assistance, Örjan Dahlström for statistical advice, and Lena Davidsson for sign language interpretation.