Spatial modulation of motor-sensory recalibration in early deaf individuals.
ABSTRACT: Audition dominates the other senses in temporal processing, and in the absence of auditory cues, temporal perception can be compromised. Moreover, after auditory deprivation, visual attention is selectively enhanced for peripheral visual stimuli. In this study, we assessed whether early hearing loss affects motor-sensory recalibration, the ability to adjust the perceived timing of an action and its sensory effect based on recent experience. Early deaf participants and hearing controls were asked to discriminate the temporal order of a motor action (a keypress) and a visual stimulus (a white circle) before and after adaptation to a delay between the two events. To examine the effects of spatial modulation, we presented visual stimuli in both the central and the peripheral visual field. Results showed overall higher temporal just noticeable differences (JNDs) for deaf participants than for hearing controls, suggesting that auditory information is important for the calibration of motor-sensory timing. Adaptation to a motor-sensory delay induced distinct effects in the two groups: hearing controls showed a recalibration effect only for central stimuli, whereas deaf individuals showed it only for peripheral stimuli. Our results suggest that auditory deprivation affects motor-sensory recalibration and that the mechanism underlying it is susceptible to spatial modulation.
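In temporal order judgment paradigms of this kind, the JND and the recalibration shift (the change in the point of subjective simultaneity, PSS) are typically estimated by fitting a psychometric function to the response proportions. A minimal sketch on hypothetical data, assuming a cumulative Gaussian model (the SOA values and proportions below are illustrative, not from the study):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pss, sigma):
    """Probability of a 'visual stimulus came second' response as a function
    of stimulus onset asynchrony (SOA, ms; positive = stimulus after keypress)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Hypothetical data: tested SOAs and proportion of "stimulus second" responses
soas = np.array([-240, -120, -60, 0, 60, 120, 240], dtype=float)
p_second = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.98])

(pss, sigma), _ = curve_fit(cum_gauss, soas, p_second, p0=(0.0, 100.0))

# PSS: the SOA at which the two events appear simultaneous (50% point).
# JND: one common convention takes the 50%-to-84% distance, i.e. one sigma.
jnd = sigma
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

A recalibration effect would then show up as a shift in the fitted PSS between the pre- and post-adaptation fits, and a JND difference between groups as a difference in the fitted slope.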
Project description: Sensory cortices of individuals who are congenitally deprived of a sense can exhibit considerable plasticity and be recruited to process information from the senses that remain intact. Here, we explored whether the auditory cortex of congenitally deaf individuals represents the visual field location of a stimulus, a dimension that is represented in early visual areas. We used functional MRI to measure neural activity in auditory and visual cortices of congenitally deaf and hearing humans while they observed stimuli typically used for mapping visual field preferences in visual cortex. We found that the location of a visual stimulus can be successfully decoded from the patterns of neural activity in the auditory cortex of congenitally deaf but not hearing individuals. This is particularly true for locations within the horizontal plane and within peripheral vision. These data show that the representations stored within neuroplastically changed auditory cortex can align with dimensions that are typically represented in visual cortex.
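Decoding of this kind is commonly done with a cross-validated linear classifier applied to multivoxel activity patterns. A minimal sketch on simulated data (this is not the study's pipeline; the trial counts, voxel counts, and effect size are hypothetical), assuming scikit-learn is available:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical data: one activity pattern (n_voxels) per trial, labelled
# by visual field location (0 = left periphery, 1 = right periphery).
rng = np.random.default_rng(2)
n_trials, n_voxels = 80, 200
labels = np.repeat([0, 1], n_trials // 2)
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1, :20] += 0.5  # weak location information in 20 voxels

# Cross-validated accuracy above the 0.5 chance level indicates that the
# patterns carry information about stimulus location.
clf = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(clf, patterns, labels, cv=5).mean()
print(f"cross-validated decoding accuracy = {accuracy:.2f}")
```

In an fMRI analysis, `patterns` would hold per-trial voxel responses from an auditory cortex region of interest, and chance level would usually be verified with a permutation test rather than assumed.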
Project description: Perception of synchrony between one's own action (a finger tap) and the sensory feedback thereof (a visual flash or an auditory pip) can be recalibrated after exposure to an artificially inserted delay between them (temporal recalibration effect: TRE). TRE might be mediated by a compensatory shift of motor timing (when did I tap?) and/or of the sensory timing of the feedback (when did I hear/see the feedback?). To examine this, we asked participants to voluntarily tap their index finger at a constant pace while receiving visual or auditory feedback (a flash or pip) that was either synchronized with or somewhat delayed relative to the tap. Following this exposure phase, participants performed a simple reaction time (RT) task to measure the sensory timing of the exposure stimulus, and a sensorimotor synchronization (SMS) task (tapping in synchrony with a flash or pip as pacing stimulus) to measure the point of subjective synchrony between the tap and the pacing stimulus. The results showed that after exposure to delayed auditory feedback, participants tapped earlier (~21.5 ms) relative to auditory pacing stimuli (= temporal recalibration) and reacted faster (~5.6 ms) to auditory stimuli. For visual exposure and test stimuli, there were no such compensatory effects. These results indicate that adjustments of audio-motor synchrony can to some extent be explained by a change in the speed of auditory sensory processing. We discuss this in terms of an attentional modulation of sensory processing.
Project description: Psychophysical and neuroimaging studies in both animal and human subjects have clearly demonstrated that cortical plasticity following sensory deprivation leads to a functional brain reorganization that favors the spared modalities. In postlingually deaf patients, the use of a cochlear implant (CI) allows a recovery of auditory function, which is likely to counteract the cortical crossmodal reorganization induced by hearing loss. To study the dynamics of such reversed crossmodal plasticity, we designed a longitudinal neuroimaging study following 10 postlingually deaf adult CI users engaged in a visual speechreading task. While speechreading activates Broca's area in normally hearing subjects (NHS), the activity level elicited in this region in CI patients is abnormally low and increases progressively with post-implantation time. Furthermore, speechreading in CI patients induces abnormal crossmodal activations in right anterior regions of the superior temporal cortex normally devoted to processing human voice stimuli (temporal voice-sensitive areas, TVA). These abnormal activity levels diminish with post-implantation time and tend towards the levels observed in NHS. First, our study revealed that neuroplasticity after cochlear implantation involves not only auditory but also visual and audiovisual speech processing networks. Second, our results suggest that during deafness, the functional links between cortical regions specialized in face and voice processing are reallocated to support speech-related visual processing through cross-modal reorganization. Such reorganization allows a more efficient audiovisual integration of speech after cochlear implantation. These compensatory sensory strategies are later complemented by the progressive restoration of the visuo-audio-motor speech processing loop, including Broca's area.
Project description: Visual stimuli are known to activate the auditory cortex of deaf people, presenting evidence of cross-modal plasticity. However, the mechanisms underlying such plasticity are poorly understood. In this functional MRI study, we presented two types of visual stimuli, language stimuli (words, sign language, and lip-reading) and a general stimulus (a checkerboard), to investigate neural reorganization in the superior temporal cortex (STC) of deaf subjects and hearing controls. We found that all visual stimuli activated the STC in the deaf subjects only. The cross-modal activation induced by the checkerboard was mainly due to a sensory component via a feed-forward pathway from the thalamus and primary visual cortex, and was positively correlated with the duration of deafness, indicating a consequence of pure sensory deprivation. In contrast, the STC activity evoked by language stimuli was functionally connected to both the visual cortex and frontotemporal areas, and was highly correlated with the learning of sign language, suggesting a strong language component via a possible feedback modulation. While the sensory component exhibited specificity to features of a visual stimulus (e.g., selectivity for the form of words, bodies, or faces) and the language (semantic) component appeared to recruit a common frontotemporal neural network, the two components converged on the STC and caused plasticity with different multivoxel activity patterns. In summary, the present study showed plausible neural pathways for auditory reorganization, demonstrated correlations between activation of the reorganized cortical areas and developmental factors, and provides unique evidence towards understanding the neural circuits involved in cross-modal plasticity.
Project description: Sensory substitution is a promising therapeutic approach for replacing a missing or diseased sensory organ by translating inaccessible information into another sensory modality. However, many substitution systems are not well accepted by subjects. To explore the effect of sensory substitution on voluntary action repertoires and their associated affective valence, we study deaf songbirds to which we provide visual feedback as a substitute for auditory feedback. Surprisingly, deaf birds respond appetitively to song-contingent binary visual stimuli. They skillfully adapt their songs to increase the rate of visual stimuli, showing that auditory feedback is not required for making targeted changes to vocal repertoires. We find that visually instructed song learning is basal-ganglia dependent. Because hearing birds respond aversively to the same visual stimuli, sensory substitution reveals a preference for actions that elicit sensory feedback over actions that do not, suggesting that substitution systems should be designed to exploit the drive to manipulate.
Project description: The neurodevelopmental consequences of deafness for the functional neuroarchitecture of the conceptual system have so far received little systematic investigation. Using functional magnetic resonance imaging (fMRI), we therefore identified brain areas involved in conceptual processing in deaf and hearing participants. Conceptual processing was probed with a pictorial animacy decision task. Furthermore, brain areas sensitive to observing verbal signs and to observing non-verbal visual hand actions were identified in deaf participants. In hearing participants, brain areas responsive to environmental sounds and to the observation of visual hand actions were determined. We found a stronger recruitment of superior and middle temporal cortex in deaf compared to hearing participants during animacy decisions. This region, which forms auditory cortex in hearing people according to the sound listening task, was also activated in deaf participants when they observed sign language, but not when they observed non-verbal hand actions. These results indicate that conceptual processing in deaf people depends more strongly on language representations than in hearing people. Furthermore, additionally enhanced activation in visual and motor areas of deaf versus hearing participants during animacy decisions, and a more frequent report of visual and motor features in the property listing task, suggest that the loss of the auditory channel is partially compensated by an increased importance of visual and motor information for constituting object knowledge. Hence, our results indicate that conceptual processing in deaf compared to hearing people is more strongly based on the language system, complemented by an enhanced contribution of the visuo-motor system.
Project description: The ability to dance relies on the ability to synchronize movements to a perceived musical beat. Typically, beat synchronization is studied with auditory stimuli. However, in many typical social dancing situations, music can also be perceived as vibrations when objects that generate sounds also generate vibrations. This vibrotactile musical perception is of particular relevance for deaf people, who rely on non-auditory sensory information for dancing. In the present study, we investigated beat synchronization to vibrotactile electronic dance music in hearing and deaf people. We tested 7 deaf and 14 hearing individuals on their ability to bounce in time with the tempo of vibrotactile stimuli (no sound) delivered through a vibrating platform. The corresponding auditory stimuli (no vibrations) were used in an additional condition in the hearing group. We collected movement data using a camera-based motion capture system and subjected it to a phase-locking analysis to assess synchronization quality. The vast majority of participants were able to precisely time their bounces to the vibrations, with no difference in performance between the two groups. In addition, we found higher performance in the auditory condition than in the vibrotactile condition in the hearing group. Our results thus show that accurate tactile-motor synchronization in a dance-like context occurs regardless of auditory experience, though auditory-motor synchronization is of superior quality.
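Phase-locking analyses of this kind typically project each movement event onto the beat cycle and summarize synchronization quality with circular statistics. A minimal sketch on simulated bounce times (the tempo, jitter, and function name are illustrative, not the study's data or pipeline):

```python
import numpy as np

def phase_locking(event_times, beat_period):
    """Synchronization quality of movement events relative to an isochronous
    beat: map each event onto the beat cycle as a phase angle, then take the
    mean resultant vector (length 0 = no locking, 1 = perfect locking)."""
    phases = 2 * np.pi * (np.asarray(event_times) % beat_period) / beat_period
    resultant = np.mean(np.exp(1j * phases))
    return np.abs(resultant), np.angle(resultant)

# Hypothetical bounce times (s) around a 120-BPM beat (period 0.5 s),
# with small Gaussian timing jitter
rng = np.random.default_rng(0)
beats = np.arange(20) * 0.5
bounces = beats + rng.normal(0.0, 0.02, size=beats.size)

strength, mean_phase = phase_locking(bounces, 0.5)
print(f"locking strength = {strength:.2f}")  # close to 1 for tight synchrony
```

The resultant length gives a single per-participant synchronization score, and the mean phase angle additionally indicates whether movements tended to lead or lag the beat.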
Project description: Sensory areas of the cerebral cortex integrate sensory inputs with ongoing activity. We studied how the complete absence of auditory experience affects this process in a higher mammal model of complete sensory deprivation, the congenitally deaf cat. Cortical responses were elicited by intracochlear electric stimulation using cochlear implants in adult hearing controls and deaf cats. Additionally, in hearing controls, acoustic stimuli were used to assess the effect of stimulus mode (electric versus acoustic) on the cortical responses. We evaluated time-frequency representations of local field potentials recorded simultaneously in the primary auditory cortex and in a higher-order area, the posterior auditory field, known to be differentially involved in cross-modal (visual) reorganization in deaf cats. The results showed the appearance of evoked (phase-locked) responses at early latencies (<100 ms post-stimulus) and more abundant induced (non-phase-locked) responses at later latencies (>150 ms post-stimulus). In deaf cats, induced responses were substantially reduced in both overall power and duration in the two investigated fields. Additionally, a reduction of ongoing alpha band activity was found in the posterior auditory field (but not in primary auditory cortex) of deaf cats. The present study demonstrates that induced activity requires developmental experience and suggests that higher-order areas involved in cross-modal reorganization show more pronounced auditory deficits than primary areas.
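The evoked/induced distinction rests on phase-locking across trials: averaging preserves activity whose phase is locked to the stimulus and cancels activity whose phase varies from trial to trial. A minimal sketch of one common way to separate the two components (simulated single-trial epochs, not the study's data; the sampling rate and frequencies are illustrative):

```python
import numpy as np

def evoked_and_induced_power(trials):
    """Separate phase-locked (evoked) from non-phase-locked (induced) power.
    trials: array (n_trials, n_samples) of single-trial LFP epochs.
    Evoked power is taken from the across-trial average; induced power from
    what remains after subtracting that average from each trial."""
    avg = trials.mean(axis=0)
    evoked = np.abs(np.fft.rfft(avg)) ** 2
    residual = trials - avg
    induced = np.mean(np.abs(np.fft.rfft(residual, axis=1)) ** 2, axis=0)
    return evoked, induced

# Hypothetical check: a phase-locked 10 Hz component plus a 40 Hz component
# with random phase per trial; 1 s epochs sampled at 256 Hz
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(1)
trials = np.array([np.sin(2 * np.pi * 10 * t)                      # evoked
                   + np.sin(2 * np.pi * 40 * t + rng.uniform(0, 2 * np.pi))
                   for _ in range(50)])

evoked, induced = evoked_and_induced_power(trials)
freqs = np.fft.rfftfreq(fs, 1 / fs)
print(freqs[np.argmax(evoked)], freqs[np.argmax(induced)])  # 10.0 40.0
```

The phase-locked 10 Hz component survives averaging and dominates the evoked spectrum, while the random-phase 40 Hz component largely cancels in the average and appears only in the induced spectrum.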
Project description: The brain is adaptive. The speed of propagation through air, and of low-level sensory processing, differs markedly between auditory and visual stimuli; yet the brain can adapt to compensate for the resulting cross-modal delays. Studies investigating temporal recalibration to audiovisual speech have used prolonged adaptation procedures, suggesting that adaptation is sluggish. Here, we show that adaptation to asynchronous audiovisual speech occurs rapidly. Participants viewed a brief clip of an actor pronouncing a single syllable. The voice was either advanced or delayed relative to the corresponding lip movements, and participants were asked to make a synchrony judgement. Although we did not use an explicit adaptation procedure, we demonstrate rapid recalibration based on a single audiovisual event. We find that the point of subjective simultaneity on each trial is highly contingent upon the modality order of the preceding trial. We find compelling evidence that rapid recalibration generalizes across different stimuli and different actors. Finally, we demonstrate that rapid recalibration occurs even when the auditory and visual events clearly belong to different actors. These results suggest that rapid temporal recalibration to audiovisual speech is primarily mediated by basic temporal factors rather than by higher-order factors such as perceived simultaneity and source identity.
Project description: Evidence of visual-auditory cross-modal plasticity in deaf individuals has been widely reported. The superior visual abilities of deaf individuals have been shown to result in enhanced reactivity to visual events and/or enhanced peripheral spatial attention. The goal of this study was to investigate the association between visual-auditory cross-modal plasticity and speech perception in post-lingually deafened, adult cochlear implant (CI) users. Post-lingually deafened adults with CIs (N = 14) and a group of normal-hearing adult controls (N = 12) participated in this study. The CI participants were divided into a good performer group (good CI, N = 7) and a poor performer group (poor CI, N = 7) based on word recognition scores. Visual evoked potentials (VEPs) were recorded from the temporal and occipital cortex to assess reactivity. Visual field (VF) testing was used to assess spatial attention, and Goldmann perimetry measures were analyzed to identify differences in the VF across groups. The amplitude of the P1 VEP response over the right temporal or occipital cortex was compared across the three groups (control, good CI, poor CI). In addition, the association between the VF for different stimuli and word perception scores was evaluated. The P1 VEP amplitude recorded from the right temporal cortex was larger in the group of poorly performing CI users than in the group of good performers, whereas the P1 amplitude recorded from electrodes near the occipital cortex was smaller in the poorly performing group. P1 VEP amplitude over the right temporal lobe was negatively correlated with speech perception outcomes in the CI participants (r = -0.736, P = 0.003). However, P1 VEP amplitude measures recorded near the occipital cortex were positively correlated with speech perception outcome in the CI participants (r = 0.775, P = 0.001). In the VF analysis, CI users showed a narrowed central VF (VF to low-intensity stimuli), whereas their far peripheral VF (VF to high-intensity stimuli) did not differ from that of the controls. In addition, the extent of their central VF was positively correlated with speech perception outcome (r = 0.669, P = 0.009). Persistent visual activation of the right temporal cortex even after cochlear implantation thus has a negative effect on outcomes in post-lingually deaf adults. We interpret these results to suggest that insufficient intra-modal (visual) compensation by the occipital cortex may negatively affect outcomes. Based on our results, it appears that a narrowed central VF could help identify CI users likely to have poor outcomes with their device.