The neurophysiology of human biological motion processing: a high-density electrical mapping study.
ABSTRACT: The neural processing of biological motion (BM) is of profound experimental interest since it is often through the movement of another that we interpret their immediate intentions. Neuroimaging points to a specialized cortical network for processing biological motion. Here, high-density electrical mapping and source-analysis techniques were employed to interrogate the timing of information processing across this network. Participants viewed point-light displays depicting standard body movements (e.g. jumping), while event-related potentials (ERPs) were recorded and compared to ERPs to scrambled motion control stimuli. In a pair of experiments, three major phases of BM-specific processing were identified: 1) The earliest phase of BM-sensitive modulation was characterized by a positive shift of the ERP between 100 and 200 ms after stimulus onset. This modulation was observed exclusively over the right hemisphere, and source analysis suggested a likely generator in close proximity to regions associated with general motion processing (KO/hMT). 2) The second phase of BM sensitivity occurred from 200 to 350 ms, characterized by a robust negative-going ERP modulation over posterior middle temporal regions bilaterally. Source analysis pointed to bilateral generators at or near the posterior superior temporal sulcus (STS). 3) A third phase of processing was evident only in our second experiment, where participants actively attended the BM aspect of the stimuli, and was manifest as a centro-parietal positive ERP deflection, likely related to later cognitive processes. These results point to very early sensory registration of biological motion, and highlight the interactive role of the posterior STS in analyzing the movements of other living organisms.
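The core contrast in a design like this is the difference wave: subtracting the scrambled-motion ERP from the biological-motion ERP and locating when and where the conditions diverge. A minimal sketch on simulated data follows; the trial counts, channel layout, sampling rate, and the injected 100-200 ms right-hemisphere effect are all hypothetical stand-ins, not the study's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 40, 8, 300
times_ms = np.arange(n_times) * 2  # hypothetical 500 Hz sampling, 0-598 ms

# Simulated single-trial EEG for the two conditions (hypothetical data).
bm = rng.normal(0, 1, (n_trials, n_channels, n_times))
scrambled = rng.normal(0, 1, (n_trials, n_channels, n_times))

# Inject a positive BM-specific shift at 100-200 ms on one "right-hemisphere"
# channel (channel 0 here, purely for illustration).
win = (times_ms >= 100) & (times_ms <= 200)
bm[:, 0, win] += 1.5

# ERPs are trial averages; the BM effect is the condition difference wave.
erp_bm = bm.mean(axis=0)
erp_scr = scrambled.mean(axis=0)
diff_wave = erp_bm - erp_scr  # channels x time

# Channel and latency of the largest positive modulation.
ch, t = np.unravel_index(np.argmax(diff_wave), diff_wave.shape)
peak_latency_ms = times_ms[t]
```

With enough trials, averaging suppresses the trial-to-trial noise and the peak of the difference wave recovers the channel and latency of the injected effect.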
Project description:Event-related potential (ERP) studies have provided evidence for an allocation of attentional resources to enhance perceptual processing of motivationally salient stimuli. Emotional modulation affects several consecutive components associated with stages of affective-cognitive processing, beginning as early as 100-200 ms after stimulus onset. In agreement with the notion that the right parietotemporal region is critically involved during the perception of arousing affective stimuli, some ERP studies have reported asymmetric emotional ERP effects. However, it is difficult to separate emotional from non-emotional effects because differences in stimulus content unrelated to affective salience or task demands may also be associated with lateralized function or promote cognitive processing. Other concerns pertain to the operational definition and statistical independence of ERP component measures, their dependence on an EEG reference, and spatial smearing due to volume conduction, all of which impede the identification of distinct scalp activation patterns associated with affective processing. Building on prior research using a visual half-field paradigm with highly controlled emotional stimuli (pictures of cosmetic surgery patients showing disordered [negative] or healed [neutral] facial areas before or after treatment), 72-channel ERPs recorded from 152 individuals (ages 13-68 years; 81 female) were transformed into reference-free current source density (CSD) waveforms and submitted to temporal principal components analysis (PCA) to identify their underlying neuronal generator patterns. Using both nonparametric randomization tests and repeated measures ANOVA, robust effects of emotional content were found over parietooccipital regions for CSD factors corresponding to N2 sink (212 ms peak latency), P3 source (385 ms) and a late centroparietal source (630 ms), all indicative of greater positivity for negative than neutral stimuli.
For the N2 sink, emotional effects were right-lateralized and modulated by hemifield, with larger amplitude and asymmetry for left hemifield (right hemisphere) presentations. For all three factors, more positive amplitudes at parietooccipital sites were associated with increased ratings of negative valence and greater arousal. Distributed inverse solutions of the CSD-PCA-based emotional effects implicated a sequence of maximal activations in right occipitotemporal cortex, bilateral posterior cingulate cortex, and bilateral inferior temporal cortex. These findings are consistent with hierarchical activations of the ventral visual pathway reflecting subsequent processing stages in response to motivationally salient stimuli.
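The temporal-PCA step in this pipeline treats time points as the variables and extracts latent waveform factors from the covariance across observations. A reduced sketch on simulated CSD-like data is below; the two-factor structure, sampling rate, and factor latencies (chosen to mimic the reported N2 sink and P3 source) are hypothetical, and the Varimax rotation typically applied in such analyses is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_times = 200, 150  # observations = subjects x conditions x sites
times_ms = np.arange(n_times) * 4  # hypothetical 250 Hz sampling, 0-596 ms

# Simulate CSD waveforms as a mix of two latent temporal factors plus noise
# (stand-ins for an N2 sink near 212 ms and a P3 source near 385 ms).
def gauss(center_ms, width_ms):
    return np.exp(-0.5 * ((times_ms - center_ms) / width_ms) ** 2)

loadings_true = np.vstack([gauss(212, 40), gauss(385, 60)])  # 2 x times
scores_true = rng.normal(0, 1, (n_obs, 2))
data = scores_true @ loadings_true + rng.normal(0, 0.1, (n_obs, n_times))

# Temporal PCA: factor loadings are eigenvectors of the time x time covariance.
data_c = data - data.mean(axis=0)
cov = data_c.T @ data_c / (n_obs - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
factors = eigvecs[:, order[:2]].T  # 2 x times factor loadings
scores = data_c @ factors.T        # factor scores per observation

# Peak latency of each extracted factor loading.
peak_ms = sorted(times_ms[np.argmax(np.abs(factors), axis=1)])
```

Because the simulated factors barely overlap in time, the unrotated eigenvectors already peak near the latencies that were mixed in; with real, overlapping components, rotation is what makes the factors interpretable.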
Project description:The N170 ERP component has been widely identified as a face-sensitive neural marker. Despite extensive investigations conducted to examine the neural sources of N170, two issues remain in the prior literature: (a) few studies used individualized anatomy as the head model for cortical source analysis of the N170, and (b) the relationship between the N170 and face-selective regions from fMRI studies is unclear. Here, we addressed these questions by presenting pictures of faces and houses to the same group of healthy adults and recording structural MRI, fMRI, and high-density ERPs in separate sessions. Source analysis based on each participant's anatomy showed that the middle and posterior fusiform gyri were the primary neural sources for the face-sensitive aspects of the N170. Source analysis based on regions of interest from the fMRI revealed that the fMRI-defined fusiform face area was the major contributor to the N170. The current study suggests that the fusiform gyrus is a major neural contributor to the N170 ERP component and provides further insights about the spatiotemporal characteristics of face processing.
Project description:The abilities of infants to perceive basic acoustic differences, essential for language development, can be studied using auditory event-related potentials (ERPs). However, scalp-channel averaged ERPs sum volume-conducted contributions from many cortical areas, reducing the functional specificity and interpretability of channel-based ERP measures. This study represents the first attempt to investigate rapid auditory processing in infancy using independent component analysis (ICA), allowing exploration of source-resolved ERP dynamics and identification of ERP cortical generators. Here, we recorded 60-channel EEG data in 34 typically developing 6-month-old infants during a passive acoustic oddball paradigm presenting 'standard' tones interspersed with frequency- or duration-deviant tones. ICA decomposition was applied to single-subject EEG data. The best-fitting equivalent dipole or bilaterally symmetric dipole pair was then estimated for each resulting independent component (IC) process using a four-layer infant head model. Similar brain-source ICs were clustered across subjects. Results showed ERP contributions from auditory cortex and multiple extra-auditory cortical areas (often, bilaterally paired). Different cortical source combinations contributed to the frequency- and duration-deviant ERP peak sequences. For ICs in an ERP-dominant source cluster located in or near the mid-cingulate cortex, source-resolved frequency-deviant response N2 latency and P3 amplitude at 6 months of age predicted vocabulary size at 20 months of age. The same measures for scalp channel F6 (though not for other frontal channels) showed similar but weaker correlations. These results demonstrate the significant potential of ICA analyses to facilitate a deeper understanding of the neural substrates of infant sensory processing.
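The decomposition step here, unmixing scalp channels into statistically independent source activations, can be illustrated with a minimal FastICA iteration on a toy two-source mixture. The sources, mixing matrix, and sampling rate below are hypothetical; real infant-EEG pipelines use full toolbox implementations (e.g. EEGLAB's runica) followed by the dipole fitting described above.

```python
import numpy as np

rng = np.random.default_rng(2)
n_times = 5000
t = np.arange(n_times) / 250.0  # hypothetical 250 Hz recording, 20 s

# Two simulated, non-Gaussian source activations (toy stand-ins for ICs).
s1 = np.sign(np.sin(2 * np.pi * 5 * t))
s2 = np.sin(2 * np.pi * 9 * t) ** 3
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # mixing matrix = scalp projection
X = A @ S                               # 2 "channels" x time

# Whiten the channel data.
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / n_times
d, E = np.linalg.eigh(cov)
Z = (E @ np.diag(d ** -0.5) @ E.T) @ Xc

# Minimal FastICA (tanh nonlinearity, symmetric decorrelation): find the
# unmixing matrix W so that W @ Z approximates the sources.
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = G @ Z.T / n_times - np.diag((1 - G ** 2).mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)  # symmetric orthogonalization
    W = u @ vt
recovered = W @ Z

# Each recovered component should correlate (up to sign and order) with
# exactly one of the original sources.
corr = np.corrcoef(np.vstack([recovered, S]))[:2, 2:]
best_match = np.abs(corr).max(axis=1)
```

In the real pipeline each IC additionally comes with a scalp map (a column of the estimated mixing matrix), and it is that map which is fed to the equivalent-dipole fit against the infant head model.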
Project description:The aim of this study was to examine specialized face processing in forty-eight 4.5- to 7.5-month-old infants by recording event-related potentials (ERPs) in response to faces and toys, and to determine the cortical sources of these signals using realistic, age-appropriate head models. All ERP components (i.e., N290, P400, Nc) showed greater amplitude during periods of attention than inattention. Amplitude was greater to faces than toys during attention at the N290, and greater to toys at the P400. Cortical source analysis revealed activity in occipital-temporal brain areas as the source of the N290, particularly the middle fusiform gyrus. The Nc and P400 were the result of activation in midline frontal and parietal, anterior temporal, and posterior temporal and occipital brain areas.
Project description:In an everyday social interaction we automatically integrate another's facial movements and vocalizations, be they linguistic or otherwise. This requires audiovisual integration of a continual barrage of sensory input, a phenomenon previously well studied with human audiovisual speech, but not with non-verbal vocalizations. Using both fMRI and ERPs, we assessed neural activity to viewing and listening to an animated female face producing non-verbal, human vocalizations (i.e. coughing, sneezing) under audio-only (AUD), visual-only (VIS) and audiovisual (AV) stimulus conditions, alternating with Rest (R). Underadditive effects occurred in regions dominant for sensory processing, which showed AV activation greater than the dominant modality alone. Right posterior temporal and parietal regions showed an AV maximum in which AV activation was greater than either modality alone, but not greater than the sum of the unisensory conditions. Other frontal and parietal regions showed common-activation, in which AV activation was the same as one or both unisensory conditions. ERP data showed an early superadditive effect (AV > AUD + VIS, no rest), mid-range underadditive effects for the auditory N140 and face-sensitive N170, and late AV maximum and common-activation effects. Based on convergence between fMRI and ERP data, we propose a mechanism whereby a multisensory stimulus may be signaled or facilitated as early as 60 ms and facilitated in sensory-specific regions by increasing processing speed (at N170) and efficiency (decreasing amplitude in auditory and face-sensitive cortical activation and ERPs). Finally, higher-order processes are also altered, but in a more complex fashion.
Project description:The Implicit Association Test (IAT) is a reaction-time-based categorization task that measures the differential associative strength between bipolar targets and evaluative attribute concepts as an approach to indexing implicit beliefs or biases. An open question exists as to what exactly the IAT measures, and here EEG (electroencephalography) has been used to investigate the time course of ERP (event-related potential) indices and implicated brain regions in the IAT. IAT-EEG research identifies a number of early (250-450 ms) negative ERPs indexing early (pre-response) processing stages of the IAT. ERP activity in this time range is known to index processes related to cognitive control and semantic processing. A central focus of these efforts has been to use IAT-ERPs to delineate the implicit and explicit factors contributing to measured IAT effects. Increasing evidence indicates that cognitive control (and related top-down modulation of attention/perceptual processing) may be a component in the effective measurement of IAT effects, as factors such as physical setting or task instruction can change an IAT measurement. In this study we further implicate the role of proactive cognitive control and top-down modulation of attention/perceptual processing in the IAT-EEG. We find statistically significant relationships between D-score (a reaction-time-based measure of the IAT effect) and early ERP time windows, indicating that, where the more rapid word categorizations driving the IAT effect are present, they are at least partly explainable by neural activity not significantly correlated with the IAT measurement itself. Using LORETA, we identify a number of brain regions driving these ERP-IAT relationships, notably involving left-temporal, insular, cingulate, medial frontal and parietal cortex in time windows corresponding to N2- and P3-related activity.
The identified brain regions involved with reduced reaction times on congruent blocks coincide with those of previous studies.
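The D-score referenced above is, in standard scoring, the block mean latency difference scaled by response variability. A simplified version can be sketched as follows; the error-trial penalties and latency trimming of the full Greenwald et al. (2003) algorithm are omitted, and the reaction-time values are hypothetical.

```python
import numpy as np

def d_score(congruent_rts, incongruent_rts):
    """Simplified IAT D-score: block mean RT difference divided by the
    pooled standard deviation of all latencies from both blocks (after
    Greenwald et al., 2003; error penalties and trimming omitted)."""
    congruent_rts = np.asarray(congruent_rts, dtype=float)
    incongruent_rts = np.asarray(incongruent_rts, dtype=float)
    pooled_sd = np.concatenate([congruent_rts, incongruent_rts]).std(ddof=1)
    return (incongruent_rts.mean() - congruent_rts.mean()) / pooled_sd

# Hypothetical reaction times (ms): slower responding on incongruent blocks
# yields a positive D-score, the behavioral IAT effect.
rng = np.random.default_rng(3)
congruent = rng.normal(700, 100, 40)
incongruent = rng.normal(850, 120, 40)
effect = d_score(congruent, incongruent)
```

Dividing by the pooled standard deviation rather than using the raw millisecond difference is what makes D-scores comparable across participants with very different overall response speeds.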
Project description:Event-related potential (ERP) and other functional imaging studies often demonstrate age-related increases in anterior neural activity and decreases in posterior activity while subjects carry out task demands. It remains unclear whether this "anterior shift" is limited to late cognitive operations like those indexed by the P3 component, or is evident during other stages of information processing. The temporal resolution of ERPs provided an opportunity to address this issue. Temporospatial principal component analysis (PCA) was used to identify underlying components that may be obscured by overlapping ERP waveforms. ERPs were measured during a visual oddball task in 26 young, 26 middle-aged, and 29 old subjects who were well-matched for IQ, executive function, education, and task performance. PCA identified six anterior factors peaking between ~140 ms and 810 ms, and four posterior factors peaking between ~300 ms and 810 ms. There was an age-related increase in the amplitude of anterior factors between ~200 and 500 ms, and an age-associated decrease in amplitude of posterior factors after ~500 ms. The increase in anterior processing began as early as middle-age, was sustained throughout old age, and appeared to be linear in nature. These results suggest that age-associated increases in anterior activity occur after early sensory processing has taken place, and are most prominent during a period in which attention is being marshaled to evaluate a stimulus. In contrast, age-related decreases in posterior activity manifest during operations involved in stimulus categorization, post-decision monitoring, and preparation for an upcoming event.
Project description:The perception of actions underwrites a wide range of socio-cognitive functions. Previous neuroimaging and lesion studies identified several components of the brain network for visual biological motion (BM) processing, but interactions among these components and their relationship to behavior remain little understood. Here, using a recently developed integrative analysis of structural and effective connectivity derived from high angular resolution diffusion imaging (HARDI) and functional magnetic resonance imaging (fMRI), we assess the cerebro-cerebellar network for processing of camouflaged point-light BM. Dynamic causal modeling (DCM) informed by probabilistic tractography indicates that the right superior temporal sulcus (STS) serves as an integrator within the temporal module. However, the STS does not appear to be a "gatekeeper" in the functional integration of the occipito-temporal and frontal regions: The fusiform gyrus (FFG) and middle temporal cortex (MTC) are also connected to the right inferior frontal gyrus (IFG) and insula, indicating multiple parallel pathways. BM-specific loops of effective connectivity are seen between the left lateral cerebellar lobule Crus I and right STS, as well as between the left Crus I and right insula. The prevalence of a structural pathway between the FFG and STS is associated with better BM detection. Moreover, a canonical variate analysis shows that the visual sensitivity to BM is best predicted by BM-specific effective connectivity from the FFG to STS and from the IFG, insula, and STS to the early visual cortex. Overall, the study characterizes the architecture of the cerebro-cerebellar network for BM processing and offers prospects for assessing the social brain.
Project description:Event-related potentials (ERPs) are used extensively to investigate the neural mechanisms of attention control and selection. The univariate ERP approach, however, has left important questions inadequately answered. We addressed two questions by applying multivariate pattern classification to multichannel ERPs in two cued visual spatial attention experiments (N = 56): (a) the impact of cueing strategies (instructional vs. probabilistic) on attention control and selection and (b) the neural and behavioral effects of individual differences. Following cue onset, the decoding accuracy (cue left vs. cue right) began to rise above chance level earlier and remained higher in instructional cueing (~80 ms) than in probabilistic cueing (~160 ms), suggesting that unilateral attention focus leads to earlier and more distinct formation of the attention control set. A similar temporal sequence was also found for target-related processing (cued target vs. uncued target), suggesting earlier and stronger attention selection under instructional cueing. Across the two experiments: (a) individuals with higher cue-related decoding accuracy showed higher magnitude of attentional modulation of target-evoked N1 amplitude, suggesting that better formation of anticipatory attentional state leads to stronger modulation of target processing, and (b) individuals with higher target-related decoding accuracy showed faster reaction times (or larger cueing effects), suggesting that stronger selection of task-relevant information leads to better behavioral performance. Taken together, multichannel ERPs combined with machine-learning decoding yield new insights into attention control and selection that complement the univariate ERP approach, and together the two provide a more comprehensive methodology for the study of visual spatial attention.
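The time-resolved decoding logic described above, training a classifier at each time point and asking when condition information first becomes decodable, can be sketched with a nearest-class-mean classifier on simulated trials. The channel count, latencies, and the 80 ms effect onset below are hypothetical; published analyses typically use regularized classifiers with proper cross-validation and permutation-based significance thresholds rather than the fixed accuracy cutoff used here.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials, n_channels, n_times = 128, 16, 100
times_ms = np.arange(n_times) * 4 - 100  # -100 to 296 ms around cue onset
labels = np.repeat([0, 1], n_trials // 2)  # cue left vs. cue right

# Simulated trials: a multichannel condition-specific pattern appears at 80 ms.
X = rng.normal(0, 1, (n_trials, n_channels, n_times))
pattern = rng.normal(0, 1, n_channels)
signal = np.zeros_like(X)
signal[labels == 1] = np.outer(pattern, times_ms >= 80)
X += signal

# Split trials into train/test halves; decode each time point separately
# with a nearest-class-mean classifier.
train = np.arange(n_trials) % 2 == 0
test_set = ~train
acc = np.empty(n_times)
for ti in range(n_times):
    Xtr, Xte = X[train, :, ti], X[test_set, :, ti]
    m0 = Xtr[labels[train] == 0].mean(axis=0)
    m1 = Xtr[labels[train] == 1].mean(axis=0)
    pred = (np.linalg.norm(Xte - m1, axis=1)
            < np.linalg.norm(Xte - m0, axis=1)).astype(int)
    acc[ti] = (pred == labels[test_set]).mean()

# Earliest latency at which decoding clearly exceeds the 0.5 chance level.
first_decodable_ms = times_ms[acc > 0.75].min()
```

Before the simulated onset the classifier hovers around chance; from 80 ms on, accuracy jumps, so the onset of above-chance decoding recovers the latency at which the conditions first differ, which is exactly the quantity compared between the instructional and probabilistic cueing conditions above.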
Project description:Biases in attention towards facial cues during infancy may have an important role in the development of social brain networks. The current study used a longitudinal design to examine the stability of infants' attentional biases towards facial expressions and to elucidate how these biases relate to emerging cortical sensitivity to facial expressions. Event-related potential (ERP) and attention disengagement data were acquired in response to the presentation of fearful, happy, neutral, and phase-scrambled face stimuli from the same infants at 5 and 7 months of age. The tendency to disengage from faces was highly consistent across both ages. However, the modulation of this behavior by fearful facial expressions was uncorrelated between 5 and 7 months. In the ERP data, fear-sensitive activity was observed over posterior scalp regions, starting at the latency of the N290 wave. The scalp distribution of this sensitivity to fear in ERPs was dissociable from the topography of face-sensitive modulation within the same latency range. While attentional bias scores were independent of co-registered ERPs, attention bias towards fearful faces at 5 months of age predicted the fear-sensitivity in ERPs at 7 months of age. The current results suggest that the attention bias towards fear could be involved in the developmental tuning of cortical networks for social signals of emotion.