Emotional attention capture by facial expressions.
ABSTRACT: Previous studies have shown that emotional facial expressions capture visual attention. However, it has been unclear whether this attentional modulation is attributable to their emotional significance or to their visual features. We investigated this issue using a spatial cueing paradigm in which non-predictive cues were presented peripherally before a target appeared in either the same (valid trial) or the opposite (invalid trial) location. The target was an open dot, and the cues were photographs of normal emotional facial expressions of anger and happiness, their anti-expressions, and neutral expressions. Anti-expressions contained visual changes equivalent in magnitude to those that distinguish normal emotional expressions from neutral expressions, but they were usually perceived as emotionally neutral. Participants were asked to localize the target as quickly as possible. After the cueing task, they rated their subjective emotional experiences in response to the cue stimuli. Compared with anti-expressions, normal emotional expressions decreased reaction times (RTs) in the valid trials and increased them in the invalid trials. Shorter RTs in the valid trials and longer RTs in the invalid trials were both related to higher subjective arousal ratings. These results suggest that emotional facial expressions accelerate attentional engagement and prolong attentional disengagement because of their emotional significance.
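The engagement and disengagement effects described above reduce to two RT contrasts: a valid-trial speedup and an invalid-trial slowdown for normal expressions relative to anti-expressions. Below is a minimal Python sketch of that analysis logic; the function name and the mean RT values are illustrative assumptions, not data from the study.

```python
# Sketch of the cue-validity analysis described above. The mean RTs (ms)
# are illustrative placeholders, not values reported in the study.

def validity_effects(rt):
    """rt: dict mapping (cue_type, trial_type) -> mean RT in ms.

    Engagement benefit: how much faster valid trials are with normal
    emotional cues than with anti-expression cues.
    Disengagement cost: how much slower invalid trials are with normal
    emotional cues than with anti-expression cues.
    """
    engagement_benefit = rt[("anti", "valid")] - rt[("normal", "valid")]
    disengagement_cost = rt[("normal", "invalid")] - rt[("anti", "invalid")]
    return engagement_benefit, disengagement_cost

rts = {
    ("normal", "valid"): 312.0, ("normal", "invalid"): 348.0,
    ("anti", "valid"): 321.0,   ("anti", "invalid"): 339.0,
}
benefit, cost = validity_effects(rts)
```

Positive values for both contrasts would correspond to the reported pattern: faster attentional engagement and slower disengagement for emotional expressions.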
Project description: Background: Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear. Methodology/Principal findings: We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males. Conclusion: Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions.
Project description:Emotional stimuli have been shown to modulate attentional orienting through signals sent by subcortical brain regions that modulate visual perception at early stages of processing. Fewer studies, however, have investigated a similar effect of emotional stimuli on attentional orienting in the auditory domain together with the brain regions underlying such attentional modulation, which is the general aim of the present study. We therefore used an original auditory dot-probe paradigm involving simultaneously presented neutral and angry non-speech vocal utterances lateralized to either the left or the right auditory space, immediately followed by a short, lateralized single sine-wave tone presented in the same space as the preceding angry voice (valid trial) or in the opposite space (invalid trial). Behavioral results showed the expected facilitation effect for target detection during valid trials, while functional data showed greater activation in the middle and posterior superior temporal sulci (STS) and in the medial frontal cortex for valid vs. invalid trials. Using reaction time facilitation [the absolute value of the Z-score of valid - (invalid + neutral)] as a group covariate revealed additional enhanced activity in the amygdalae, auditory thalamus, and visual cortex. Taken together, our results suggest the involvement of a large and distributed network of regions among which the STS, thalamus, and amygdala are crucial for decoding angry prosody, as well as for orienting and maintaining attention within an auditory space that was previously primed by a vocal emotional event.
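The bracketed group covariate above, the absolute value of the Z-score of each participant's valid - (invalid + neutral) RT contrast, can be computed as follows. This is a hedged sketch only: the function name and sample values are assumptions, and the study's exact standardization details may differ.

```python
# Illustrative computation of a |z|-scored RT-facilitation covariate,
# standardized across the group; the contrast values below are synthetic.
import statistics

def facilitation_index(per_subject_contrasts):
    """per_subject_contrasts: one valid - (invalid + neutral) RT contrast
    per participant. Returns the absolute Z-score for each participant,
    standardized against the group mean and sample standard deviation."""
    mu = statistics.mean(per_subject_contrasts)
    sd = statistics.stdev(per_subject_contrasts)
    return [abs((c - mu) / sd) for c in per_subject_contrasts]

zs = facilitation_index([10.0, 20.0, 30.0, 40.0])
```

Each participant's |z| value could then enter the group-level fMRI model as a covariate of interest.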
Project description:Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.
Project description:Functional neuroimaging studies of episodic recognition demonstrate an increased lateral parietal response for studied versus new materials, often termed a retrieval success effect. Using a novel memory analog of attentional cueing, we manipulated the correspondence between anticipated and actual recognition evidence by presenting valid or invalid anticipatory cues (e.g., "likely old") before recognition judgments. Although a superior parietal region demonstrated the retrieval success pattern, a larger inferior parietal lobule (IPL) region tracked the validity of the memory cueing (invalid cueing > valid cueing) and no retrieval success-sensitive lateral parietal region was insensitive to cueing. The invalid cueing response occurred even for correctly identified new items unlikely to trigger substantive episodic retrieval. Within the IPL, although supramarginal and angular gyrus (SMG; AG) regions both demonstrated invalid cueing amplitude elevations, each region differentially coupled with distinct cortical networks when unexpectedly old items were encountered; a connectivity pattern also observed at rest in the same subjects. These findings jointly suggest that the lateral parietal response during recognition does not signify the recovery of episodic content, but is a marker of the violation of memory expectations. A second independent dataset confirmed this interpretation by demonstrating that SMG activation tracked the decision biases of observers, not their accuracy, with increased activation for nondominant recognition judgments. The expectancy violation interpretation of the lateral parietal recognition response is consistent with the literature on visual search and oddball paradigms and suggests that damage to these regions should impair memory-linked orienting behavior and not retrieval per se.
Project description:Distinct attentional mechanisms enhance the sensory processing of visual stimuli that appear at task-relevant locations and have task-relevant features. We used a combination of psychophysics and computational modeling to investigate how these two types of attention, spatial and feature-based, interact to modulate sensitivity when combined in one task. Observers monitored overlapping groups of dots for a target change in color saturation, which they had to localize as being in the upper or lower visual hemifield. Pre-cues indicated the target's most likely location (left/right), color (red/green), or both location and color. We measured sensitivity (d') for every combination of the location cue and the color cue, each of which could be valid, neutral, or invalid. When three competing saturation changes occurred simultaneously with the target change, there was a clear interaction: the spatial cueing effect was strongest for the cued color, and the color cueing effect was strongest at the cued location. In a second experiment, only the target dot group changed saturation, such that stimulus competition was low. The resulting cueing effects were statistically independent and additive: the color cueing effect was equally strong at attended and unattended locations. We account for these data with a computational model in which spatial and feature-based attention independently modulate the gain of sensory responses, consistent with measurements of cortical activity. Multiple responses then compete via divisive normalization. Sufficient competition creates interactions between the two cueing effects even though the attentional systems are themselves independent. This model helps reconcile seemingly disparate behavioral and physiological findings.
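The model class described in the last paragraph, independent multiplicative attentional gains followed by divisive normalization, can be sketched in a few lines. All gain values and the normalization constant below are illustrative assumptions, not the study's fitted parameters.

```python
# Minimal sketch: spatial and feature-based attention independently scale
# stimulus drive; responses then compete via divisive normalization.
# Gains, drives, and sigma are illustrative assumptions.
import numpy as np

def normalized_responses(drive, spatial_gain, feature_gain, sigma=1.0):
    """drive and the two gain arrays index the competing stimuli
    (e.g., the four location x color dot groups). Returns responses
    after divisive normalization by the pooled excitatory drive."""
    excitatory = drive * spatial_gain * feature_gain
    return excitatory / (sigma + excitatory.sum())

drive = np.ones(4)                        # four competing dot groups
spatial = np.array([2.0, 2.0, 1.0, 1.0])  # boost at the cued location (groups 0, 1)
feature = np.array([2.0, 1.0, 2.0, 1.0])  # boost for the cued color (groups 0, 2)
r = normalized_responses(drive, spatial, feature)
```

Even though the two gains are applied independently, the shared normalization pool makes the spatial cueing effect (r[0] - r[2]) larger than it is for the uncued color (r[1] - r[3]), qualitatively reproducing the reported interaction under high competition.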
Project description:Studies have revealed that catechol-O-methyltransferase (COMT) and the dopamine D2 receptor (DRD2) modulate human attentional bias for palatable food or tobacco. However, existing evidence about the modulation of attentional bias for facial expressions by COMT and DRD2 remains limited. In this study, 650 college students were genotyped for the COMT Val158Met and DRD2 TaqIA polymorphisms, and attentional bias for facial expressions was assessed using a spatial cueing task. The results indicated that COMT Val158Met underpinned individual differences in attentional bias for negative emotional expressions (P = 0.03): Met carriers showed a greater engagement bias for negative expressions than Val/Val homozygotes. In contrast, DRD2 TaqIA underpinned individual differences in attentional bias for positive expressions (P = 0.003): individuals with the TT genotype showed a greater engagement bias for positive expressions than individuals with the CC genotype. Moreover, the two genes showed significant interaction effects on engagement bias for negative and positive expressions (P = 0.046 and P = 0.005, respectively). These findings suggest that individual differences in attentional bias for emotional expressions are partially underpinned by genetic polymorphisms in COMT and DRD2.
Project description:Reorienting of voluntary attention enables the processing of stimuli at previously unattended locations. Although studies have identified a ventral fronto-parietal network underlying attentional reorienting [1, 2], little is known about whether and how early visual areas are involved in involuntary reorienting [3, 4], even less so in voluntary reorienting, and their temporal dynamics are unknown. We used transcranial magnetic stimulation (TMS) over the occipital cortex to interfere with attentional reorienting and to study its role and temporal dynamics in this process. Human observers performed an orientation discrimination task, with either valid or invalid attention cueing, across a range of stimulus contrasts. Valid cueing induced a behavioral response gain increase: higher asymptotic performance for attended than for unattended locations. During subsequent TMS sessions, observers performed the same task at high stimulus contrast. Based on phosphene mapping, TMS double pulses were applied at one of various delays to a consistent brain location in retinotopic areas (V1/V2), corresponding to the evoked signal of the target or distractor, in a valid or invalid trial. Thus, the stimulation was identical across the four experimental conditions (valid/invalid cue condition × target/distractor stimulated). TMS modulation of target and distractor processing was periodic (5 Hz, theta), and the two were out of phase with respect to each other only in invalid trials, when attention had to be disengaged from the distractor and reoriented to the target location. Reorienting of voluntary attention thus periodically involves V1/V2 at the theta frequency. These results suggest that TMS probes theta phase-reset by attentional reorienting and help link periodic sampling in time with attentional reorienting in space.
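One standard way to test for theta-band periodicity in a behavioral TMS modulation, consistent with the 5 Hz result above, is a spectral analysis of performance across TMS delays. The sketch below only illustrates the analysis idea; the 25 ms delay spacing and the synthetic accuracy values are assumptions, not the study's design or data.

```python
# Spectral analysis of a behavioral time course sampled at TMS delays.
# The delay spacing (25 ms) and accuracy values are synthetic.
import numpy as np

def spectrum(performance, dt):
    """performance: behavioral measure sampled at delays spaced dt seconds
    apart. Returns (frequencies in Hz, amplitude spectrum) after removing
    the mean so the DC component does not dominate."""
    demeaned = performance - performance.mean()
    amp = np.abs(np.fft.rfft(demeaned))
    freqs = np.fft.rfftfreq(len(performance), d=dt)
    return freqs, amp

# a synthetic 5 Hz modulation sampled every 25 ms over 400 ms (16 delays)
t = np.arange(16) * 0.025
acc = 0.8 + 0.05 * np.sin(2 * np.pi * 5.0 * t)
freqs, amp = spectrum(acc, dt=0.025)
peak = freqs[np.argmax(amp)]
```

With a true 5 Hz modulation the spectral peak falls at 5 Hz; in practice, significance of such a peak would be assessed against a surrogate (e.g., shuffled-delay) distribution.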
Project description:Inhibition of return (IOR) is the reduction of detection speed and/or detection accuracy of a target in a recently attended location. This phenomenon, which has been discovered and studied thoroughly in humans, is believed to reflect a brain mechanism for controlling the allocation of spatial attention in a manner that enhances efficient search. Findings showing that IOR is robust, apparent at a very early age, and seemingly dependent on midbrain activity suggest that IOR is a universal attentional mechanism in vertebrates. However, studies in non-mammalian species are scarce. To explore this hypothesis comparatively, we tested for IOR in barn owls (Tyto alba) using the classical Posner cueing paradigm. Two barn owls were trained to initiate a trial by fixating on the center of a computer screen and then turning their gaze to the location of a target. A short, non-informative cue appeared before the target, either at the location where the target would appear (valid) or at a different location (invalid). In one barn owl, the response times (RTs) to valid targets relative to invalid targets shifted from facilitation (lower RTs) to inhibition (higher RTs) as the time lag between the cue and the target increased. The second owl mostly failed to maintain fixation and responded to the cue before target onset. However, when only the trials in which this owl maintained fixation were included in the analysis, inhibition in the valid trials could be detected. To search for the neural correlates of IOR, we recorded multiunit responses in the optic tectum (OT) of four head-fixed owls passively viewing a cueing paradigm as in the behavioral experiments. At short cue-to-target lags (<100 ms), neural responses to the target in the receptive field (RF) were usually enhanced if the cue had appeared earlier inside the RF (valid) and suppressed if the cue had appeared earlier outside the RF (invalid).
This was reversed at longer lags: neural responses were suppressed in the valid conditions and were unaffected in the invalid conditions. The findings support the notion that IOR is a basic mechanism in the evolution of vertebrate behavior and suggest that the effect appears as a result of the interaction between lateral and forward inhibition in the tectal circuitry.
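The behavioral signature described above, facilitation at short cue-target lags turning into IOR at long lags, reduces to the sign of the cueing effect (invalid minus valid RT). A minimal sketch, with placeholder RTs rather than the owls' data:

```python
# Cueing effect in a Posner paradigm: invalid minus valid mean RT.
# Positive -> facilitation at the cued location; negative -> IOR.
# The RT values (ms) below are illustrative placeholders.

def cueing_effect(rt_valid, rt_invalid):
    """Return the cueing effect in ms for one cue-to-target lag."""
    return rt_invalid - rt_valid

short_lag = cueing_effect(rt_valid=260.0, rt_invalid=290.0)  # facilitation
long_lag = cueing_effect(rt_valid=310.0, rt_invalid=285.0)   # inhibition (IOR)
```

Plotting this effect as a function of cue-to-target lag yields the classic crossover from positive (facilitation) to negative (IOR) values.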
Project description:In three experiments, we tested whether the amount of attentional resources needed to process a face displaying a neutral, angry, or fearful facial expression with direct or averted gaze depends on task instructions and face presentation. To this end, we used a Rapid Serial Visual Presentation paradigm in which participants in Experiment 1 were first explicitly asked to discriminate whether the expression of a target face (T1) with direct or averted gaze was angry or neutral, and then to judge the orientation of a landscape (T2). Experiment 2 was identical to Experiment 1 except that participants had to discriminate the gender of the T1 face, and fearful faces were also presented randomly intermixed within each block of trials. Experiment 3 differed from Experiment 2 only in that angry and fearful faces were never presented within the same block. The findings indicated that the presence of the attentional blink (AB) for face stimuli depends on specific combinations of gaze direction and emotional facial expression, and crucially revealed that contextual factors (e.g., explicit instruction to process the facial expression and the presence of other emotional faces) can modify and even reverse the AB, suggesting a flexible and more contextualized deployment of attentional resources in face processing.
Project description:The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high and low neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions, within a crowd of neutral expressions. Anti-expressions contained visual changes equivalent in amount to those found in normal expressions relative to neutral expressions, but they were usually recognized as neutral expressions. Subjective emotional ratings in response to each facial expression stimulus were also obtained. Participants with high neuroticism showed an overall delay in the detection of target facial expressions compared to participants with low neuroticism. Additionally, the high-neuroticism group showed higher levels of arousal to facial expressions than the low-neuroticism group. These data suggest that neuroticism modulates the detection of emotional facial expressions in healthy participants; high levels of neuroticism delay the overall detection of facial expressions and enhance emotional arousal in response to facial expressions.