Preserved Proactive Control in Ageing: A Stroop Study With Emotional Faces vs. Words.
ABSTRACT: Previous studies of age-related changes in proactive control have been inconclusive, and the effects of emotion on proactive control in ageing remain to be determined. Here, we assessed the role of task-relevant emotion in proactive control in younger and older adults. Proactive control was manipulated by varying the proportion of conflict trials in an emotional Stroop task. In Experiment 1, emotional target faces with congruent, incongruent, or non-word distractor labels were used to assess proactive control in younger and older adults. To investigate whether the effects of emotion are consistent across stimulus types, emotional target words with congruent, incongruent, or obscured distractor faces were used in Experiment 2. Data from this study showed that older adults successfully deployed proactive control when needed and that task-relevant emotion affected cognitive control similarly in both age groups. The effects of emotion on cognitive performance were also qualitatively different for faces and words, with facilitation observed for happy faces and for negative words. Overall, these results suggest that the effects of emotion and age on proactive control depend on the task at hand and the chosen stimulus set.
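The proportion-congruent manipulation described above rests on simple arithmetic: the Stroop interference effect (incongruent minus congruent mean reaction time) should shrink in blocks where conflict trials are frequent, because participants engage proactive control in advance. A minimal sketch with fabricated reaction times (the block names and numbers are illustrative assumptions, not the study's data):

```python
# Hypothetical illustration of the proportion-congruent logic used to index
# proactive control: interference (incongruent - congruent RT) should shrink
# when conflict trials are frequent. All numbers are fabricated.

from statistics import mean

# trials: (block_type, congruency, reaction_time_ms)
trials = [
    ("mostly_congruent", "congruent", 610), ("mostly_congruent", "congruent", 630),
    ("mostly_congruent", "incongruent", 720), ("mostly_congruent", "incongruent", 740),
    ("mostly_incongruent", "congruent", 640), ("mostly_incongruent", "congruent", 650),
    ("mostly_incongruent", "incongruent", 680), ("mostly_incongruent", "incongruent", 690),
]

def interference(block):
    """Mean incongruent RT minus mean congruent RT within one block type."""
    rts = {"congruent": [], "incongruent": []}
    for b, cond, rt in trials:
        if b == block:
            rts[cond].append(rt)
    return mean(rts["incongruent"]) - mean(rts["congruent"])

mc = interference("mostly_congruent")    # large effect: control is relaxed
mi = interference("mostly_incongruent")  # small effect: proactive control engaged
print(mc, mi)  # 110.0 40.0
```

A smaller interference effect in the mostly-incongruent block, in both age groups, is the signature the abstract reports as preserved proactive control in older adults.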
Project description: Is the gender-emotion stereotype a purely top-down processing phenomenon, or do background cues related to stereotypes also contribute to cognitive processing? In the present study, we measured the gender-emotion stereotypes of 57 undergraduates with a face recall task and found that, regardless of whether the emotional expressions of distractors were congruent or incongruent with targets, people tended to misperceive the fearful faces of men as angry and the angry faces of women as fearful. The effect was significantly larger in the distractor-incongruent condition. A revised process-dissociation procedure analysis confirmed that automatic and controlled processing each make independent contributions to gender-emotion stereotypes. This finding supports a dual-processing perspective on stereotypes and contributes to future research in both theory and methodology.
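The process-dissociation procedure mentioned above separates controlled and automatic contributions from response probabilities in inclusion and exclusion conditions. The study uses a revised variant whose exact equations are not given here; the classic Jacoby (1991) formulation, shown below as a simplified sketch, conveys the logic:

```python
# Classic process-dissociation estimates (Jacoby, 1991). The study used a
# *revised* PDP that may differ; this is only the standard textbook form.

def pdp_estimates(p_inclusion: float, p_exclusion: float):
    """Return (controlled, automatic) processing estimates.

    p_inclusion: probability of the stereotype-consistent response when it
                 matches the intended response (control and automaticity align).
    p_exclusion: probability of the stereotype-consistent response when it
                 opposes the intended response (automaticity despite control).
    """
    controlled = p_inclusion - p_exclusion          # C = I - E
    if controlled >= 1.0:
        return controlled, float("nan")             # A undefined at C = 1
    automatic = p_exclusion / (1.0 - controlled)    # A = E / (1 - C)
    return controlled, automatic

c, a = pdp_estimates(0.80, 0.30)  # hypothetical probabilities
print(c, a)  # approximately 0.5 and 0.6
```

Independent effects of the two estimates, as the abstract reports, are what license the dual-processing interpretation of the stereotype.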
Project description: To investigate emotional conflict processing in individuals with different levels of social adjustment, we developed an event-related potential (ERP) study using positive words (happy), negative words (disgusted), and positive and negative faces as materials in a face-word Stroop emotional conflict task completed by 34 participants. For the N2 component, there was a significant difference between the high and low social adjustment groups in the congruent condition, with the low social adjustment group showing a more negative amplitude. In the incongruent condition, the group difference was marginally significant, again with the low social adjustment group showing a more negative amplitude. For the SP component, neither group showed a significant difference between the congruent and incongruent conditions of emotional conflict; however, within the low social adjustment group, the incongruent condition evoked a more positive amplitude. Our findings indicate that the difference in emotional conflict processing between individuals with high and low social adjustment lies mainly in the early processing stages of emotional information. That is, for both congruent and incongruent emotional stimuli, individuals with high social adjustment showed better emotional conflict monitoring, used fewer cognitive resources, and had a higher degree of automated processing than those with low social adjustment. During the later stages of emotional conflict processing, individuals with low social adjustment showed poorer conflict processing.
Project description: Although schizophrenia (SCZ) and autism spectrum disorder (ASD) share impairments in emotion recognition, the mechanisms underlying these impairments may differ. The current study used the novel "Emotions in Context" task to examine how the interpretation and visual inspection of facial affect is modulated by congruent and incongruent emotional contexts in SCZ and ASD. Both adults with SCZ (n = 44) and those with ASD (n = 21) exhibited reduced affect recognition relative to typically developing (TD) controls (n = 39) when faces were integrated within broader emotional scenes but not when they were presented in isolation, underscoring the importance of using stimuli that better approximate real-world contexts. Additionally, viewing faces within congruent emotional scenes improved accuracy and visual attention to the face more for controls than for the clinical groups, suggesting that individuals with SCZ and ASD may not benefit from complementary emotional information as readily as controls. Despite these similarities, important distinctions between SCZ and ASD were found. In every condition, IQ was related to emotion-recognition accuracy for the SCZ group but not for the ASD or TD groups. Further, only the ASD group failed to increase their visual attention to faces in incongruent emotional scenes, suggesting a lower reliance on facial information within ambiguous emotional contexts relative to congruent ones. Collectively, these findings highlight both shared and distinct social cognitive processes in SCZ and ASD that may contribute to their characteristic social disabilities.
Project description: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimulus presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults in three age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) provided multidimensional emotion ratings of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. Older adults in particular were more likely to perceive happiness in faces when in a positive mood and less likely to do so when in a negative mood; this did not apply to younger adults. The temporal sequence of study sessions and stimulus presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.
Project description: This study investigated neural processing interactions during Stroop interference by varying the temporal separation of the relevant and irrelevant features of congruent, neutral, and incongruent colored-bar/color-word stimulus components. High-density event-related potentials (ERPs) and behavioral performance were measured as participants reported the bar color as quickly as possible while ignoring the color words. The task-irrelevant color words could appear at one of five stimulus onset asynchronies (SOAs) relative to the task-relevant bar-color onset: -200 or -100 ms before, +100 or +200 ms after, or simultaneously. Incongruent relative to congruent presentations elicited slower reaction times and higher error rates (with neutral in between), and ERP difference waves containing both an early, negative-polarity, central-parietal deflection and a later, more left-sided, positive-polarity component. These congruency-related differences interacted with SOA, showing the greatest behavioral and electrophysiological effects when irrelevant stimulus information preceded the task-relevant target and reduced effects when it followed the target. We interpret these data as reflecting two separate processes: (1) a 'priming influence' that enhances the magnitude of conflict-related facilitation and interference when a task-relevant target is preceded by an irrelevant distractor, and (2) a reduced 'backward influence' of stimulus conflict when the irrelevant distractor information follows the task-relevant target.
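The congruency difference waves described above are formed by subtracting the average congruent waveform from the average incongruent waveform, separately for each SOA. A minimal sketch with fabricated averaged amplitudes (real ERPs would come from many epoched EEG trials per condition):

```python
# Sketch of a congruency ERP difference wave: average incongruent waveform
# minus average congruent waveform at one electrode, for one SOA condition.
# Amplitudes (microvolts) over four time samples are fabricated.

congruent_avg   = [0.2, 0.5, 1.0, 0.8]
incongruent_avg = [0.2, 0.1, 0.2, 0.9]

difference_wave = [i - c for i, c in zip(incongruent_avg, congruent_avg)]
print(difference_wave)  # negative deflection mid-epoch, as in the early
                        # central-parietal effect described above
```

Repeating this subtraction at each SOA and comparing the resulting waves is what reveals the congruency-by-SOA interaction the abstract reports.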
Project description: Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. The CI groups differed in deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD), but not early deaf (ED) or late deaf (LD), CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed less efficiently overall than their respective controls and experienced greater interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces, and the CD CI users showed an analogous trend. In the Face task, recognition efficiency did not differ between CI users and controls. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody, they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences.
Project description: The androgen derivative androstadienone is found in human sweat and is thus a putative human chemosignal. Androstadienone has been studied with respect to its effects on mood states, attractiveness ratings, and physiological and neural activation. With the current experiment, we aimed to explore how androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether any effects depend on specific emotions, the participants' sex, and individual sensitivity to the smell of androstadienone. To do so, we investigated 56 healthy individuals (including 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone and once under placebo exposure, in pseudorandomized order). An emotional dot-probe task measured visuo-spatial cueing, while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task-, and emotion-specific manner, as a reduction in interference in the emotional Stroop task was apparent only for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times between congruent and incongruent trials. Women were also slightly affected by smelling androstadienone, as they classified angry faces correctly more often under androstadienone. For the emotional dot-probe task, no modulation by androstadienone was observed. Furthermore, in both attention paradigms, individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men or women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while visuo-spatial cueing remains unaffected.
Project description: Among nicotine-dependent smokers, smoking abstinence disrupts multiple cognitive and affective processes including conflict resolution and emotional information processing (EIP). However, the neurobiological basis of abstinence effects on resolving emotional interference on cognition remains largely uncharacterized. In this study, functional magnetic resonance imaging (fMRI) was used to investigate smoking abstinence effects on emotion-cognition interactions. Smokers (n = 17) underwent fMRI while performing an affective Stroop task (aST) over two sessions: once following 24-h abstinence and once following smoking as usual. The aST includes trials that serially present incongruent or congruent numerical grids bracketed by neutral or negative emotional distractors, and view-only emotional image trials. Statistical analyses were conducted using a statistical threshold of p < 0.05, cluster corrected. Smoking abstinence increased the Stroop blood-oxygenation-level-dependent response in the right middle frontal and rostral anterior cingulate gyri. Moreover, withdrawal-induced negative affect was associated with less activation in frontoparietal regions during negative emotional information processing; whereas, during Stroop trials, negative affect predicted greater activation in frontal regions during negative, but not neutral, emotional distractor trials. Hyperactivation in the frontal executive control network during smoking abstinence may represent a need to recruit additional executive resources to meet task demands. Moreover, abstinence-induced negative affect may disrupt cognitive control neural circuitry during EIP and place additional demands on frontal executive neural resources when cognitively demanding tasks are paired with emotionally distracting stimuli.
Project description: Unconscious processes are often assumed to be immune to attentional influence. Recent behavioral studies suggest, however, that the processing of subliminal information can be influenced by temporal attention. To examine the neural mechanisms underlying these effects, we used a stringent masking paradigm together with fMRI to investigate how temporal attention modulates the processing of unseen (masked) faces. Participants performed a gender decision task on a visible neutral target face, preceded by a masked prime face that could vary in gender (same as or different from the target) and emotional expression (neutral or fearful). We manipulated temporal attention by instructing participants to expect targets to appear either early or late in the stimulus sequence. Orienting temporal attention to subliminal primes influenced response priming by masked faces, even when gender was incongruent. In addition, gender-congruent primes facilitated responses regardless of attention, while gender-incongruent primes reduced accuracy when attended. Emotion produced no differential effects. At the neural level, incongruent and temporally unexpected primes increased brain responses in regions of the fronto-parietal attention network, reflecting greater recruitment of executive control and reorienting processes. Congruent and expected primes produced higher activations in fusiform cortex, presumably reflecting facilitation of perceptual processing. These results indicate that temporal attention can influence subliminal processing of face features and thus facilitate information integration according to task relevance, regardless of conscious awareness. They also suggest that task-congruent information between prime and target may facilitate response priming even when temporal attention is not selectively oriented to the prime onset time.
Project description: According to embodied cognition accounts, viewing others' facial emotion can elicit the corresponding emotion representation in observers, entailing simulations of sensory, motor, and contextual experiences. In line with this, published research has found that viewing others' facial emotion elicits automatic matched facial muscle activation, which in turn facilitates emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage, whereas conflicting sensory information, i.e., incongruent facial muscle activity, might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) conditions (a) and (b) would result in greater facial muscle activity than (c), (2) condition (a) would increase emotion recognition accuracy from others' faces compared to (c), and (3) condition (b) would lower recognition accuracy for expressions with a salient facial feature in the lower, but not the upper, face area compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The order of the experimental conditions was counter-balanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower-face-region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed.