The effect of emotional information from eyes on empathy for pain: A subliminal ERP study.
ABSTRACT: Facial expressions are deeply tied to empathy, which plays an important role in social communication. The eye region is effective at conveying facial expressions, particularly fear and sadness. Furthermore, subliminal stimuli have been shown to influence human behavior. This research explored the effect of subliminal sad, fearful, and neutral emotions conveyed by the eye region on a viewer's empathy for pain using event-related potentials (ERPs). The experiment used an emotional priming paradigm with a 3 (prime: subliminal neutral, sad, or fearful eye-region information) × 2 (target: painful or nonpainful pictures) within-subject design. Participants were asked to judge whether the targets were in pain. Results showed that the subliminal sad eye stimulus elicited a larger P2 amplitude than the subliminal fearful eye stimulus during pain assessment. For the P3 and the late positive component (LPC), painful pictures elicited larger amplitudes than nonpainful pictures. The behavioral results demonstrated that people responded more slowly to targets depicting pain after sad-emotion priming. Moreover, subjective ratings of Personal Distress (PD, one of the dimensions of the Chinese version of the Interpersonal Reactivity Index) predicted the pain effect in empathic neural responses in the N1 and N2 time windows. The current study showed that subliminal eye-region emotion affected the viewer's empathy for pain: compared with the subliminal fearful eye stimulus, the subliminal sad eye stimulus had a greater impact, and pain was processed more deeply in the late, controlled processing stage.
Project description:Previous research has demonstrated that patients with borderline personality disorder (BPD) are more sensitive to negative emotions and often show poor cognitive empathy, yet preserved or even superior emotional empathy. However, little is known about the neural correlates of empathy in BPD. Here, we examined empathy for pain in 20 patients with BPD and 19 healthy controls (HC) in a functional magnetic resonance imaging (fMRI) study, using an empathy-for-pain paradigm in which facial emotions were shown prior to hands exposed to painful stimuli. We found selectively enhanced activation of the right supramarginal gyrus for painful hand pictures following painful facial expressions in BPD patients, and lower activation for nonpainful pictures following angry expressions. Patients with BPD showed less activation in the left supramarginal gyrus than HC when viewing angry facial expressions, independent of the pain condition. Moreover, we found differential activation of the left anterior insula, depending on the preceding facial expression, exclusively in patients. The findings suggest that in patients with BPD, empathy for pain is selectively enhanced depending on the emotional context information. Another preliminary finding was an attenuated response to emotions in patients receiving psychotropic medication compared to unmedicated patients; these effects need to be replicated in larger samples. Together, increased activation during the observation of painful facial expressions seems to reflect emotional hypersensitivity in BPD.
Project description:The frontoinsular cortex (FI) and the anterior cingulate cortex (ACC) are thought to be involved in empathy for others' pain. However, the functional roles of FI and ACC in empathetic responses have not yet been clearly dissociated in previous studies. In this study, participants viewed color photographs depicting human body parts in painful or nonpainful situations and performed either pain judgment (painful/nonpainful) or laterality judgment (left/right) of the body parts. We found that activation of FI, rather than ACC, showed significant increase for painful compared with nonpainful images, regardless of the task requirement. Our data suggest a clear functional dissociation between FI and ACC in which FI is more domain-specific than ACC when processing empathy for pain.
Project description:The ventromedial prefrontal cortex (vmPFC) is known to be specifically involved in the processing of stimuli with pleasant, rewarding meaning to the observer. Using non-invasive transcranial direct current stimulation (tDCS), it was previously possible to show evidence for this valence specificity and to modulate the impact of the vmPFC on emotional network processing. Prior results showed increased neural activation during pleasant relative to unpleasant stimulus processing after excitatory compared to inhibitory vmPFC-tDCS. As dysfunctional vmPFC activation patterns are associated with major depressive disorder (MDD), tDCS of this region could be an attractive application in future therapy. Here, we investigated vmPFC-tDCS effects on sad compared to happy face processing, as sad faces are often used in the study of mood disorders. After counterbalanced inhibitory or excitatory tDCS, healthy participants viewed happy and sad faces during magnetoencephalography (MEG) recording. In addition, we investigated tDCS effects on an interpretational bias toward ambiguous happy-sad face morphs and on an attentional bias in a dot-probe task with happy and sad faces as emotional primes. Finally, in conjoint analyses with data from a previous sibling study (happy and fearful faces), we examined whether excitatory vmPFC-tDCS would reveal a general increase in the processing of pleasant stimuli, independent of the type of unpleasant stimuli applied (sad vs. fearful faces). MEG and behavioral results showed that happy faces promoted a relative positivity bias after excitatory compared to inhibitory tDCS, visible in the left orbitofrontal cortex and in the emotion-primed dot-probe task. A converse pattern in the MEG data during sad face processing suggests the possible involvement of an empathy network and differed significantly from the neural processing of fearful faces.
Implications of vmPFC modulation for emotional face processing, and the impact of specific unpleasant facial expressions, are discussed.
Project description:Both cognitive and affective processes require mental resources. However, it remains unclear whether these 2 processes work in parallel or in an integrated fashion. In this functional magnetic resonance imaging study, we investigated their interaction using an empathy-for-pain paradigm, with simultaneous manipulation of cognitive demand of the tasks and emotional valence of the stimuli. Eighteen healthy adult participants viewed photographs showing other people's hands and feet in painful or nonpainful situations while performing tasks of low (body part judgment) and high (laterality judgment) cognitive demand. Behavioral data showed increased reaction times and error rates for painful compared with nonpainful stimuli under laterality judgment relative to body part judgment, indicating an interaction between cognitive demand and stimulus valence. Imaging analyses showed activity in bilateral anterior insula (AI) and primary somatosensory cortex (SI), but not posterior insula, for main effects of cognitive demand and stimulus valence. Importantly, cognitive demand and stimulus valence showed a significant interaction in AI, SI, and regions of the frontoparietal network. These results suggest that cognitive and emotional processes at least partially share common brain networks and that AI might serve as a key node in a brain network subserving cognition-emotion integration.
Project description:One of the most robust effects of intranasal oxytocin treatment is its enhancement of emotional empathy responses across cultures to individuals displaying emotions in realistic contexts in the Multifaceted Empathy Task (MET). However, it is not established whether this effect of oxytocin on emotional empathy is due to altered visual attention toward different components of the stimulus pictures or to an enhanced empathic response. In the current randomized placebo-controlled within-subject experiment on 40 healthy male individuals, we both attempted a further replication of emotional empathy enhancement by intranasal oxytocin (24 IU) and used eye-tracking measures to determine whether this enhancement was associated with altered visual attention toward different components of the picture stimuli (background context, human face, and body posture). Results replicated previous findings of enhanced emotional empathy in response to both negative and positive stimuli and showed that this was associated with an increased proportion of time viewing the faces of the humans in the pictures and a corresponding decrease in time viewing the rest of the body and/or background context. Overall, our findings suggest that enhanced emotional empathy following oxytocin administration is due to increased attention to the faces of others displaying emotions and away from other contextual and social cues. Clinical Trial Registration: www.ClinicalTrials.gov Oxytocin Modulates Eye Gaze Behavior During Social Processing; registration ID: NCT03293511; URL: https://clinicaltrials.gov/ct2/show/NCT03293511.
Project description:Regions of the brain network activated by painful stimuli are also activated by nonpainful and even nonsomatosensory stimuli. We therefore analyzed where the qualitative change from nonpainful to painful perception at the pain threshold is coded. Noxious stimuli of gaseous carbon dioxide (n = 50) were applied to the nasal mucosa of 24 healthy volunteers at various concentrations, from 10% below to 10% above the individual pain threshold. Functional magnetic resonance images showed that these trigeminal stimuli activated brain regions regarded as the "pain matrix." However, most of these activations, including the posterior insula, the primary and secondary somatosensory cortex, the amygdala, and the middle cingulate cortex, were associated with quantitative changes in stimulus intensity and did not exclusively reflect the qualitative change from nonpainful to painful. After subtracting brain activations associated with quantitative changes in the stimuli, the qualitative change, reflecting pain-exclusive activations, could be localized mainly in the posterior insular cortex. This shows that cerebral processing of noxious stimuli focuses predominantly on the quantitative properties of stimulus intensity in both their sensory and affective dimensions, whereas the integration of this information into the perception of pain is restricted to a small part of the pain matrix.
Project description:Empathy is a mental ability that allows one person to understand the mental and emotional state of another and determines how to respond effectively to that person. When a person receives cues that another person is in pain, neural pain circuits within the brain are activated. Studies have shown that, compared with non-medical staff, medical practitioners show lower empathy for pain in medical scenarios, but the mechanism of this phenomenon remains in dispute. This work investigates whether the neural correlates of empathic processing of pain are altered by professional medical knowledge. The participants were 16 medical students enrolled at a Chinese medical college and 16 non-medical students enrolled at a normal university. Participants were scanned by functional near-infrared spectroscopy while watching pictures of medical scenarios depicting either painful or neutral situations. Subjects were asked to evaluate the pain intensity supposedly felt by the model in the stimulus displays, and the Interpersonal Reactivity Index-C (IRI-C) questionnaire was used to measure the empathic ability of participants. The results showed no significant difference in IRI-C questionnaire scores between medical and non-medical subjects. However, medical subjects rated the pain depicted in the medical pictures significantly lower than non-medical subjects did. The activation areas in non-medical subjects were mainly located in the dorsolateral prefrontal cortex, frontal polar regions, posterior part of the inferior frontal gyrus, supramarginal gyrus, supplementary somatosensory cortex, and angular gyrus, whereas medical subjects showed a wide range of activation across the prefrontal region in addition to the somatosensory cortex. These results indicate that the processing of pain empathy in medical settings is influenced by professional medical knowledge.
Project description:Prior research using static facial stimuli (photographs) has identified diagnostic face regions (i.e., functional for recognition) of emotional expressions. In the current study, we aimed to determine attentional orienting, engagement, and time course of fixation on diagnostic regions. To this end, we assessed the eye movements of observers inspecting dynamic expressions that changed from a neutral to an emotional face. A new stimulus set (KDEF-dyn) was developed, which comprises 240 video-clips of 40 human models portraying six basic emotions (happy, sad, angry, fearful, disgusted, and surprised). For validation purposes, 72 observers categorized the expressions while gaze behavior was measured (probability of first fixation, entry time, gaze duration, and number of fixations). Specific visual scanpath profiles characterized each emotional expression: The eye region was looked at earlier and longer for angry and sad faces; the mouth region, for happy faces; and the nose/cheek region, for disgusted faces; the eye and the mouth regions attracted attention in a more balanced manner for surprise and fear. These profiles reflected enhanced selective attention to expression-specific diagnostic face regions. The KDEF-dyn stimuli and the validation data will be available to the scientific community as a useful tool for research on emotional facial expression processing.
Project description:Observation of others in pain induces a positive elevation (pain effect) in late event-related potentials (ERPs). This effect is associated with top-down attention-regulating processes. It has previously been shown that stimulus exposure duration can affect top-down attentional modulation of responses to threat-related stimuli. We investigated the effect of exposure duration on the ERP response to others in pain. Two late ERP components, the P3 and the late positive potential (LPP), were measured in 18 healthy people while they viewed pictures of hands in painful or neutral situations for either 200 or 500 ms, under two task conditions (pain judgment and counting hands). P3 and LPP pain effects during the pain judgment condition were significantly greater with 500 ms than with 200 ms stimulus presentation. Ours is the first study to suggest that engagement of the empathy-related self-regulatory processes reflected in late potentials requires longer exposure to the pain-related stimulus. Although this is important information about the relationship between early sensory and subsequent brain processing, and about the engagement of self-regulatory processes, the neural basis of this time dependence remains unclear. It may be important to investigate the relationship between stimulus duration and empathic response in clinical populations where issues of self-regulation, empathic response, and speed of information processing exist.
Project description:Past studies have found asymmetry biases in human emotion recognition. The left-side bias refers to preferential looking at the left hemiface when actively exploring face images. However, these studies have mainly been conducted with static, frontally oriented stimuli, whereas real-life emotion recognition takes place on dynamic faces viewed from different angles. The aim of this study was to assess the judgment of genuine vs. masked expressions in dynamic movie clips of faces rotated to the right or left side. Forty-eight participants judged the expressions on faces displaying genuine or masked happy, sad, and fearful emotions. The head of the actor was either rotated to the left by a 45° angle, showing the left side of the face (standard orientation), or inverted, with the same face shown from the right-side perspective. Eye movements were registered with an eye tracker, and the data were analyzed for the inverse efficiency score (IES), the number of fixations, and gaze time on the whole face and in the regions of interest. Results showed shorter IESs and gaze times for happy compared to sad and fearful emotions, but no difference on these variables between sad and fearful emotions. The left-side preference was evident in comparisons of the number of fixations: standard stimuli received a higher number of fixations than inverted ones. However, gaze time was longer on inverted than on standard faces. The number of fixations on the exposed hemiface interacted with emotion, decreasing from happy to sad and fearful faces; an opposite pattern was found for the occluded hemiface. These results suggest a change in fixation patterns for rotated faces that may be beneficial for the judgment of expressions. Furthermore, this study replicated the effects of judging genuine and masked emotions using dynamic faces.