Ventromedial frontal lobe damage affects interpretation, not exploration, of emotional facial expressions.
ABSTRACT: Recognizing and distinguishing the emotional states of those around us is crucial for adaptive social behavior. Previous work has shown that damage to the ventromedial frontal lobe (VMF) impairs recognition of subtle emotional facial expressions and alters fixation patterns to face stimuli. However, whether this reflects a deficit in acquiring or in interpreting facial expression information remains unclear. We tested 37 patients with frontal lobe damage, including 17 with VMF lesions, in a series of emotion recognition tasks with different gaze manipulations. Subjects rated neutral, subtle, and extreme emotional expressions while freely examining faces, while instructed to look only at the eyes, and in a gaze-contingent condition that required top-down direction of eye movements to reveal the stimulus. People with VMF damage were worse at detecting subtle disgust during free viewing and confused extreme emotional expressions more than healthy controls did. However, fixation patterns did not differ systematically between groups during free or gaze-contingent viewing. Moreover, instructing subjects to fixate only the eyes did not improve the performance of VMF-damaged subjects. These data argue that the VMF is not necessary for normal fixation of emotional face stimuli, and that impairments in emotion recognition after VMF damage do not stem from impaired information gathering, as indexed by patterns of fixation.
Project description: Responding appropriately to companions' emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore whether domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs' gaze fixation distribution among the facial features (eyes, midface, and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant, and neutral). We found that dogs' gaze fixations spread systematically among the facial features. The distribution of fixations was altered by the viewed expression, but the eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. Examining the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by the eyes, midface, and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias that depended on the depicted species: threatening conspecific faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates.
Project description: Prior research using static facial stimuli (photographs) has identified the diagnostic face regions (i.e., those functional for recognition) of emotional expressions. In the current study, we aimed to determine attentional orienting, engagement, and the time course of fixation on these diagnostic regions. To this end, we assessed the eye movements of observers inspecting dynamic expressions that changed from a neutral to an emotional face. A new stimulus set (KDEF-dyn) was developed, comprising 240 video clips of 40 human models portraying six basic emotions (happy, sad, angry, fearful, disgusted, and surprised). For validation purposes, 72 observers categorized the expressions while their gaze behavior was measured (probability of first fixation, entry time, gaze duration, and number of fixations). Specific visual scanpath profiles characterized each emotional expression: the eye region was looked at earlier and longer for angry and sad faces; the mouth region, for happy faces; and the nose/cheek region, for disgusted faces. For surprised and fearful faces, the eye and mouth regions attracted attention in a more balanced manner. These profiles reflect enhanced selective attention to expression-specific diagnostic face regions. The KDEF-dyn stimuli and the validation data will be available to the scientific community as a useful tool for research on emotional facial expression processing.
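To make the four validation measures concrete, here is a minimal sketch of how they could be computed from a trial's fixation list and a rectangular area of interest (AOI). The data structures and function names are illustrative assumptions, not the KDEF-dyn validation pipeline itself.

```python
# Illustrative sketch: per-AOI gaze measures of the kind reported above
# (probability of first fixation, entry time, gaze duration, number of
# fixations). The fixation and AOI formats are assumptions.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # horizontal gaze position in pixels
    y: float          # vertical gaze position in pixels
    start_ms: float   # onset relative to stimulus onset
    end_ms: float     # offset relative to stimulus onset

def in_aoi(fix, aoi):
    """aoi = (left, top, right, bottom) in pixels."""
    left, top, right, bottom = aoi
    return left <= fix.x <= right and top <= fix.y <= bottom

def aoi_metrics(fixations, aoi):
    """Return the four validation measures for one AOI on one trial."""
    hits = [f for f in fixations if in_aoi(f, aoi)]
    return {
        # Did the very first fixation of the trial land in this AOI?
        "first_fixation": bool(fixations) and in_aoi(fixations[0], aoi),
        # Entry time: onset of the first fixation inside the AOI (None if never entered).
        "entry_time_ms": hits[0].start_ms if hits else None,
        # Gaze duration: summed duration of all fixations inside the AOI.
        "gaze_duration_ms": sum(f.end_ms - f.start_ms for f in hits),
        "n_fixations": len(hits),
    }
```

Averaging `first_fixation` over trials would then yield the probability of first fixation for that AOI.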
Project description: Autism Spectrum Disorder (ASD), Oppositional Defiant Disorder (ODD), and Conduct Disorder (CD) are often associated with emotion recognition difficulties. This is the first eye-tracking study to examine emotional face recognition (i.e., gazing behavior) in a direct comparison of male adolescents with ASD or ODD/CD and typically developing (TD) individuals. We also investigated the role of psychopathic traits, callous-unemotional (CU) traits, and subtypes of aggressive behavior in emotional face recognition. A total of 122 male adolescents (n = 50 ASD, n = 44 ODD/CD, and n = 28 TD) aged 12-19 years (M = 15.4 years, SD = 1.9) took part in the eye-tracking experiment. Participants were presented with neutral and emotional faces while a Tobii 1750 eye-tracking monitor recorded their gaze behavior. Our main dependent eye-tracking variables were (1) fixation duration on the eyes of a face and (2) time to first fixation on the eyes. Since the distributions of the eye-tracking variables were not completely Gaussian, non-parametric tests were chosen to investigate gaze behavior across the ASD, ODD/CD, and TD groups. Furthermore, we used Spearman correlations to investigate the links with psychopathic traits, CU traits, and subtypes of aggression as assessed by questionnaires. The relative total fixation duration on the eyes was decreased in both the ASD and ODD/CD groups for several emotional expressions. In both groups, an increased time to first fixation on the eyes was nominally significant for fearful faces only. The time to first fixation on the eyes was nominally correlated with psychopathic traits and proactive aggression. The current findings do not support strong claims of differential cross-disorder eye-gazing deficits or of a role for shared underlying psychopathic traits, CU traits, and aggression subtypes. Our data provide valuable and novel insights into gaze timing distributions when looking at the eyes of a fearful face.
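As a rough illustration of the analysis approach described above, the sketch below runs a non-parametric Kruskal-Wallis test across three groups and a Spearman rank correlation against a questionnaire score. All data are simulated placeholders; only the group sizes mirror the study.

```python
# Hedged sketch of a non-parametric group comparison and trait correlation;
# variable names and data are hypothetical, not the study's dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Relative fixation duration on the eyes per participant (placeholder data).
asd = rng.uniform(0.2, 0.6, 50)
odd_cd = rng.uniform(0.2, 0.6, 44)
td = rng.uniform(0.3, 0.7, 28)

# Kruskal-Wallis: a non-parametric alternative to one-way ANOVA, suitable
# when the distributions of the dependent variable are non-Gaussian.
h, p = stats.kruskal(asd, odd_cd, td)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")

# Spearman rank correlation between time to first fixation on the eyes and
# a questionnaire-based psychopathic-traits score (placeholder values).
time_to_first_fix = rng.uniform(100, 800, 122)
psychopathic_traits = rng.uniform(0, 40, 122)
rho, p = stats.spearmanr(time_to_first_fix, psychopathic_traits)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```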
Project description: Social exclusion has many effects on individuals, including an increased need to belong and elevated sensitivity to social information. Using self-report measures and an eye-tracking technique, this study explored people's need to belong and attentional bias towards socio-emotional information (pictures of positive and negative facial expressions compared with emotionally neutral expressions) after a brief episode of social exclusion. We found that: (1) socially excluded individuals reported more negative emotions, fewer positive emotions, and a stronger need to belong than those who were not socially excluded; (2) compared with a control condition, social exclusion caused longer response times to probe dots after viewing positive or negative face images; (3) social exclusion resulted in a higher frequency of first attentional fixations on both positive and negative emotional facial pictures (but not on the neutral pictures) than the control condition; (4) in the social exclusion condition, participants showed shorter first fixation latencies and longer first fixation durations for positive pictures than for neutral ones, but this effect was not observed for negative pictures; (5) participants who experienced social exclusion also showed longer gaze durations on the positive pictures than those who did not; although group differences also existed for the negative pictures, the gaze duration bias in both groups did not differ from chance. This study demonstrated the emotional response to social exclusion and characterised multiple eye-movement indicators of attentional bias after social exclusion.
Project description: Previous studies have shown an attentional bias towards social features during free viewing of naturalistic scenes. This social attention seems to be reflexive and able to defy top-down demands in the form of explicit search tasks. However, the question remains whether social features continue to be prioritized when peripheral information is limited, thereby reducing the influence of bottom-up image information on gaze orienting. We therefore established a gaze-contingent viewing paradigm in which the visual field was constrained and updated in response to the viewer's eye movements. Participants viewed social and non-social images that were randomly allocated to a free and a gaze-contingent viewing condition while their eye movements were tracked. Our results revealed a strong attentional bias towards social features in both conditions. However, gaze-contingent viewing altered the temporal and spatial dynamics of viewing behavior. Additionally, recurrent fixations were more frequent and closer together in time for social compared with non-social stimuli in both viewing conditions. Taken together, this study suggests a predominant selection of social features when bottom-up influences are diminished and a general influence of social content on visual exploratory behavior, thus highlighting mechanisms of social attention.
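A minimal sketch of the gaze-contingent logic is given below, assuming gaze samples arrive as pixel coordinates: on each new sample, the stimulus is visible only inside an aperture centred on gaze, and the periphery is masked. The eye-tracker interface is a placeholder; a real experiment would couple a stimulus-presentation toolbox (e.g., PsychoPy) to a tracker's streaming API.

```python
# Sketch of constraining the visual field contingent on gaze. The aperture
# radius and grey background are illustrative assumptions.
import numpy as np

def gaze_contingent_frame(stimulus, gaze_xy, radius=80, background=128):
    """Return the frame to display for one gaze sample.

    stimulus: H x W x 3 uint8 image; gaze_xy: (x, y) gaze position in pixels.
    """
    h, w = stimulus.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    gx, gy = gaze_xy
    # Circular aperture around the current fixation position.
    visible = (xx - gx) ** 2 + (yy - gy) ** 2 <= radius ** 2
    frame = np.full_like(stimulus, background)   # masked periphery
    frame[visible] = stimulus[visible]           # revealed region around gaze
    return frame
```

In the display loop, this function would be called on every gaze sample and the returned frame drawn to the screen, so the visible window follows the eyes.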
Project description: Facial expressions are a core component of the emotional response of social mammals. In contrast to Darwin's original proposition, expressive facial cues of emotion appear to have evolved to be species-specific. Faces trigger an automatic perceptual process, and so inter-specific emotion perception is potentially a challenge, since observers should not "read" heterospecific facial expressions in the same way that they do conspecific ones. Using dynamic spontaneous facial expression stimuli, we report the first inter-species eye-tracking study of fully unrestrained participants, conducted without pre-experiment training to maintain attention to the stimuli, comparing how two species living in the same ecological niche, humans and dogs, perceive each other's facial expressions of emotion. Humans and dogs showed different gaze distributions when viewing the same facial expressions of either humans or dogs. Humans modulated their gaze depending on the area of interest (AOI) being examined, the emotion, and the species observed, whereas dogs modulated their gaze depending on AOI only. We also analysed whether the gaze distribution was random across AOIs in both species: in humans, eye movements were not correlated with the diagnostic facial movements occurring in the emotional expression, and in dogs there was only a partial relationship. This suggests that the scanning of facial expressions is a relatively automatic process. Thus, to read other species' facial emotions successfully, individuals must overcome these automatic perceptual processes and employ learning strategies to appreciate the inter-species emotional repertoire.
Project description: Skilled reading requires information processing of both the fixated and the not-yet-fixated words to generate precise control of gaze. Over the last 30 years, experimental research has provided evidence that word processing is distributed across the perceptual span, which permits recognition of the fixated (foveal) word as well as preview of parafoveal words to the right of fixation. However, theoretical models have been unable to differentiate the specific influences of foveal and parafoveal information on saccade control. Here we show, in a computational model that reproduces experimental results, how parafoveal word difficulty modulates the spatial and temporal control of gaze. In a fully Bayesian framework, we estimated model parameters for different models of parafoveal processing and carried out large-scale predictive simulations and model comparisons for a gaze-contingent reading experiment. We conclude that mathematical modeling of data from gaze-contingent experiments permits the precise identification of the pathways from parafoveal information processing to gaze control, uncovering potential mechanisms underlying the parafoveal contribution to eye-movement control.
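The study's full models of eye-movement control are beyond a short example, but the following toy sketch conveys the core idea of Bayesian model comparison: two candidate models of (log) fixation durations, with and without a parafoveal-difficulty effect, are compared via marginal likelihoods computed on a parameter grid. All numbers are simulated placeholders, and the grid approximation stands in for the full Bayesian machinery.

```python
# Toy stand-in for Bayesian model comparison: does parafoveal word
# difficulty shift fixation durations? M0 = one shared mean; M1 = separate
# means per condition. Marginal likelihoods are computed on a grid.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
easy = rng.normal(5.3, 0.3, 200)   # log fixation durations, easy parafoveal word
hard = rng.normal(5.4, 0.3, 200)   # log fixation durations, hard parafoveal word

sigma = 0.3                         # assumed known for simplicity
mu_grid = np.linspace(4.5, 6.0, 301)
prior = stats.norm(5.4, 0.5).pdf(mu_grid)
prior /= prior.sum()                # discrete prior over the grid

def marginal_loglik(data):
    """log p(data): likelihood averaged over the grid prior on the mean."""
    loglik = stats.norm(mu_grid[:, None], sigma).logpdf(data).sum(axis=1)
    m = loglik.max()                # log-sum-exp for numerical stability
    return np.log(np.sum(np.exp(loglik - m) * prior)) + m

log_m0 = marginal_loglik(np.concatenate([easy, hard]))   # shared mean
log_m1 = marginal_loglik(easy) + marginal_loglik(hard)   # condition effect
print(f"log Bayes factor (M1 vs M0): {log_m1 - log_m0:.1f}")
```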
Project description: Individual genetic differences in the serotonin transporter-linked polymorphic region (5-HTTLPR) have been associated with variations in sensitivity to social and emotional cues as well as altered amygdala reactivity to facial expressions of emotion. Amygdala activation has further been shown to trigger gaze changes toward diagnostically relevant facial features. The current study examined whether altered socio-emotional reactivity in variants of the 5-HTTLPR promoter polymorphism reflects individual differences in attending to the diagnostic features of facial expressions. For this purpose, visual exploration of emotional facial expressions was compared between a low (n = 39) and a high (n = 40) 5-HTT-expressing group of healthy human volunteers in an eye-tracking paradigm. Emotional faces were presented while the initial fixation was manipulated such that saccadic changes toward the eyes and toward the mouth could be identified. We found that the low, relative to the high, 5-HTT group demonstrated greater accuracy in emotion classification, particularly when faces were presented for a longer duration. No group differences in gaze orientation toward diagnostic facial features were observed. However, participants in the low 5-HTT group exhibited more and faster fixation changes for certain emotions when faces were presented for a longer duration, and overall face fixation times were reduced in this genotype group. These results suggest that the 5-HTT gene influences social perception by modulating general vigilance to social cues rather than by selectively affecting the pre-attentive detection of diagnostic facial features.
Project description: The current research explored toddlers' gaze fixation during a scene in which a person expresses sadness after her ball is stolen. The relation between the duration of gaze fixation on different parts of the person's sad face (e.g., eyes, mouth) and theory-of-mind skills was examined. Eye-tracking data indicated that before the actor experienced the negative event, toddlers divided their fixation equally between the actor's happy face and other distracting objects, but looked longer at the face after the ball was stolen and she expressed sadness. The strongest predictor of increased focus on the sad face versus other elements of the scene was toddlers' ability to predict others' emotional reactions when outcomes fulfilled (happiness) or failed to fulfill (sadness) desires, whereas toddlers' visual perspective-taking skills predicted their more specific focus on the actor's eyes and, for boys only, the mouth. Furthermore, gender differences emerged in toddlers' fixation on parts of the scene. Taken together, these findings suggest that top-down processes are involved in toddlers' scanning of emotional facial expressions.
Project description: Background: The concept of alexithymia is characterized by difficulties identifying and describing one's emotions. Alexithymic individuals are impaired in the recognition of others' emotional facial expressions, and alexithymia is quite common in patients suffering from major depressive disorder. The face-in-the-crowd task is a visual search paradigm that assesses the processing of multiple facial emotions. In the present eye-tracking study, the relationship between alexithymia and visual processing of facial emotions was examined in clinical depression. Materials and Methods: Gaze behavior and manual response times of 20 alexithymic and 19 non-alexithymic depressed patients were compared in a face-in-the-crowd task. Alexithymia was measured via the 20-item Toronto Alexithymia Scale. Angry, happy, and neutral facial expressions of different individuals were shown as target and distractor stimuli. Our analyses of gaze behavior focused on latency to the target face, number of distractor faces fixated before fixating the target, number of target fixations, and number of distractor faces fixated after fixating the target. Results: Alexithymic patients exhibited generally slower decision latencies than non-alexithymic patients in the face-in-the-crowd task. The patient groups did not differ in latency to target, number of target fixations, or number of distractors fixated prior to target fixation. However, after having looked at the target, alexithymic patients fixated more distractors than non-alexithymic patients, regardless of expression condition. Discussion: According to our results, alexithymia is associated with impairments in the visual processing of multiple facial emotions in clinical depression. Alexithymia appears to be associated with delayed manual reaction times and prolonged scanning after the first target fixation in depression, but it might have no impact on the early search phase. The observed deficits could indicate difficulties in target identification and/or decision-making when processing multiple emotional facial expressions. The impairments of alexithymic depressed patients in processing emotions in crowds of faces do not seem to be limited to a specific affective valence. In group situations, alexithymic depressed patients might be slower than non-alexithymic depressed patients in processing interindividual differences in emotional expressions, which could represent a disadvantage in understanding non-verbal communication in groups.
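For concreteness, here is one way the four gaze measures from the face-in-the-crowd analysis could be computed from a time-ordered trial record. The trial encoding (a sequence of fixated items, with "T" for the target face and "D" for a distractor face) is an assumption for illustration, not the study's actual data format.

```python
# Illustrative computation of face-in-the-crowd search measures from one
# trial: latency to target, distractors fixated before the target, target
# fixations, and distractors fixated after the first target fixation.
def search_metrics(fixated_items, fixation_onsets_ms):
    """fixated_items: e.g. ["D", "D", "T", "D", "T"];
    fixation_onsets_ms: onsets aligned to display onset."""
    first_target = fixated_items.index("T") if "T" in fixated_items else None
    return {
        "latency_to_target_ms": (
            fixation_onsets_ms[first_target] if first_target is not None else None),
        "distractors_before_target": (
            fixated_items[:first_target].count("D") if first_target is not None else None),
        "target_fixations": fixated_items.count("T"),
        # The measure on which alexithymic patients differed: distractor
        # fixations after the target had first been fixated.
        "distractors_after_target": (
            fixated_items[first_target + 1:].count("D") if first_target is not None else None),
    }

print(search_metrics(["D", "D", "T", "D", "T"], [180, 420, 650, 900, 1150]))
```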