Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, decontextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is almost never perceived in isolation but appears within a situational context, which may arise from other people, from the physical environment surrounding the face, or from multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including previously acquired social information gained through affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim (1) to systematize the contextual variables that may influence the perception of facial expressions and (2) to summarize the experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that the perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion-extraction mechanisms predicted by basic emotion theories. Drawing on a recent model of face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues for future research.
Emotional facial expressions provide critical information for social interactions. Above all, angry faces are assumed to signal potential social threat. We investigated event-related potentials (ERPs) triggered by natural and artificial faces expressing fear, anger, happiness, or no emotion in participants with low and high levels of social anxiety. Overall, artificial faces elicited stronger P100 and N170 responses than natural faces. Additionally, the N170 component was larger for emotional compared to neutral facial expressions. Social anxiety was associated with an enhanced emotional modulation of the early posterior negativity (EPN) in response to fearful and angry facial expressions. Furthermore, while the late positive potential (LPP) was larger for emotional than for neutral faces in low socially anxious participants, the LPPs of highly socially anxious participants did not differ between emotional and neutral faces; LPPs may therefore be enhanced in highly socially anxious participants for both emotional and neutral faces. The modulations of the EPN and LPP were comparable between natural and artificial faces. These results indicate that social anxiety influences early perceptual processing of faces and that artificial faces are suitable for psychophysiological emotion research.
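To make the component analysis above concrete, the following sketch shows how mean ERP amplitudes are typically extracted in canonical time windows (P100, N170, EPN, LPP) and compared across expression conditions. It is a minimal illustration on simulated data; the sampling rate, time windows, and channel indices below are common conventions and assumptions, not the exact parameters of this study.

```python
# Minimal sketch (simulated data, not the authors' pipeline): extracting
# mean ERP amplitudes in canonical component windows and comparing them
# across expression conditions. Sampling rate, time windows, and channel
# indices are assumed conventions, not the study's parameters.
import numpy as np

SFREQ = 500                                   # sampling rate in Hz (assumed)
times = np.arange(-0.2, 0.8, 1 / SFREQ)       # epoch time axis (s)

# Synthetic epoched EEG: (n_trials, n_channels, n_samples) per condition
rng = np.random.default_rng(0)
epochs = {cond: rng.normal(0, 5, (40, 64, times.size))
          for cond in ("neutral", "fearful", "angry", "happy")}

# Canonical analysis windows (s) for the components reported above
WINDOWS = {"P100": (0.08, 0.12), "N170": (0.15, 0.19),
           "EPN": (0.20, 0.30), "LPP": (0.40, 0.70)}

OCCIPITOTEMPORAL = [55, 56, 57, 58]           # placeholder channel indices

def mean_amplitude(data, window, channels):
    """Average voltage across trials, channels, and samples in a window."""
    mask = (times >= window[0]) & (times <= window[1])
    return data[:, channels, :][:, :, mask].mean()

for component, window in WINDOWS.items():
    for cond, data in epochs.items():
        amp = mean_amplitude(data, window, OCCIPITOTEMPORAL)
        print(f"{component:5s} {cond:8s} {amp:+.2f} µV")
```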
Our first impression of others is strongly influenced by their facial appearance. However, the perception and evaluation of faces is not only guided by internal features such as facial expressions but also depends heavily on contextual information, such as secondhand information (verbal descriptions) about the target person. To investigate the time course of contextual influences on cortical face processing, event-related brain potentials (ERPs) were recorded in response to neutral faces preceded by brief verbal descriptions containing cues of affective valence (negative, neutral, positive) and self-reference (self-related vs. other-related). ERP analysis demonstrated that early and late stages of face processing are enhanced by negative, positive, and self-relevant descriptions, even though the faces themselves did not differ perceptually. Affective ratings of the faces confirmed these findings. Altogether, these results demonstrate for the first time, at both the electrocortical and the behavioral level, how contextual information modifies early visual perception in a top-down manner.
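As a rough illustration of the 3 (valence) × 2 (self-reference) within-subject design described above, the sketch below runs a repeated-measures ANOVA on simulated per-condition ERP amplitudes. The dependent variable, sample size, and effect sizes are assumptions for demonstration only; the AnovaRM call itself is standard statsmodels usage.

```python
# Hedged sketch of a 3 (valence) x 2 (self-reference) within-subject
# analysis on simulated ERP mean amplitudes; all values are assumptions.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = []
for subject in range(24):                      # assumed sample size
    for valence in ("negative", "neutral", "positive"):
        for reference in ("self", "other"):
            # Simulated amplitude with a small emotional-context effect
            boost = 1.5 if valence != "neutral" else 0.0
            rows.append({"subject": subject, "valence": valence,
                         "reference": reference,
                         "amplitude": rng.normal(boost, 1.0)})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA: valence, self-reference, and their interaction
res = AnovaRM(df, depvar="amplitude", subject="subject",
              within=["valence", "reference"]).fit()
print(res)
```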
Numerous studies have shown that humans automatically react with congruent facial reactions, i.e., facial mimicry, when seeing another person's facial expressions. The current experiment is the first to investigate the neuronal structures responsible for differences in the occurrence of such facial mimicry reactions by simultaneously measuring BOLD responses and facial EMG in an MRI scanner. To this end, 20 female students viewed emotional facial expressions (happy, sad, and angry) of male and female avatar characters. During picture presentation, the BOLD signal as well as M. zygomaticus major and M. corrugator supercilii activity were recorded simultaneously. After correction for MR-related artifacts, the results show prototypical patterns of facial mimicry: enhanced M. zygomaticus major activity in response to happy expressions and enhanced M. corrugator supercilii activity in response to sad and angry expressions. Regression analyses show that these congruent facial reactions correlate significantly with activations in the IFG, SMA, and cerebellum. Stronger zygomaticus reactions to happy faces were further associated with increased activity in the caudate, MTG, and PCC, whereas corrugator reactions to angry expressions were further correlated with activity in the hippocampus, insula, and STS. The results are discussed in relation to core and extended models of the mirror neuron system (MNS).
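The regression logic described above can be sketched as follows: per-participant mimicry strength (e.g., a baseline-corrected EMG difference score) is related to ROI-wise BOLD estimates. Everything below is simulated for illustration; the ROI labels come from the abstract, but the score definition and coupling strengths are assumptions.

```python
# Illustrative sketch (not the authors' pipeline): correlating facial
# mimicry strength (EMG difference scores) with ROI BOLD estimates.
# All numbers are simulated; the mimicry score definition is an assumption.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 20  # participants, matching the sample size reported above

# Zygomaticus mimicry to happy faces: stimulus activity minus baseline
zygo_mimicry = rng.normal(1.0, 0.5, n)

# Simulated ROI betas loosely coupled to mimicry strength (assumed values)
rois = {"IFG": 0.6, "SMA": 0.5, "cerebellum": 0.4}
for roi, coupling in rois.items():
    betas = coupling * zygo_mimicry + rng.normal(0, 0.4, n)
    r, p = pearsonr(zygo_mimicry, betas)
    print(f"{roi:10s} r = {r:+.2f}, p = {p:.3f}")
```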
Visual emotional stimuli evoke enhanced activation in early visual cortex areas, which may help organisms quickly detect biologically salient cues and initiate appropriate approach or avoidance behavior. Functional neuroimaging evidence for the modulation of other sensory modalities by emotion is scarce. The aim of the present study was therefore to test whether sensory facilitation by emotional cues can also be found in the auditory domain. We recorded auditory brain activation with functional near-infrared spectroscopy (fNIRS), a non-invasive and silent neuroimaging technique, while participants listened to standardized pleasant, unpleasant, and neutral sounds selected from the International Affective Digitized Sounds (IADS). Pleasant and unpleasant sounds led to increased auditory cortex activation compared to neutral sounds. This is the first study to suggest that the enhanced activation of sensory areas in response to complex emotional stimuli is not restricted to the visual domain but is also evident in the auditory domain.
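A simple way to picture the fNIRS analysis above is event-related block averaging of the oxygenated-hemoglobin (HbO) signal around sound onsets, as in this hedged sketch on simulated data; the sampling rate, averaging window, and onset times are assumptions, not the study's parameters.

```python
# Toy block-averaging sketch for event-related fNIRS: average HbO
# responses after sound onsets per valence category. Simulated data;
# sampling rate, window, and onsets are assumptions.
import numpy as np

FS = 10                       # fNIRS sampling rate in Hz (assumed)
WIN = (0, 12)                 # post-onset averaging window in seconds
rng = np.random.default_rng(3)

hbo = rng.normal(0, 0.1, 6000)            # one auditory channel, 10 minutes
onsets = {"pleasant": [30, 120, 210], "unpleasant": [60, 150, 240],
          "neutral": [90, 180, 270]}      # onset times in seconds (assumed)

def block_average(signal, onset_times):
    """Mean HbO across trials within the post-onset window."""
    lo, hi = WIN[0] * FS, WIN[1] * FS
    segments = [signal[int(t * FS) + lo : int(t * FS) + hi]
                for t in onset_times]
    return np.mean(segments)

for cond, ts in onsets.items():
    print(f"{cond:10s} mean HbO change: {block_average(hbo, ts):+.4f}")
```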
The hypervigilance-avoidance hypothesis assumes that anxious individuals initially attend to and subsequently avoid threatening stimuli. In this study, pairs of emotional (angry or happy) and neutral facial expressions were presented to students with high or low fear of negative evaluation (FNE) while their eye movements were recorded. High-FNE participants initially looked more often at emotional than at neutral faces, indicating an attentional bias for emotional facial expressions. This effect was further modulated by the sex of the face, as high-FNE participants showed a clear preference for happy female faces. Analysis of the time course of attention revealed that high-FNE participants looked longer at the emotional faces during the first second of stimulus exposure, whereas they avoided these faces in the subsequent interval from 1 to 1.5 s. These results partially support the hypervigilance-avoidance hypothesis and additionally indicate the relevance of happy faces for high-FNE individuals. Further research should clarify the meaning of happy facial expressions as well as the influence of the sex of the observed face in social anxiety.
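The two gaze measures reported above (the target of the first fixation, and dwell time on the emotional face in early vs. late intervals) can be computed from a list of fixations as in the following sketch; the fixation records and field names are illustrative assumptions.

```python
# Hedged sketch of the eye-tracking measures: first-fixation target and
# dwell time on the emotional face in the 0-1 s and 1-1.5 s intervals.
# The single trial below is fabricated for illustration.

# One trial: fixations with onset/offset (s) and the face they landed on
fixations = [
    {"start": 0.18, "end": 0.55, "roi": "emotional"},
    {"start": 0.60, "end": 1.05, "roi": "emotional"},
    {"start": 1.10, "end": 1.45, "roi": "neutral"},
]

def dwell(fixes, roi, window):
    """Summed overlap of fixations on `roi` with a time window (s)."""
    lo, hi = window
    return sum(max(0.0, min(f["end"], hi) - max(f["start"], lo))
               for f in fixes if f["roi"] == roi)

first_on_emotional = fixations[0]["roi"] == "emotional"
early = dwell(fixations, "emotional", (0.0, 1.0))   # hypervigilance phase
late = dwell(fixations, "emotional", (1.0, 1.5))    # avoidance phase
print(f"first fixation on emotional face: {first_on_emotional}")
print(f"dwell on emotional face, 0-1 s:   {early:.2f} s")
print(f"dwell on emotional face, 1-1.5 s: {late:.2f} s")
```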
Perception and evaluation of facial expressions are known to be heavily modulated by emotional features of contextual information. Such contextual effects, however, might also be driven by non-emotional aspects of contextual information, by an interaction of emotional and non-emotional factors, and by the observers' inherent traits. Therefore, we sought to assess whether contextual information about self-reference, in addition to information about valence, influences the evaluation and neural processing of neutral faces. Furthermore, we investigated whether social anxiety moderates these effects. In the present functional magnetic resonance imaging (fMRI) study, participants viewed neutral facial expressions preceded by a contextual sentence conveying either positive or negative evaluations about the participant or about somebody else. Contextual influences were reflected in rating and fMRI measures, with strong effects of self-reference on brain activity in the medial prefrontal cortex and right fusiform gyrus. Additionally, social anxiety strongly affected the response to faces conveying negative, self-related evaluations, as revealed by the participants' rating patterns and brain activity in cortical midline structures and regions of interest in the left and right middle frontal gyrus. These results suggest that face perception and processing are highly individual processes influenced by emotional and non-emotional aspects of contextual information and further modulated by individual personality traits.
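A minimal sketch of the moderation question above, assuming per-participant social anxiety scores and ROI beta estimates for negative, self-related trials: a simple linear regression tests whether the neural response scales with anxiety. All values are simulated; the ROI label, score range, and coupling are assumptions.

```python
# Hedged sketch: regressing the ROI response to negative, self-related
# context faces on social anxiety scores. Simulated data throughout.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(4)
n = 30                                         # assumed sample size
anxiety = rng.uniform(20, 80, n)               # e.g., questionnaire scores

# Simulated middle-frontal-gyrus betas for negative, self-related trials,
# loosely increasing with social anxiety (assumed coupling)
mfg_beta = 0.02 * anxiety + rng.normal(0, 0.5, n)

fit = linregress(anxiety, mfg_beta)
print(f"slope = {fit.slope:+.3f}, r = {fit.rvalue:+.2f}, p = {fit.pvalue:.3f}")
```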