Testosterone-dependent secondary sexual characteristics in males may signal immunological competence and are sexually selected for in several species. In humans, oestrogen-dependent characteristics of the female body correlate with health and reproductive fitness and are found attractive. Enhancing the sexual dimorphism of human faces should raise attractiveness by enhancing sex-hormone-related cues to youth and fertility in females, and to dominance and immunocompetence in males. Here we report the results of asking subjects to choose the most attractive faces from continua that enhanced or diminished differences between the average shape of female and male faces. As predicted, subjects preferred feminized to average shapes of a female face. This preference applied across UK and Japanese populations but was stronger for within-population judgements, which indicates that attractiveness cues are learned. Subjects preferred feminized to average or masculinized shapes of a male face. Enhancing masculine facial characteristics increased both perceived dominance and negative attributions (for example, coldness or dishonesty) relevant to relationships and paternal investment. These results indicate a selection pressure that limits sexual dimorphism and encourages neoteny in humans.
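The shape continua described above can be sketched as movement along the axis between the average female and average male landmark configurations; a positive coefficient feminizes a face shape and a negative one masculinizes it. This is a minimal illustrative sketch, not the study's actual delineation procedure; the function name, the toy landmark coordinates, and the scalar `alpha` are all assumptions introduced here.

```python
import numpy as np

def dimorphism_continuum(face, female_avg, male_avg, alpha):
    """Shift a face's landmark shape along the female-male difference axis.

    alpha > 0 feminizes the shape, alpha < 0 masculinizes it, and
    alpha = 0 leaves it unchanged. All arrays hold (n_landmarks, 2)
    x-y coordinates. (Hypothetical sketch of the continuum idea.)
    """
    diff = female_avg - male_avg  # sexual-dimorphism axis
    return face + alpha * diff

# Toy example with three landmarks (invented coordinates).
female_avg = np.array([[0.0, 1.0], [1.0, 2.0], [2.0, 1.0]])
male_avg   = np.array([[0.0, 0.8], [1.0, 1.6], [2.0, 0.8]])
face = female_avg.copy()

feminized    = dimorphism_continuum(face, female_avg, male_avg, 0.5)
masculinized = dimorphism_continuum(face, female_avg, male_avg, -0.5)
```

Varying `alpha` smoothly over a range of values would yield a continuum of stimuli from masculinized through average to feminized shapes, in the spirit of the manipulation the abstract reports.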
The amygdala is thought to play a crucial role in emotional and social behaviour. Animal studies implicate the amygdala in both fear conditioning and face perception. In humans, lesions of the amygdala can lead to selective deficits in the recognition of fearful facial expressions and impaired fear conditioning, and direct electrical stimulation evokes fearful emotional responses. Here we report direct in vivo evidence of a differential neural response in the human amygdala to facial expressions of fear and happiness. Positron-emission tomography (PET) measures of neural activity were acquired while subjects viewed photographs of fearful or happy faces, varying systematically in emotional intensity. The neuronal response in the left amygdala was significantly greater to fearful as opposed to happy expressions. Furthermore, this response showed a significant interaction with the intensity of emotion (increasing with increasing fearfulness, decreasing with increasing happiness). The findings provide direct evidence that the human amygdala is engaged in processing the emotional salience of faces, with a specificity of response to fearful facial expressions.
Recognition of facial expressions is critical to our appreciation of the social and physical environment, with separate emotions having distinct facial expressions. Perception of fearful facial expressions has been extensively studied, appearing to depend upon the amygdala. Disgust (literally 'bad taste') is another important emotion, with a distinct evolutionary history, and is conveyed by a characteristic facial expression. We have used functional magnetic resonance imaging (fMRI) to examine the neural substrate for perceiving disgust expressions. Normal volunteers were presented with faces showing mild or strong disgust or fear. Cerebral activation in response to these stimuli was contrasted with that for neutral faces. Results for fear generally confirmed previous positron emission tomography findings of amygdala involvement. Both strong and mild expressions of disgust activated anterior insular cortex but not the amygdala; strong disgust also activated structures linked to a limbic cortico-striatal-thalamic circuit. The anterior insula is known to be involved in responses to offensive tastes. The neural response to facial expressions of disgust in others is thus closely related to appraisal of distasteful stimuli.
The attractiveness of a face is a highly salient social signal, influencing mate choice and other social judgements. In this study, we used event-related functional magnetic resonance imaging (fMRI) to investigate brain regions that respond to attractive faces manifesting either a neutral or a mildly happy facial expression. Attractive faces produced activation of medial orbitofrontal cortex (OFC), a region involved in representing stimulus-reward value. Responses in this region were further enhanced by a smiling facial expression, suggesting that the reward value of an attractive face, as indexed by medial OFC activity, is modulated by a perceiver-directed smile.
Previous neuroimaging and neuropsychological studies have investigated the neural substrates which mediate responses to fearful, disgusted and happy expressions. No previous studies have investigated the neural substrates which mediate responses to sad and angry expressions. Using functional neuroimaging, we tested two hypotheses. First, we tested whether the amygdala has a neural response to sad and/or angry facial expressions. Secondly, we tested whether the orbitofrontal cortex has a specific neural response to angry facial expressions. Volunteer subjects were scanned, using PET, while they performed a sex discrimination task involving static grey-scale images of faces expressing varying degrees of sadness and anger. We found that increasing intensity of sad facial expression was associated with enhanced activity in the left amygdala and right temporal pole. In addition, we found that increasing intensity of angry facial expression was associated with enhanced activity in the orbitofrontal and anterior cingulate cortex. We found no support for the suggestion that angry expressions generate a signal in the amygdala. The results provide evidence for dissociable, but interlocking, systems for the processing of distinct categories of negative facial expression.
Of 497 single neurones recorded in the cortex in the fundus of the superior temporal sulcus (STS) of three alert rhesus monkeys, a population of at least 48 cells which were selectively responsive to faces had the following response properties: (1) The cells' responses to faces (real or projected, human or rhesus monkey) were two to ten times as large as those to gratings, simple geometrical stimuli or complex 3-D objects. (2) Neuronal responses to faces were excitatory, sustained and were time-locked to the stimulus presentation with a latency of between 80 and 160 ms. (3) The cells were unresponsive to auditory or tactile stimuli and to the sight of arousing or aversive stimuli. (4) The magnitude of the responses of 28 cells tested was relatively constant despite transformations, such as rotation, so that the face was inverted or horizontal, and alterations of colour, size or distance. (5) Rotation to profile substantially reduced the responses of 21 cells (31 tested). (6) Masking out or presenting parts of the face (i.e. eyes, mouth or hair) in isolation revealed that different cells responded to different features or subsets of features. (7) For several cells, responses to the normal organisation of cut-out or line-drawn facial features were significantly larger than to jumbled controls. These findings indicate that explanations in terms of arousal, emotional or motor reactions, simple visual feature sensitivity or receptive fields are insufficient to account for the selective responses to faces and face features observed in this population of STS neurones. It appears that these neurones are part of a system specialised to code for faces or features present in faces, and it is suggested that damage to this system is related to prosopagnosia, or difficulty in face recognition, in man and to the tameness and social disturbances which follow temporal lobe damage and are part of the Klüver-Bucy syndrome in the monkey.
Various deficits in the cognitive functioning of people with autism have been documented in recent years, but these provide only partial explanations for the condition. We focus instead on an imitative disturbance involving difficulties both in copying actions and in inhibiting more stereotyped mimicking, such as echolalia. A candidate for the neural basis of this disturbance may be found in a recently discovered class of neurons in frontal cortex, 'mirror neurons' (MNs). These neurons show activity in relation both to specific actions performed by self and matching actions performed by others.