Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.
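The core analysis described here is a condition-versus-baseline contrast: activation to each emotional expression is compared voxel-wise against activation to neutral faces. As an illustration only (the study's actual GLM pipeline is not described in the abstract), the sketch below runs a paired t-test across subjects on simulated per-condition beta estimates; all array names, shapes, and thresholds are assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject beta estimates: shape (n_subjects, n_voxels).
# In a real study these would come from a first-level GLM fit; here they
# are simulated to illustrate the emotion-vs-neutral contrast.
rng = np.random.default_rng(0)
n_subjects, n_voxels = 6, 1000          # six healthy adults, toy voxel count
betas_disgust = rng.normal(0.2, 1.0, (n_subjects, n_voxels))
betas_neutral = rng.normal(0.0, 1.0, (n_subjects, n_voxels))

# Paired t-test per voxel: is activation greater for disgust than neutral?
t_vals, p_vals = stats.ttest_rel(betas_disgust, betas_neutral, axis=0)

# Voxels surviving an (uncorrected) threshold; a real analysis would
# correct for multiple comparisons across the whole brain.
active = np.flatnonzero((p_vals < 0.001) & (t_vals > 0))
print(f"{active.size} voxels more active for disgust than neutral")
```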

2.
Although older adults have difficulty recognizing all facial emotions, they have particular difficulty decoding expressions of anger. Since disruption of facial mimicry impairs emotion recognition, electromyography of the corrugator supercilii (i.e., brow) muscle region was used to test whether there are age differences in anger mimicry. Associations between mimicry and emotion recognition were also assessed. The results indicated that although there were no age differences in anger mimicry, older (but not young) adults' corrugator responses to angry expressions were associated with reduced anger recognition. Implications for understanding emotion recognition difficulties in older adulthood are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Which brain regions are associated with recognition of emotional prosody? Are these distinct from those for recognition of facial expression? These issues were investigated by mapping the overlaps of co-registered lesions from 66 brain-damaged participants as a function of their performance in rating basic emotions. It was found that recognizing emotions from prosody draws on the right frontoparietal operculum, the bilateral frontal pole, and the left frontal operculum. Recognizing emotions from prosody and facial expressions draws on the right frontoparietal cortex, which may be important in reconstructing aspects of the emotion signaled by the stimulus. Furthermore, there were regions in the left and right temporal lobes that contributed disproportionately to recognition of emotion from faces or prosody, respectively. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
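Lesion-overlap mapping of this kind relates, voxel by voxel, whether a lesion covers that location to each participant's behavioral score. A minimal sketch, with simulated lesion masks and rating scores standing in for the study's co-registered data (all variable names and parameters here are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants, n_voxels = 66, 500    # 66 brain-damaged participants, toy voxel grid

# Hypothetical binary lesion masks (1 = voxel is lesioned) and emotion-rating scores.
lesions = rng.integers(0, 2, (n_participants, n_voxels))
scores = rng.normal(0.0, 1.0, n_participants)

# Point-biserial correlation per voxel: lesion status vs. rating performance.
# Voxels where lesioned participants score reliably worse implicate that region.
r_map = np.empty(n_voxels)
p_map = np.empty(n_voxels)
for v in range(n_voxels):
    r_map[v], p_map[v] = stats.pointbiserialr(lesions[:, v], scores)

implicated = np.flatnonzero((p_map < 0.01) & (r_map < 0))
print(f"{implicated.size} voxels where lesions predict poorer emotion ratings")
```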

4.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
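One way to quantify "emotion differentiation for individual action units" is to ask whether an AU's occurrence frequency differs across the five emotions. The sketch below does this with a chi-square test on a toy AU-by-emotion count table; the counts are invented for illustration and do not come from the study.

```python
import numpy as np
from scipy.stats import chi2_contingency

emotions = ["anger", "fear", "sadness", "joy", "relief"]

# Hypothetical occurrence counts of one action unit (e.g., AU4, brow lowerer)
# across portrayals of each emotion: rows = AU present / AU absent.
counts = np.array([
    [18,  9,  7,  2,  1],   # portrayals in which the AU occurred
    [ 2, 11, 13, 18, 19],   # portrayals in which it did not
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
# A small p suggests this AU's occurrence depends on the emotion portrayed,
# i.e., the AU carries emotion-differentiating information.
```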

5.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
Investigated the degree to which 4–5 yr olds (n = 48) can enact expressions of emotion recognizable by peers and adults; the study also examined whether accuracy of recognition was a function of age and whether the expression was posed or spontaneous. Adults (n = 103) were much more accurate than children in recognizing neutral states, slightly more accurate in recognizing happiness and anger, and equally accurate in recognizing sadness. Children's spontaneous displays of happiness were more recognizable than posed displays, but for other emotions there was no difference between the recognizability of posed and spontaneous expressions. Children were highly accurate in identifying the facial expressions of happiness, sadness, and anger displayed by their peers. Sex and ethnicity of the child whose emotion was displayed interacted to influence only adults' recognizability of anger. Results are discussed in terms of the social learning and cognitive developmental factors influencing (a) adults' and children's decoding (recognition) of emotional expressions in young children and (b) encoding (posing) of emotional expressions by young children. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
While humans are adept at recognizing emotional states conveyed by facial expressions, the current literature suggests that they lack accurate metacognitions about their performance in this domain. This finding comes from global trait-based questionnaires that assess the extent to which an individual perceives him or herself as empathic, as compared to other people. Those who rate themselves as empathically accurate are no better than others at recognizing emotions. Metacognition of emotion recognition can also be assessed using relative measures that evaluate how well a person thinks s/he has understood the emotion in a particular facial display as compared to other displays. While this is the most common method of metacognitive assessment of people's judgments of learning or their feelings of knowing, this kind of metacognition—"relative meta-accuracy"—has not been studied within the domain of emotion. As well as asking for global metacognitive judgments, we asked people to provide relative, trial-by-trial prospective and retrospective judgments concerning whether they would be right or wrong in recognizing the expressions conveyed in particular facial displays. Our question was: Do people know when they will be correct in knowing what expression is conveyed, and do they know when they do not know? Although we, like others, found that global meta-accuracy was unpredictive of performance, relative meta-accuracy, given by the correlation between participants' trial-by-trial metacognitive judgments and performance on each item, were highly accurate both on the Mind in the Eyes task (Experiment 1) and on the Ekman Emotional Expression Multimorph task (in Experiment 2). (PsycINFO Database Record (c) 2011 APA, all rights reserved)
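Relative meta-accuracy, as described here, is a within-person correlation between trial-by-trial metacognitive judgments and accuracy on the same items. A minimal sketch with simulated data (the study's exact scoring details are not given in the abstract, and metacognition work often uses a Goodman-Kruskal gamma rather than a Pearson correlation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_trials = 36

# Hypothetical trial-by-trial data for one participant: prospective confidence
# that the upcoming expression will be recognized (1-6 scale), and whether the
# recognition response was actually correct (0/1). Simulated so that
# confidence loosely tracks accuracy.
confidence = rng.integers(1, 7, n_trials)
correct = (confidence + rng.normal(0, 2, n_trials) > 3.5).astype(int)

# Relative meta-accuracy: correlation between judgments and item-level performance.
r, p = stats.pearsonr(confidence, correct)
print(f"relative meta-accuracy r={r:.2f} (p={p:.3f})")
# Global meta-accuracy, by contrast, would compare a single self-rating of
# empathic skill against overall percent correct, across participants.
```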

8.
Certain facial expressions have been theorized to be easily recognizable signals of specific emotions. If so, these expressions should override situationally based expectations used by a person in attributing an emotion to another. An alternative account is offered in which the face provides information relevant to emotion but does not signal a specific emotion. Therefore, in specified circumstances, situational rather than facial information was predicted to determine the judged emotion. This prediction was supported in 3 studies—indeed, in each of the 22 cases examined (e.g., a person in a frightening situation but displaying a reported "facial expression of anger" was judged as afraid). Situational information was especially influential when it suggested a nonbasic emotion (e.g., a person in a painful situation but displaying a "facial expression of fear" was judged as in pain). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
Investigated the recognition of, and responses to, facial expressions of emotion. Ss were all women and consisted of the following groups: (1) 16 depressed college students, (2) 16 nondepressed college students, (3) 16 depressed psychiatric patients, and (4) 11 nondepressed psychiatric patients. Results suggest that both depressed groups, relative to the nondepressed college group, made more errors in recognizing the facial expressions and reported more freezing or tensing, higher fear and depression reactions, and less comfort with their own emotional reactions to these expressions and a stronger desire to change these reactions. Few differences were found between the depressed psychiatric patients and the psychiatric control Ss. It is concluded that inappropriate reactions to others' emotions may maintain or increase depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
This article discusses the controversy over whether attribution (recognition) of emotions from facial expressions is universal (P. Ekman, 1994; C. E. Izard, 1994; J. A. Russell, 1994). Agreement emerged on various issues. There exists at least Minimal Universality (people everywhere can infer something about others from their facial behavior). Anger, sadness, and other semantic categories for emotion are not pancultural and are not the precise messages conveyed by facial expressions. Emotions can occur without facial expressions, and facial expressions can occur without emotions. Further evidence is needed to determine the relationship between emotion and facial behavior, what determines that relationship, how facial behavior is interpreted, and how much the interpretation varies with culture and language. Ekman's (1994) objections are answered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Examined whether spontaneous facial expressions provide observers with sufficient information to distinguish accurately which of 7 affective states (6 emotional and 1 neutral) is being experienced by another person. Six undergraduate senders' facial expressions were covertly videotaped as they watched emotionally loaded slides. After each slide, senders nominated the emotion term that best described their affective reaction and also rated the pleasantness and strength of that reaction. Similar nominations of emotion terms and ratings were later made by 53 undergraduate receivers who viewed the senders' videotaped facial expression. The central measure of communication accuracy was the match between senders' and receivers' emotion nominations. Overall accuracy was significantly greater than chance, although it was not impressive in absolute terms. Only happy, angry, and disgusted expressions were recognized at above-chance rates, whereas surprised expressions were recognized at rates that were significantly worse than chance. Female Ss were significantly better senders than were male Ss. Although neither sex was found to be better at receiving facial expressions, female Ss were better receivers of female senders' expressions than of male senders' expressions. Female senders' neutral and surprised expressions were more accurately recognized than were those of male senders. The only sex difference found for decoding emotions was a tendency for male Ss to be more accurate at recognizing anger. (25 ref) (PsycINFO Database Record (c) 2011 APA, all rights reserved)
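The accuracy measure here is the proportion of trials on which the receiver's emotion nomination matched the sender's, tested against the chance rate for seven response categories (1/7). A hedged sketch of that computation with made-up nominations (category labels and trial data are invented for illustration):

```python
from scipy.stats import binomtest

EMOTIONS = ["happy", "sad", "angry", "disgusted", "surprised", "afraid", "neutral"]

# Hypothetical sender/receiver nominations for a handful of trials.
sender   = ["happy", "angry", "sad", "disgusted", "surprised", "happy", "neutral"]
receiver = ["happy", "angry", "sad", "neutral",   "afraid",    "happy", "neutral"]

hits = sum(s == r for s, r in zip(sender, receiver))
n = len(sender)
print(f"communication accuracy: {hits}/{n} = {hits / n:.2f}")

# Is accuracy above the 1-in-7 chance level? (A real analysis would use
# many more trials and account for response biases.)
result = binomtest(hits, n, p=1 / len(EMOTIONS), alternative="greater")
print(f"binomial test vs chance: p = {result.pvalue:.3f}")
```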

13.
It has been proposed that self-face representations are involved in interpreting facial emotions of others. We experimentally primed participants' self-face representations. In Study 1, we assessed eye tracking patterns and performance on a facial emotion discrimination task, and in Study 2, we assessed emotion ratings between self and nonself groups. Results show that experimental priming of self-face representations increases visual exploration of faces, facilitates the speed of facial expression processing, and increases the emotional distance between expressions. These findings suggest that the ability to interpret facial expressions of others is intimately associated with the representations we have of our own faces. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
Describes and compares normative emotional responses in solitary play, and determines developmental changes associated with expression of emotion in play. A developmental perspective on emotions is described. The results from 3 studies that examined the expression of emotion (facial expressions) during infants' and children's (aged 6 mo–5 yrs) solitary play are discussed as a foundation from which to consider the functions of emotion in play therapy. Facial expressions of emotions were assessed using the System for Identifying Affect Expressions by Holistic Judgments (C. E. Izard et al, 1983). Exploration and play were measured using a standardized scale developed by J. Belsky and R. K. Most (1981). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Cortical contributions to human emotional expression are examined with a focus on interhemispheric (right vs left) and intrahemispheric (anterior vs posterior) mechanisms. This article reviews behavioral studies of emotional expression in brain-damaged patients with unilateral lesions and in normal adults. Studies involving facial, prosodic, and lexical (verbal) communication channels are reviewed for patients; facial asymmetry studies are reviewed for normal Ss. Data are presented separately for posed and spontaneous conditions and for positive and negative emotions. Findings support right-hemisphere dominance for emotional expression, especially for prosodic and lexical expression in brain-damaged patients and for facial expression in normal Ss. Methodological factors are suggested to account for differences among facial expression studies in brain-damaged patients. The data are discussed in terms of neuropsychological theories of emotion and directions for future research. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
18.
Studies in animals have shown that the amygdala receives highly processed visual input, contains neurons that respond selectively to faces, and that it participates in emotion and social behaviour. Although studies in epileptic patients support its role in emotion, determination of the amygdala's function in humans has been hampered by the rarity of patients with selective amygdala lesions. Here, with the help of one such rare patient, we report findings that suggest the human amygdala may be indispensable to: (1) recognize fear in facial expressions; (2) recognize multiple emotions in a single facial expression; but (3) is not required to recognize personal identity from faces. These results suggest that damage restricted to the amygdala causes very specific recognition impairments, and thus constrains the broad notion that the amygdala is involved in emotion.

19.
This study compared young and older adults’ ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
It is generally agreed that schizophrenia patients show a markedly reduced ability to perceive and express facial emotions. Previous studies have shown, however, that such deficits are emotion-specific in schizophrenia and not generalized. Three kinds of studies were examined: decoding studies dealing with schizophrenia patients' ability to perceive universally recognized facial expressions of emotions, encoding studies dealing with schizophrenia patients' ability to express certain facial emotions, and studies of subjective reactions of patients' sensitivity toward universally recognized facial expressions of emotions. A review of these studies shows that schizophrenia patients, despite a general impairment of perception or expression of facial emotions, are highly sensitive to certain negative emotions of fear and anger. These observations are discussed in the light of hemispheric theory, which accounts for a generalized performance deficit, and social-cognitive theory, which accounts for an emotion-specific deficit in schizophrenia.
