Similar Documents
20 similar documents retrieved.
1.
Preschool children, 2 to 5 years of age, and adults posed the six facial expressions of happiness, surprise, anger, fear, sadness, and disgust before a videotape camera. Their poses were scored subsequently using the MAX system. The number of poses that included all components of the target expression (complete expressions) as well as the frequency of those that included only some of the components of the target expressions (partial expressions) were analyzed. Results indicated that 2-year-olds as a group failed to pose any face. Three-year-olds were a transitional group, posing happiness and surprise expressions but none of the remaining faces to any degree. Four- and 5-year-olds were similar to one another and differed from adults only on surprise and anger expressions. Adults were able to pose both these expressions. No group, including adults, posed fear and disgust well. Posing of happiness showed no change after 3 years of age. Consistent differences between partial and complete poses were observed particularly for the negative expressions of sadness, fear, and disgust. Implications of these results for socialization theories of emotion are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
In an experiment with 20 undergraduates, video recordings of actors' faces covered with black makeup and white spots were played back to the Ss so that only the white spots were visible. The results demonstrate that moving displays of happiness, sadness, fear, surprise, anger, and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicates that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of the 6 emotions was also investigated using normally illuminated and spots-only displays. In both instances, the results indicate that different facial regions are more informative for different emotions. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Facial expression is heralded as a communication system common to all human populations, and thus is generally accepted as a biologically based, universal behavior. Happiness, sadness, fear, anger, surprise, and disgust are universally recognized and produced emotions, and communication of these states is deemed essential in order to navigate the social environment. It is puzzling, however, how individuals are capable of producing similar facial expressions when facial musculature is known to vary greatly among individuals. Here, the authors show that although some facial muscles are not present in all individuals, and often exhibit great asymmetry (larger or absent on one side), the facial muscles that are essential in order to produce the universal facial expressions exhibited 100% occurrence and showed minimal gross asymmetry in 18 cadavers. This explains how universal facial expression production is achieved, implies that facial muscles have been selected for essential nonverbal communicative function, and yet also accommodate individual variation. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
It is generally agreed that schizophrenia patients show a markedly reduced ability to perceive and express facial emotions. Previous studies have shown, however, that such deficits are emotion-specific in schizophrenia and not generalized. Three kinds of studies were examined: decoding studies dealing with schizophrenia patients' ability to perceive universally recognized facial expressions of emotions, encoding studies dealing with schizophrenia patients' ability to express certain facial emotions, and studies of subjective reactions of patients' sensitivity toward universally recognized facial expressions of emotions. A review of these studies shows that schizophrenia patients, despite a general impairment of perception or expression of facial emotions, are highly sensitive to certain negative emotions of fear and anger. These observations are discussed in the light of hemispheric theory, which accounts for a generalized performance deficit, and social-cognitive theory, which accounts for an emotion-specific deficit in schizophrenia.

5.
Fifty children and adolescents with attention-deficit/hyperactivity disorder (ADHD) were tested for their ability to recognize the 6 basic facial expressions of emotion depicted in Ekman and Friesen's normed photographs. Subjects were presented with sets of 6 photographs of faces, each portraying a different basic emotion, and stories portraying those emotions were read to them. After each story, the subject was asked to point to the photograph in the set that depicted the emotion described. Overall, the children correctly identified the emotions on 74% of the presentations. The highest level of accuracy in recognition was for happiness, followed by sadness, with fear being the emotional expression that was mistaken most often. When compared to studies of children in the general population, children with ADHD have deficits in their ability to accurately recognize facial expressions of emotion. These findings have important implications for the remediation of social skill deficits commonly seen in children with ADHD.

6.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The ability to identify facial expressions of happiness, sadness, anger, surprise, fear, and disgust was studied in 48 nondisabled children and 76 children with learning disabilities aged 9 through 12. On the basis of their performance on the Rey Auditory-Verbal Learning Test and the Benton Visual Retention Test, the LD group was divided into three subgroups: those with verbal deficits (VD), nonverbal deficits (NVD), and both verbal and nonverbal (BD) deficits. The measure of ability to interpret facial expressions of affect was a shortened version of Ekman and Friesen's Pictures of Facial Affect. Overall, the nondisabled group had better interpretive ability than the three learning disabled groups and the VD group had better ability than the NVD and BD groups. Although the identification level of the nondisabled group differed from that of the VD group only for surprise, it was superior to that of the NVD and BD groups for four of the six emotions. Happiness was the easiest to identify, and the remaining emotions in ascending order of difficulty were anger, surprise, sadness, fear, and disgust. Older subjects did better than younger ones only for fear and disgust, and boys and girls did not differ in interpretive ability. These findings are discussed in terms of the need to take note of the heterogeneity of the learning disabled population and the particular vulnerability to social imperception of children with nonverbal deficits.

8.
People with Huntington's disease and people suffering from obsessive-compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

9.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Examined whether spontaneous facial expressions provide observers with sufficient information to distinguish accurately which of 7 affective states (6 emotional and 1 neutral) is being experienced by another person. Six undergraduate senders' facial expressions were covertly videotaped as they watched emotionally loaded slides. After each slide, senders nominated the emotion term that best described their affective reaction and also rated the pleasantness and strength of that reaction. Similar nominations of emotion terms and ratings were later made by 53 undergraduate receivers who viewed the senders' videotaped facial expressions. The central measure of communication accuracy was the match between senders' and receivers' emotion nominations. Overall accuracy was significantly greater than chance, although it was not impressive in absolute terms. Only happy, angry, and disgusted expressions were recognized at above-chance rates, whereas surprised expressions were recognized at rates that were significantly worse than chance. Female Ss were significantly better senders than were male Ss. Although neither sex was found to be better at receiving facial expressions, female Ss were better receivers of female senders' expressions than of male senders' expressions. Female senders' neutral and surprised expressions were more accurately recognized than were those of male senders. The only sex difference found for decoding emotions was a tendency for male Ss to be more accurate at recognizing anger. (25 ref) (PsycINFO Database Record (c) 2011 APA, all rights reserved)
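The central accuracy measure described above, the match between sender and receiver emotion nominations tested against chance, can be sketched in a few lines. The function and toy data below are illustrative assumptions, not the study's materials; only the seven-category chance rate of 1/7 follows from the design.

```python
# Illustrative sketch only: communication accuracy as the proportion of
# receiver nominations matching the sender's, tested against the chance
# rate for 7 response categories. Names and data are hypothetical.
from scipy.stats import binomtest

CATEGORIES = ["happy", "sad", "angry", "afraid", "surprised", "disgusted", "neutral"]
CHANCE = 1.0 / len(CATEGORIES)  # 7 response options -> chance = 1/7

def accuracy_vs_chance(sender_labels, receiver_labels):
    """Return (accuracy, p) for a one-sided binomial test against chance."""
    hits = sum(s == r for s, r in zip(sender_labels, receiver_labels))
    n = len(receiver_labels)
    result = binomtest(hits, n, CHANCE, alternative="greater")
    return hits / n, result.pvalue

# Toy usage with fabricated judgments:
senders = ["happy", "sad", "angry", "surprised", "happy", "disgusted"]
receivers = ["happy", "afraid", "angry", "sad", "happy", "disgusted"]
acc, p = accuracy_vs_chance(senders, receivers)
print(f"accuracy = {acc:.2f} vs. chance {CHANCE:.2f}, p = {p:.3f}")
```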

11.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

12.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

13.
The most familiar emotional signals consist of faces, voices, and whole-body expressions, but so far research on emotions expressed by the whole body is sparse. The authors investigated recognition of whole-body expressions of emotion in three experiments. In the first experiment, participants performed a body expression-matching task. Results indicate good recognition of all emotions, with fear being the hardest to recognize. In the second experiment, two alternative forced choice categorizations of the facial expression of a compound face-body stimulus were strongly influenced by the bodily expression. This effect was a function of the ambiguity of the facial expression. In the third experiment, recognition of emotional tone of voice was similarly influenced by task irrelevant emotional body expressions. Taken together, the findings illustrate the importance of emotional whole-body expressions in communication either when viewed on their own or, as is often the case in realistic circumstances, in combination with facial expressions and emotional voices. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
In this set of studies, we examine the perceptual similarities between emotions that share either a valence or a motivational direction. Determination is a positive approach-related emotion, whereas anger is a negative approach-related emotion. Thus, determination and anger share a motivational direction but are opposite in valence. An implemental mind-set has previously been shown to produce high-approach-motivated positive affect. Thus, in Study 1, participants were asked to freely report the strongest emotion they experienced during an implemental mind-set. The most common emotion reported was determination. On the basis of this result, we compared the facial expression of determination with that of anger. In Study 2, naive judges were asked to identify photographs of facial expressions intended to express determination, along with photographs intended to express basic emotions (joy, anger, sadness, fear, disgust, neutral). Correct identifications of intended determination expressions were correlated with misidentifications of the expressions as anger but not with misidentifications as any other emotion. This suggests that determination, a high-approach-motivated positive affect, is perceived as similar to anger. In Study 3, naive judges quantified the intensity of joy, anger, and determination expressed in photographs. The intensity of perceived determination was directly correlated with the intensity of perceived anger (a high-approach-motivated negative affect) and was inversely correlated with the intensity of perceived joy (a low-approach-motivated positive affect). These results demonstrate perceptual similarity between emotions that share a motivational direction but differ in valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Participants viewed “hybrid” faces that showed a facial expression (anger, fear, happiness, or sadness) only in the lowest spatial frequency (1–6 cycles/image), which was blended with the same face's neutral expression in the rest of the bandwidth (7–128 cycles/image). Participants rated the portrayed persons (compared to neutral images) as “friendly” when the lowest spatial frequencies showed a positive expression and “unfriendly” when the lowest spatial frequencies showed negative expressions. In contrast, the same hybrid images were explicitly judged as neutral and their “hidden” emotional expressions could not be explicitly recognized, as also confirmed by d′ sensitivity measures. Finally, one patient (SS) who had the left anterior temporal lobe surgically resected (including the amygdala), failed to show the above described unconscious effects on friendliness judgments when viewing “afraid” and “sad” hybrid faces. We conclude that the lowest spatial frequencies of facial expressions can evoke “core” emotions without knowledge or awareness of a specific emotion but these core emotions can convey a clear “impression” of a person's character. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
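A rough sketch of how such hybrid stimuli can be constructed, and of the d′ statistic used to confirm chance-level explicit recognition, follows. This is a minimal illustration assuming grayscale arrays of equal size; the study's actual filtering and calibration pipeline is not specified here, and all function names are ours.

```python
# Minimal sketch, not the authors' pipeline: keep <=6 cycles/image from
# the emotional photograph and fill 7-128 cycles/image from the same
# person's neutral photograph. Grayscale numpy arrays of equal size are
# assumed; function names are hypothetical.
import numpy as np
from scipy.stats import norm

def radial_frequency(shape):
    """Radial spatial frequency, in cycles/image, of each FFT bin."""
    h, w = shape
    fy = np.fft.fftfreq(h) * h  # vertical cycles per image
    fx = np.fft.fftfreq(w) * w  # horizontal cycles per image
    return np.hypot(*np.meshgrid(fy, fx, indexing="ij"))

def hybrid_face(emotional, neutral, cutoff=6.0):
    """Low spatial frequencies from `emotional`, the rest from `neutral`."""
    low = radial_frequency(emotional.shape) <= cutoff
    spectrum = np.where(low, np.fft.fft2(emotional), np.fft.fft2(neutral))
    return np.real(np.fft.ifft2(spectrum))

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: z(hits) - z(false alarms).
    A d' near 0 indicates chance-level explicit recognition."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)
```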

16.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
High- and low-trait socially anxious individuals classified the emotional expressions of photographic quality continua of interpolated ("morphed") facial images that were derived from combining 6 basic prototype emotional expressions to various degrees, with the 2 adjacent emotions arranged in an emotion hexagon. When fear was 1 of the 2 component emotions, the high-trait group displayed enhanced sensitivity for fear. In a 2nd experiment where a mood manipulation was incorporated, again, the high-trait group exhibited enhanced sensitivity for fear. The low-trait group was sensitive for happiness in the control condition. The mood-manipulated group had increased sensitivity for anger expressions, and trait anxiety did not moderate these effects. Interpretations of the results related to the classification of fearful expressions are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
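Photographic-quality morphs typically warp facial landmarks as well as blending pixel intensities; as a simplified illustration of the weighting scheme only, the sketch below cross-fades between two prototype expressions. It is an assumed stand-in, not the stimulus software used in the study.

```python
# Simplified stand-in for morphing software: evenly weighted blends
# between two prototype expressions. Real photographic-quality morphs
# also warp feature positions; this shows only the intensity mixing.
import numpy as np

def expression_continuum(proto_a, proto_b, steps=5):
    """Return `steps` images fading from proto_a (weight 1.0) to
    proto_b (weight 1.0), e.g. fear and its hexagon neighbor."""
    return [(1.0 - w) * proto_a + w * proto_b
            for w in np.linspace(0.0, 1.0, steps)]
```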

18.
Can young children report coherently on their emotions, and how do their reports contribute to our understanding of emotional development? Two hundred six children ages 3 to 6 years participated in structured laboratory tasks designed to elicit a range of positive and negative emotions and indicated their emotional state following each task. Children's reports of their emotions meaningfully varied along with the nature of the different tasks during which they were collected (i.e., reports of negative and positive emotions differed across tasks designed to elicit those states). There were no sex differences on reports of any emotion and only small age differences. Multilevel modeling analyses demonstrated that children's self-reports of each emotion converged significantly with objective coding of expressions of those emotions across laboratory tasks; higher convergence for some emotions was associated with older age, higher verbal intelligence, and greater emotion-recognition abilities. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
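The convergence analysis described above can be approximated with a random-intercept multilevel model: self-reports regressed on coded expression intensity, with tasks nested within children. The sketch below is a hypothetical reconstruction; the column names and model form are assumptions, not the study's actual specification.

```python
# Hypothetical reconstruction of the convergence analysis: a random-
# intercept model predicting a child's self-reported emotion from the
# objectively coded expression, with tasks nested within children.
# Column names (child_id, self_report, coded_expression) are assumed.
import pandas as pd
import statsmodels.formula.api as smf

def convergence_model(df: pd.DataFrame):
    """Fit self_report ~ coded_expression with a random intercept per child."""
    model = smf.mixedlm("self_report ~ coded_expression",
                        data=df, groups=df["child_id"])
    return model.fit()

# `df` would hold one row per child x task; a positive, significant
# coded_expression coefficient reflects self-report/coding convergence.
```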

19.
Facial expression and emotional stimuli were varied orthogonally in a 3 × 4 factorial design to test whether facial expression is necessary or sufficient to influence emotional experience. 123 undergraduates watched a film eliciting fear, sadness, or no emotion while holding their facial muscles in the position characteristic of fear or sadness or in an effortful but nonemotional grimace; those in a 4th group received no facial instructions. The Ss believed that the study concerned subliminal perception and that the facial positions were necessary to prevent physiological recording artifacts. The films had powerful effects on reported emotions, the facial expressions none. Correlations between facial expression and reported emotion were zero. Sad and fearful Ss showed distinctive patterns of physiological arousal. Facial expression also tended to affect physiological responses in a manner consistent with an effort hypothesis. (33 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
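The 3 × 4 design above (three films crossed with four facial-position conditions) lends itself to a two-way factorial ANOVA on reported emotion. The sketch below is an illustrative reconstruction with assumed column names, not the authors' analysis code.

```python
# Illustrative reconstruction, not the original analysis: two-way ANOVA
# for the 3 (film) x 4 (facial position) design, with self-reported
# emotion as the outcome. Column names are assumptions.
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

def factorial_anova(df: pd.DataFrame):
    """Main effects of film and face plus their interaction (Type II)."""
    fit = ols("reported_emotion ~ C(film) * C(face)", data=df).fit()
    return anova_lm(fit, typ=2)

# The reported pattern (films powerful, faces null) would appear as a
# significant C(film) row and a nonsignificant C(face) row.
```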

20.
Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders which affect both working memory and emotion perception are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
