Similar Articles
20 similar articles found.
1.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
This study compared young and older adults’ ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Little research has focused on children's decoding of emotional meaning in expressive body movement; none has considered which movement cues children use to detect emotional meaning. The current study investigated the general ability to decode happiness, sadness, anger, and fear in dance forms of expressive body movement and the specific ability to detect differences in the intensity of anger and happiness when the relative amount of movement cue specifying each emotion was systematically varied. Four-year-olds (n = 25), 5-year-olds (n = 25), 8-year-olds (n = 29), and adults (n = 24) completed an emotion contrast task and 2 emotion intensity tasks. Decoding ability exceeding chance levels was demonstrated for sadness by 4-year-olds; for sadness, fear, and happiness by 5-year-olds; and for all emotions by 8-year-olds and adults. Children as young as 5 years were shown to rely on emotion-specific movement cues in their decoding of anger and happiness intensity. The theoretical significance of these effects across development is discussed.

4.
Studied the development of the recognition of emotional facial expressions in children and of the factors influencing recognition accuracy. 80 elementary school students (aged 5–8 yrs) were asked to identify the emotions expressed in a series of facial photographs. Recognition performances were analyzed in relation to the type of emotion expressed (i.e., happiness, fear, anger, surprise, sadness, or disgust) and the intensity of the emotional expression. Age differences were determined. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Investigated the degree to which 4–5 yr olds (n = 48) can enact expressions of emotion recognizable by peers and adults; the study also examined whether accuracy of recognition was a function of age and whether the expression was posed or spontaneous. Adults (n = 103) were much more accurate than children in recognizing neutral states, slightly more accurate in recognizing happiness and anger, and equally accurate in recognizing sadness. Children's spontaneous displays of happiness were more recognizable than posed displays, but for other emotions there was no difference between the recognizability of posed and spontaneous expressions. Children were highly accurate in identifying the facial expressions of happiness, sadness, and anger displayed by their peers. Sex and ethnicity of the child whose emotion was displayed interacted to influence only adults' recognizability of anger. Results are discussed in terms of the social learning and cognitive developmental factors influencing (a) adults' and children's decoding (recognition) of emotional expressions in young children and (b) encoding (posing) of emotional expressions by young children. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Little research has focused on children's decoding of emotional meaning in expressive body movement; none has considered which movement cues children use to detect emotional meaning. The current study investigated the general ability to decode happiness, sadness, anger, and fear in dance forms of expressive body movement and the specific ability to detect differences in the intensity of anger and happiness when the relative amount of movement cue specifying each emotion was systematically varied. Four-year-olds (n = 25), 5-year-olds (n = 25), 8-year-olds (n = 29), and adults (n = 24) completed an emotion contrast task and 2 emotion intensity tasks. Decoding ability exceeding chance levels was demonstrated for sadness by 4-year-olds; for sadness, fear, and happiness by 5-year-olds; and for all emotions by 8-year-olds and adults. Children as young as 5 years were shown to rely on emotion-specific movement cues in their decoding of anger and happiness intensity. The theoretical significance of these effects across development is discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The ability to perceive and interpret facial expressions of emotion improves throughout childhood. Although newborns have rudimentary perceptive abilities allowing them to distinguish several facial expressions, it is only at the end of the first year that infants seem to be able to assign meaning to emotional signals. The meaning infants assign to facial expressions is very broad, as it is limited to the judgment of emotional valence. Meaning becomes more specific between the second and the third year of life, as children begin to categorize facial signals in terms of discrete emotions. While the facial expressions of happiness, anger and sadness are accurately categorized by the third year, the categorization of expressions of fear, surprise and disgust shows a much slower developmental pattern. Moreover, the ability to judge the sincerity of facial expressions shows a slower developmental pattern, probably because of the subtle differences between genuine and non-genuine expressions. The available evidence indicates that school age children can distinguish genuine smiles from masked smiles and false smiles. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Fifty children and adolescents with ADHD were tested for their ability to recognize the 6 basic facial expressions of emotion depicted in Ekman and Friesen's normed photographs. Subjects were presented with sets of 6 photographs of faces, each portraying a different basic emotion, and stories portraying those emotions were read to them. After each story, the subject was asked to point to the photograph in the set that depicted the emotion described. Overall, the children correctly identified the emotions on 74% of the presentations. The highest level of accuracy in recognition was for happiness, followed by sadness, with fear being the emotional expression that was mistaken most often. When compared to studies of children in the general population, children with ADHD have deficits in their ability to accurately recognize facial expressions of emotion. These findings have important implications for the remediation of social skill deficits commonly seen in children with ADHD.

11.
Studied the extent to which communication channel affects judgments of the type and authenticity of emotions. 80 university students (mean age 21.5 yrs) were presented with short audio, video, and audiovideo excerpts of actors expressing specific emotions. In some cases, the emotion was actually experienced by the actor; in other cases, the emotion was simulated. Ss were distributed over 8 communication channel conditions (i.e., facial, audio, filtered audio, gestural + facial, facial + filtered audio, facial + audio, gestural + facial + filtered audio, and gestural + facial + audio) and asked to judge the emotional category (i.e., happiness, fear, anger, surprise, and sadness) and the authenticity of the emotion. The accuracy of the judgments was analyzed in relation to the type of channel and the type of emotion. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Facial expression and emotional stimuli were varied orthogonally in a 3 × 4 factorial design to test whether facial expression is necessary or sufficient to influence emotional experience. 123 undergraduates watched a film eliciting fear, sadness, or no emotion while holding their facial muscles in the position characteristic of fear or sadness or in an effortful but nonemotional grimace; those in a 4th group received no facial instructions. The Ss believed that the study concerned subliminal perception and that the facial positions were necessary to prevent physiological recording artifacts. The films had powerful effects on reported emotions, the facial expressions none. Correlations between facial expression and reported emotion were zero. Sad and fearful Ss showed distinctive patterns of physiological arousal. Facial expression also tended to affect physiological responses in a manner consistent with an effort hypothesis. (33 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
The facial expressions of 28 13-mo-old middle-class children were videotaped during the 3-min separation episode of the Ainsworth strange-situation procedure (ASSP). Facial behavior was analyzed to determine the patterns of emotional expressions during separation and to assess the relations between these patterns and types of attachment as assessed by the ASSP. Findings reveal that anger was the dominant negative emotion expressed by the majority of Ss in each of 3 ad hoc groups determined by level of negative emotion. Some high-negative emotion expressers displayed predominantly anger and others mainly sadness. Patterns of emotion expression varied with type of attachment; Ss who showed an insecure-resistant attachment pattern displayed less interest and more sadness than Ss in the securely attached groups. The proportion of time anger was expressed did not differ significantly with type of attachment. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

15.
This investigation represents a multimodal study of age-related differences in experienced and expressed affect and in emotion regulatory skills in a sample of young, middle-aged, and older adults (N = 96), testing formulations derived from differential emotions theory. The experimental session consisted of a 10-min anger induction and a 10-min sadness induction using a relived emotion task; participants were also randomly assigned to an inhibition or noninhibition condition. In addition to subjective ratings of emotional experience provided by participants, their facial behavior was coded using an objective facial affect coding system; a content analysis also was applied to the emotion narratives. Separate repeated measures analyses of variance applied to each emotion domain indicated age differences in the co-occurrence of negative emotions and co-occurrence of positive and negative emotions across domains, thus extending the finding of emotion heterogeneity or complexity in emotion experience to facial behavior and verbal narratives. The authors also found that the inhibition condition resulted in a different pattern of results in the older versus middle-aged and younger adults. The intensity and frequency of discrete emotions were similar across age groups, with a few exceptions. Overall, the findings were generally consistent with differential emotions theory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

17.
A total of 74 Ss were induced to adopt expressions of fear, anger, disgust, and sadness in Experiment 1. Each expression significantly increased feelings of its particular emotion compared with at least two of the others, a result that cannot be explained by a single dimension. Postures should play the same role in emotional experience as facial expressions. However, the demonstrated effects of postures (Riskind, 1984) could also represent a single dimension of variation. In Experiment 2, subjects were induced to adopt postures characteristic of fear, anger, and sadness. Again, the effects were specific to the postures. These two studies indicate that emotional behavior produces changes in feelings that specifically match the behavior. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
20 male undergraduates role played 4 discrete emotions vocally using sentences whose semantic content was emotionally appropriate or affectively neutral. 48 undergraduate listeners attempted to identify the emotions from content-filtered recordings. With one exception, all emotions were recognized at above-chance levels; sadness was more accurately identified than anger, happiness, or surprise. However, an interaction revealed that the effect of semantic content depended on the emotion being expressed. For instance, semantically emotional (compared to neutral) material aided in the identification of sadness; however, the opposite was true for anger. Multidimensional scaling of listeners' confusions revealed 2 underlying affective dimensions termed "pleasantness" and "energy level." Analyses of dimensional coordinates indicated that regardless of affect, stimuli with neutral semantic content were perceived as having greater energy than those with emotionally appropriate content. (38 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Studied differences between dimensional and categorical judgments of static and dynamic spontaneous facial expressions of emotion. In the 1st part of the study, 25 university students were presented with either static or dynamic facial expressions of emotions (i.e., joy, fear, anger, surprise, disgust, and sadness) and asked to evaluate the similarity of 21 pairs of stimuli on a 7-point scale. Results were analyzed using a multidimensional scaling procedure. In the 2nd part of the study, Ss were asked to categorize the expressed emotions according to their intensity. Differences in the categorization of static and dynamic stimuli were analyzed. Results from the similarity rating task and the categorization task were compared. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.