Similar Literature
20 similar documents found.
1.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
This study compared young and older adults’ ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Two studies test the assertion that anger, sadness, fear, pride, and happiness are typically narrated in different ways. Everyday events eliciting these 5 emotions were narrated by young women (Study 1) and 5- and 8-year-old girls (Study 2). Negative narratives were expected to engender more effort to process the event: to be longer, to be more grammatically complex, to contain a complication section more often, and to use specific emotion labels rather than global evaluations. Narratives of Hogan’s (2003) juncture emotions anger and fear were expected to focus more on action and to contain more core narrative sections of orientation, complication, and resolution than narratives of the outcome emotions sadness and happiness. Hypotheses were confirmed for adults except for syntactic complexity, whereas children showed only some of these differences. Hogan’s theory that juncture emotions are restricted to the complication section was not confirmed. Finally, in adults, indirect speech was more frequent in anger narratives and internal monologue in fear narratives. It is concluded that different emotions should be studied in terms of how they are narrated, and that narratives should be analyzed according to qualitatively different emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

5.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
The communication of emotion via touch.
The study of emotional communication has focused predominantly on the facial and vocal channels but has ignored the tactile channel. Participants in the current study were allowed to touch an unacquainted partner on the whole body to communicate distinct emotions. Of interest was how accurately the person being touched decoded the intended emotions without seeing the tactile stimulation. The data indicated that anger, fear, disgust, love, gratitude, and sympathy were decoded at greater than chance levels, as were happiness and sadness, 2 emotions that had not previously been shown to be communicated by touch. Moreover, fine-grained coding documented specific touch behaviors associated with different emotions. The findings are discussed in terms of their contribution to the study of emotion-related communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Prior research has typically attempted to distinguish one emotion from another by identifying distinctive expressions, physiology, and subjective qualities. Recent theories claim emotions can also be differentiated by distinctive action tendencies, actions, and motivational goals. To test hypotheses from both older and more recent theories, 100 Ss were asked to recall experiences of particular negative emotions and answer questions concerning what they felt, thought, felt like doing, actually did, and wanted. Results support hypotheses specifying characteristic responses for fear, sadness, distress, frustration, disgust, dislike, anger, regret, guilt, and shame. The findings indicate that discrete emotions have distinctive goals and action tendencies, as well as thoughts and feelings. In addition, they provide empirical support for hypothesized emotion states that have received insufficient attention from researchers. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
Little research has focused on children's decoding of emotional meaning in expressive body movement; none has considered which movement cues children use to detect emotional meaning. The current study investigated the general ability to decode happiness, sadness, anger, and fear in dance forms of expressive body movement and the specific ability to detect differences in the intensity of anger and happiness when the relative amount of movement cue specifying each emotion was systematically varied. Four-year-olds (n = 25), 5-year-olds (n = 25), 8-year-olds (n = 29), and adults (n = 24) completed an emotion contrast task and 2 emotion intensity tasks. Decoding ability exceeding chance levels was demonstrated for sadness by 4-year-olds; for sadness, fear, and happiness by 5-year-olds; and for all emotions by 8-year-olds and adults. Children as young as 5 years were shown to rely on emotion-specific movement cues in their decoding of anger and happiness intensity. The theoretical significance of these effects across development is discussed.

10.
Two studies examined the hypothesized status of appraisals, relative to attributions, as proximal antecedents of emotion. In Study 1, which looked at 6 emotions (happiness, hope-challenge, anger, guilt, fear-anxiety, and sadness), 136 undergraduates reported on their attributions, appraisals, and emotions during past encounters associated with a variety of situations. In Study 2, which focused on anger and guilt, 120 undergraduates reported on these same variables in response to experimenter-supplied vignettes that systematically manipulated theoretically relevant attributions. The results of both studies indicated that the emotions were more directly related to appraisals than they were to attributions, and Study 2 provided evidence that appraisal serves as a mediator between attribution and emotional response. These findings lend support to the hypothesized status of appraisal as the most proximal cognitive antecedent of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
In this set of studies, we examine the perceptual similarities between emotions that share either a valence or a motivational direction. Determination is a positive approach-related emotion, whereas anger is a negative approach-related emotion. Thus, determination and anger share a motivational direction but are opposite in valence. An implemental mind-set has previously been shown to produce high-approach-motivated positive affect. Thus, in Study 1, participants were asked to freely report the strongest emotion they experienced during an implemental mind-set. The most common emotion reported was determination. On the basis of this result, we compared the facial expression of determination with that of anger. In Study 2, naive judges were asked to identify photographs of facial expressions intended to express determination, along with photographs intended to express basic emotions (joy, anger, sadness, fear, disgust, neutral). Correct identifications of intended determination expressions were correlated with misidentifications of the expressions as anger but not with misidentifications as any other emotion. This suggests that determination, a high-approach-motivated positive affect, is perceived as similar to anger. In Study 3, naive judges quantified the intensity of joy, anger, and determination expressed in photographs. The intensity of perceived determination was directly correlated with the intensity of perceived anger (a high-approach-motivated negative affect) and was inversely correlated with the intensity of perceived joy (a low-approach-motivated positive affect). These results demonstrate perceptual similarity between emotions that share a motivational direction but differ in valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
We report two studies validating a new standardized set of filmed emotion expressions, the Amsterdam Dynamic Facial Expression Set (ADFES). The ADFES is distinct from existing datasets in that it includes a face-forward version and two different head-turning versions (faces turning toward and away from viewers), North-European as well as Mediterranean models (male and female), and nine discrete emotions (joy, anger, fear, sadness, surprise, disgust, contempt, pride, and embarrassment). Study 1 showed that the ADFES received excellent recognition scores. Recognition was affected by social categorization of the model: displays of North-European models were better recognized by Dutch participants, suggesting an ingroup advantage. Head-turning did not affect recognition accuracy. Study 2 showed that participants more strongly perceived themselves to be the cause of the other's emotion when the model's face turned toward the respondents. The ADFES provides new avenues for research on emotion expression and is available for researchers upon request. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

13.
Two studies investigated the role of expressive vocal behavior (specifically, speech rate and loudness) in fear and anxiety and in sadness and depression. In the 1st study, participants spoke about personally experienced fear- and anxiety-arousing and neutral events using 3 different voice styles: fast and loud, normal, and slow and soft. In the 2nd study, participants spoke about personally experienced sad or depressing and neutral events using the same 3 voice styles. In both studies, the participants' highest levels of subjective affective and cardiovascular (CV) arousal occurred when they spoke about the emotional events in a mood-congruent voice style: fast and loud in the case of fear and anxiety, and slow and soft in the case of sadness or depression. Mood-incongruent voice styles canceled the heightened levels of CV arousal normally associated with these negative emotions. The voice-style manipulation had no significant effect on the participants' levels of CV arousal during the neutral discussions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Little research has focused on children's decoding of emotional meaning in expressive body movement; none has considered which movement cues children use to detect emotional meaning. The current study investigated the general ability to decode happiness, sadness, anger, and fear in dance forms of expressive body movement and the specific ability to detect differences in the intensity of anger and happiness when the relative amount of movement cue specifying each emotion was systematically varied. Four-year-olds (n = 25), 5-year-olds (n = 25), 8-year-olds (n = 29), and adults (n = 24) completed an emotion contrast task and 2 emotion intensity tasks. Decoding ability exceeding chance levels was demonstrated for sadness by 4-year-olds; for sadness, fear, and happiness by 5-year-olds; and for all emotions by 8-year-olds and adults. Children as young as 5 years were shown to rely on emotion-specific movement cues in their decoding of anger and happiness intensity. The theoretical significance of these effects across development is discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
This exploratory study aims at investigating the effects of terrorism on children’s ability to recognize emotions. A sample of 101 exposed and 102 nonexposed children (mean age = 11 years), balanced for age and gender, were assessed 20 months after a terrorist attack in Beslan, Russia. Two trials controlled for children’s ability to match a facial emotional stimulus with an emotional label and their ability to match an emotional label with an emotional context. The experimental trial evaluated the relation between exposure to terrorism and children’s free labeling of mixed emotion facial stimuli created by morphing between 2 prototypical emotions. Repeated measures analyses of covariance revealed that exposed children correctly recognized pure emotions. Four log-linear models were performed to explore the association between exposure group and category of answer given in response to different mixed emotion facial stimuli. Model parameters indicated that, compared with nonexposed children, exposed children (a) labeled facial expressions containing anger and sadness significantly more often than expected as anger, and (b) produced fewer correct answers in response to stimuli containing sadness as a target emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Recent research has shown that pride, like the "basic" emotions of anger, disgust, fear, happiness, sadness, and surprise, has a distinct, nonverbal expression that can be recognized by adults (J. L. Tracy & R. W. Robins, 2004b). In 2 experiments, the authors examined whether young children can identify the pride expression and distinguish it from expressions of happiness and surprise. Results suggest that (a) children can recognize pride at above-chance levels by age 4 years; (b) children recognize pride as well as they recognize happiness; (c) pride recognition, like happiness and surprise recognition, improves from age 3 to 7 years; and (d) children's ability to recognize pride cannot be accounted for by the use of a process of elimination (i.e., an exclusion rule) to identify an unknown entity. These findings have implications for the development of emotion recognition and children's ability to perceive and communicate pride. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
This study investigates emotional display rules for seven basic emotions. The main goal was to compare emotional display rules of Canadians, US Americans, and Japanese across as well as within cultures regarding the specific emotion, the type of interaction partner, and gender. A total of 835 university students participated in the study. The results indicate that Japanese display rules permit the expression of powerful emotions (anger, contempt, and disgust) significantly less than those of the two North American samples. Japanese also think that they should express positive emotions (happiness, surprise) significantly less than the Canadian sample. Furthermore, only for powerful emotions did the Japanese vary their display rules across different interaction partners more than the two North American samples did. Gender differences were similar across all three cultural groups. Men expressed powerful emotions more than women, and women expressed powerless emotions (sadness, fear) and happiness more than men. Depending on the type of emotion and interaction partner, some shared display rules occurred across culture and gender. The implications of these findings are discussed in relation to cultural dimensions and other cultural characteristics. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Guided by appraisal-based models of the influence of emotion upon judgment, we propose that disgust moralizes—that is, amplifies the moral significance of—protecting the purity of the body and soul. Three studies documented that state and trait disgust, but not other negative emotions, moralize the purity moral domain but not the moral domains of justice or harm/care. In Study 1, integral feelings of disgust, but not integral anger, predicted stronger moral condemnation of behaviors violating purity. In Study 2, experimentally induced disgust, compared with induced sadness, increased condemnation of behaviors violating purity and increased approval of behaviors upholding purity. In Study 3, trait disgust, but not trait anger or trait fear, predicted stronger condemnation of purity violations and greater approval of behaviors upholding purity. We found that, confirming the domain specificity of the disgust–purity association, disgust was unrelated to moral judgments about justice (Studies 1 and 2) or harm/care (Study 3). Finally, across studies, individuals of lower socioeconomic status (SES) were more likely than individuals of higher SES to moralize purity but not justice or harm/care. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
