Similar articles
 20 similar articles found (search time: 109 ms)
1.
In earlier work, the authors analyzed emotion portrayals by professional actors separately for facial expression, vocal expression, gestures, and body movements. In a secondary analysis of the combined data set for all these modalities, the authors now examine to what extent actors use prototypical multimodal configurations of expressive actions to portray different emotions, as predicted by basic emotion theories claiming that expressions are produced by fixed neuromotor affect programs. Although several coherent unimodal clusters are identified, the results show only 3 multimodal clusters: agitation, resignation, and joyful surprise, with only the latter being specific to a particular emotion. Finding variable expressions rather than prototypical patterns seems consistent with the notion that emotional expression is differentially driven by the results of sequential appraisal checks, as postulated by componential appraisal theories. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
The authors investigated the understanding of emotion dissimulation in school-age children. Sixty participants were read short stories in which a main character expressed an emotion or hid an emotion from other characters. The participants were asked to identify the emotion felt by the main characters and to indicate the facial expressions they would display. Then they were asked what emotions the main characters felt while they were displaying these expressions, and what the beliefs of the other story characters would be as to the emotion felt by the main characters. The results revealed that children from 5 to 6 years of age have a partial understanding of emotion dissimulation. They were accurate in finding the emotion felt by the main characters when questioned the first time. They were also accurate in choosing the expressions the main characters would display to hide their emotions. However, they were often inaccurate as to the felt emotions of the main characters when questioned the second time. Compared with 9- and 10-year-olds, the younger children had more difficulty understanding the simultaneous character of felt and displayed emotions. Five- and 6-year-olds were also less accurate than the older children when asked to indicate the beliefs of the other characters in stories where felt emotions were hidden. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
4.
5.
One reason for the universal appeal of music lies in the emotional rewards that music offers to its listeners. But what makes these rewards so special? The authors addressed this question by progressively characterizing music-induced emotions in 4 interrelated studies. Studies 1 and 2 (n = 354) were conducted to compile a list of music-relevant emotion terms and to study the frequency of both felt and perceived emotions across 5 groups of listeners with distinct music preferences. Emotional responses varied greatly according to musical genre and type of response (felt vs. perceived). Study 3 (n = 801), a field study carried out during a music festival, examined the structure of music-induced emotions via confirmatory factor analysis of emotion ratings, resulting in a 9-factorial model of music-induced emotions. Study 4 (n = 238) replicated this model and found that it accounted for music-elicited emotions better than the basic emotion and dimensional emotion models. A domain-specific device to measure musically induced emotions is introduced: the Geneva Emotional Music Scale. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Studied the extent to which communication channel affects judgments of the type and authenticity of emotions. 80 university students (mean age 21.5 yrs) were presented with short audio, video, and audiovideo excerpts of actors expressing specific emotions. In some cases, the emotion was actually experienced by the actor; in other cases, the emotion was simulated. Ss were distributed over 8 communication channel conditions (i.e., facial, audio, filtered audio, gestural + facial, facial + filtered audio, facial + audio, gestural + facial + filtered audio, and gestural + facial + audio) and asked to judge the emotional category (i.e., happiness, fear, anger, surprise, and sadness) and the authenticity of the emotion. The accuracy of the judgments was analyzed in relation to the type of channel and the type of emotion. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
This investigation represents a multimodal study of age-related differences in experienced and expressed affect and in emotion regulatory skills in a sample of young, middle-aged, and older adults (N = 96), testing formulations derived from differential emotions theory. The experimental session consisted of a 10-min anger induction and a 10-min sadness induction using a relived emotion task; participants were also randomly assigned to an inhibition or noninhibition condition. In addition to subjective ratings of emotional experience provided by participants, their facial behavior was coded using an objective facial affect coding system; a content analysis also was applied to the emotion narratives. Separate repeated measures analyses of variance applied to each emotion domain indicated age differences in the co-occurrence of negative emotions and co-occurrence of positive and negative emotions across domains, thus extending the finding of emotion heterogeneity or complexity in emotion experience to facial behavior and verbal narratives. The authors also found that the inhibition condition resulted in a different pattern of results in the older versus middle-aged and younger adults. The intensity and frequency of discrete emotions were similar across age groups, with a few exceptions. Overall, the findings were generally consistent with differential emotions theory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Metacognitive emotion regulation strategies involve deliberately changing thoughts or goals to alleviate negative emotions. Adults commonly engage in this type of emotion regulation, but little is known about the developmental roots of this ability. Two studies were designed to assess whether 5- and 6-year-old children can generate such strategies and, if so, the types of metacognitive strategies they use. In Study 1, children described how story protagonists could alleviate negative emotions. In Study 2, children recalled times that they personally had felt sad, angry, and scared and described how they had regulated their emotions. In contrast to research suggesting that young children cannot use metacognitive regulation strategies, the majority of children in both studies described such strategies. Children were surprisingly sophisticated in their suggestions for how to cope with negative emotions and tailored their regulatory responses to specific emotional situations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Studied issues pertinent to judgment studies in nonverbal communication and to the perception and attribution of emotions by investigating which behavioral cues are used in portraying various emotions and to what extent presentation and encoding differences between actors affect judgment accuracy. The nonverbal behaviors of 6 trained actors (3 male, 3 female) portraying 4 emotions (joy, sadness, anger, and surprise) were analyzed from a videotape. These portrayals were shown using 4 channels of presentation (audio-video, video only, audio only, and filtered audio) to groups of naive judges (college students) to study decoding. Results indicate that different nonverbal cues are used to portray the various emotions and that differences between channels and between actors strongly affect decoding accuracy. Overemphasis of behavioral cues characteristic of certain emotions was found to result in reduced decoding accuracy. (19 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
The different assumptions made by discrete and componential emotion theories about the nature of the facial expression of emotion and the underlying mechanisms are reviewed. Explicit and implicit predictions are derived from each model. It is argued that experimental expression-production paradigms rather than recognition studies are required to critically test these differential predictions. Data from a large-scale actor portrayal study are reported to demonstrate the utility of this approach. The frequencies with which 12 professional actors use major facial muscle actions individually and in combination to express 14 major emotions show little evidence for emotion-specific prototypical affect programs. Rather, the results encourage empirical investigation of componential emotion model predictions of dynamic configurations of appraisal-driven adaptive facial actions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
This article discusses the controversy over whether attribution (recognition) of emotions from facial expressions is universal (P. Ekman, 1994; C. E. Izard, 1994; J. A. Russell, 1994). Agreement emerged on various issues. There exists at least Minimal Universality (people everywhere can infer something about others from their facial behavior). Anger, sadness, and other semantic categories for emotion are not pancultural and are not the precise messages conveyed by facial expressions. Emotions can occur without facial expressions, and facial expressions can occur without emotions. Further evidence is needed to determine the relationship between emotion and facial behavior, what determines that relationship, how facial behavior is interpreted, and how much the interpretation varies with culture and language. Ekman's (1994) objections are answered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
It has been proposed that self-face representations are involved in interpreting facial emotions of others. We experimentally primed participants' self-face representations. In Study 1, we assessed eye tracking patterns and performance on a facial emotion discrimination task, and in Study 2, we assessed emotion ratings between self and nonself groups. Results show that experimental priming of self-face representations increases visual exploration of faces, facilitates the speed of facial expression processing, and increases the emotional distance between expressions. These findings suggest that the ability to interpret facial expressions of others is intimately associated with the representations we have of our own faces. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

16.
In this set of studies, we examine the perceptual similarities between emotions that share either a valence or a motivational direction. Determination is a positive approach-related emotion, whereas anger is a negative approach-related emotion. Thus, determination and anger share a motivational direction but are opposite in valence. An implemental mind-set has previously been shown to produce high-approach-motivated positive affect. Thus, in Study 1, participants were asked to freely report the strongest emotion they experienced during an implemental mind-set. The most common emotion reported was determination. On the basis of this result, we compared the facial expression of determination with that of anger. In Study 2, naive judges were asked to identify photographs of facial expressions intended to express determination, along with photographs intended to express basic emotions (joy, anger, sadness, fear, disgust, neutral). Correct identifications of intended determination expressions were correlated with misidentifications of the expressions as anger but not with misidentifications as any other emotion. This suggests that determination, a high-approach-motivated positive affect, is perceived as similar to anger. In Study 3, naive judges quantified the intensity of joy, anger, and determination expressed in photographs. The intensity of perceived determination was directly correlated with the intensity of perceived anger (a high-approach-motivated negative affect) and was inversely correlated with the intensity of perceived joy (a low-approach-motivated positive affect). These results demonstrate perceptual similarity between emotions that share a motivational direction but differ in valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
In Study 1, 160 individuals from kindergarten (kd), 3rd, 6th, 9th, and college grades were presented with story protagonists who facially expressed or did not express sadness/fear when encountering events that likely caused (relevant-inhibitory cause) or did not cause (irrelevant cause) the inhibition of the expression of emotion. In Study 2, 108 kd, 3rd-, and 6th-grade children viewed peers engaging in real interactions similar to the stories. In both studies, kindergartners judged that relevant-inhibitory causes decreased a peer's emotions. Older individuals displayed an understanding of the inhibition of emotional expression by ascribing greater emotion to the peer under relevant-inhibitory than irrelevant causal conditions. In Study 2, age differences in children's search for social information and prosocial behavior paralleled judgments of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
This study compared young and older adults' ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Dominance and submission constitute fundamentally different social interaction strategies that may be enacted most effectively to the extent that the emotions of others are relatively ignored (dominance) versus noticed (submission). On the basis of such considerations, we hypothesized a systematic relationship between chronic tendencies toward high versus low levels of interpersonal dominance and emotion decoding accuracy in objective tasks. In two studies (total N = 232), interpersonally dominant individuals exhibited poorer levels of emotion recognition in response to audio and video clips (Study 1) and facial expressions of emotion (Study 2). The results provide a novel perspective on interpersonal dominance, suggest its strategic nature (Study 2), and are discussed in relation to Fiske's (1993) social–cognitive theory of power. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

20.
Self-conscious emotions such as embarrassment and shame are associated with 2 aspects of theory of mind (ToM): (a) the ability to understand that behavior has social consequences in the eyes of others and (b) an understanding of social norms violations. The present study aimed to link ToM with the recognition of self-conscious emotion. Children with and without autism identified facial expressions of self-conscious and non-self-conscious emotions from photographs. ToM was also measured. Children with autism performed more poorly than comparison children at identifying self-conscious emotions, though they did not differ in the recognition of non-self-conscious emotions. When ToM ability was statistically controlled, group differences in the recognition of self-conscious emotion disappeared. Discussion focused on the links between ToM and self-conscious emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © 北京勤云科技发展有限公司　京ICP备09084417号