Similar Documents
 A total of 20 similar documents were retrieved.
1.
Little research has focused on children's decoding of emotional meaning in expressive body movement; none has considered which movement cues children use to detect emotional meaning. The current study investigated the general ability to decode happiness, sadness, anger, and fear in dance forms of expressive body movement and the specific ability to detect differences in the intensity of anger and happiness when the relative amount of movement cue specifying each emotion was systematically varied. Four-year-olds (n = 25), 5-year-olds (n = 25), 8-year-olds (n = 29), and adults (n = 24) completed an emotion contrast task and 2 emotion intensity tasks. Decoding ability exceeding chance levels was demonstrated for sadness by 4-year-olds; for sadness, fear, and happiness by 5-year-olds; and for all emotions by 8-year-olds and adults. Children as young as 5 years were shown to rely on emotion-specific movement cues in their decoding of anger and happiness intensity. The theoretical significance of these effects across development is discussed.

2.
In the present research, we test the assumption that emotional mimicry and contagion are moderated by group membership. We report two studies using facial electromyography (EMG; Study 1), Facial Action Coding System (FACS; Study 2), and self-reported emotions (Study 2) as dependent measures. As predicted, both studies show that ingroup anger and fear displays were mimicked to a greater extent than outgroup displays of these emotions. The self-report data in Study 2 further showed specific divergent reactions to outgroup anger and fear displays. Outgroup anger evoked fear, and outgroup fear evoked aversion. Interestingly, mimicry increased liking for ingroup models but not for outgroup models. The findings are discussed in terms of the social functions of emotions in group contexts. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

3.
Little research has focused on children's decoding of emotional meaning in expressive body movement; none has considered which movement cues children use to detect emotional meaning. The current study investigated the general ability to decode happiness, sadness, anger, and fear in dance forms of expressive body movement and the specific ability to detect differences in the intensity of anger and happiness when the relative amount of movement cue specifying each emotion was systematically varied. Four-year-olds (n = 25), 5-year-olds (n = 25), 8-year-olds (n = 29), and adults (n = 24) completed an emotion contrast task and 2 emotion intensity tasks. Decoding ability exceeding chance levels was demonstrated for sadness by 4-year-olds; for sadness, fear, and happiness by 5-year-olds; and for all emotions by 8-year-olds and adults. Children as young as 5 years were shown to rely on emotion-specific movement cues in their decoding of anger and happiness intensity. The theoretical significance of these effects across development is discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
We studied the effects of lorazepam, a benzodiazepine, on differentiated emotions in healthy volunteers. In order to induce differentiated emotions, film excerpts were selected on the basis of the type of emotion they induced (fear, anger, and an affectively neutral film). For 6 days (D1 to D6), ten healthy volunteers received lorazepam (1 mg bid) or placebo in a randomized cross-over double-blind trial. During each treatment period, emotional induction occurred on D4, D5 and D6. One film excerpt (fear, anger or neutral) was presented each morning after relaxation. Evaluation was performed before and after each emotional induction and included questionnaires (Differential Emotions Scale and physical activation visual analog scales), and neurophysiological parameters (systolic and diastolic blood pressure, heart rate and norepinephrine levels). Globally, the film excerpts induced the predicted emotions. An analysis of variance was undertaken and revealed a significant effect of lorazepam versus placebo. On the Differential Emotions Scale and during fear induction, lorazepam induced a significantly higher increase in fear, anxiety and disgust emotions than placebo, whereas no effect was observed after anger induction. Lorazepam also induced a significantly higher increase in diastolic and systolic blood pressure and in the physical activation items ("tears" and "faster breathing"), with no change in heart rate and no significant change in norepinephrine. In conclusion, our results are consistent with an overall increase in emotional reactivity with lorazepam (1 mg bid) as compared to placebo. The pertinence of film-induced differentiated emotions has to be confirmed for clinical pharmacological use.

5.
The goals of this study were (a) to examine differing views on the relationship between self-report of emotion and physiological expression of emotion, (b) to differentiate between negative emotional contexts during imagery using facial electromyogram (EMG), and (c) to describe the facial muscle patterning and autonomic physiology of situations that involve expelling or avoiding disgusting sensory stimulation. Fifty subjects imagined situations eliciting disgust, anger, pleasure, and joy in 8-s trials using a tone-cued imagery procedure. Heart rate, skin conductance level, and facial EMG at the corrugator, zygomatic, and levator labii superioris/alaeque nasi muscle regions were recorded during imagery, and self-reports of emotion were collected after imagery trials. Self-reports of emotion produced results consistent with the affective categorization of the images. Activity at the levator labii region was higher during disgust than during anger imagery. Corrugator region increase characterized the negative as compared with the positive emotional contents, and activity at the zygomatic region was higher during joy imagery than during the other three emotions. Heart rate acceleration was greater during disgust, anger, and joy imagery than during pleasant imagery. Disgust imagery could be discriminated from anger imagery using facial EMG, and the expressive physiology of disgust was occasioned by the action set of active avoidance or rejection of sensory stimulation.

6.
Examined the influence of nondiscrepancy and discrepancy between situational and expressive cues on children's emotion recognition. Videotaped episodes in which actors portrayed emotions were presented to 96 4–8 yr old girls. When cues were nondiscrepant, Ss were better at all ages in recognizing happiness, fear, sadness, anger, and disgust than shame and contempt. This was interpreted as reflecting differences in complexity of emotions. When cues were discrepant, Ss preferred cues depicting the most recognizable emotion—that is, cues of simple emotions in preference to cues of complex emotions, and emotional cues in preference to neutral cues. Contrary to expectations, Ss did not rely on more salient cues in preference to less salient cues. Ss' responses to questions regarding the perceived cues reflected a developmental trend from noticing only 1 type of cue to considering both situational and expressive cues. This is interpreted as reflecting a development from centration to decentration. (27 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
This study compared young and older adults’ ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
In an experiment with 20 undergraduates, video recordings of actors' faces covered with black makeup and white spots were played back to the Ss so that only the white spots were visible. The results demonstrate that moving displays of happiness, sadness, fear, surprise, anger, and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicates that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of the 6 emotions was also investigated using normally illuminated and spots-only displays. In both instances, the results indicate that different facial regions are more informative for different emotions. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
This study investigated the hypothesis that different emotions are most effectively conveyed through specific, nonverbal channels of communication: body, face, and touch. Experiment 1 assessed the production of emotion displays. Participants generated nonverbal displays of 11 emotions, with and without channel restrictions. For both actual production and stated preferences, participants favored the body for embarrassment, guilt, pride, and shame; the face for anger, disgust, fear, happiness, and sadness; and touch for love and sympathy. When restricted to a single channel, participants were most confident about their communication when production was limited to the emotion's preferred channel. Experiment 2 examined the reception or identification of emotion displays. Participants viewed videos of emotions communicated in unrestricted and restricted conditions and identified the communicated emotions. Emotion identification in restricted conditions was most accurate when participants viewed emotions displayed via the emotion's preferred channel. This study provides converging evidence that some emotions are communicated predominantly through different nonverbal channels. Further analysis of these channel-emotion correspondences suggests that the social function of an emotion predicts its primary channel: The body channel promotes social-status emotions, the face channel supports survival emotions, and touch supports intimate emotions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

10.
High- and low-trait socially anxious individuals classified the emotional expressions of photographic quality continua of interpolated ("morphed") facial images that were derived from combining 6 basic prototype emotional expressions to various degrees, with the 2 adjacent emotions arranged in an emotion hexagon. When fear was 1 of the 2 component emotions, the high-trait group displayed enhanced sensitivity for fear. In a 2nd experiment where a mood manipulation was incorporated, again, the high-trait group exhibited enhanced sensitivity for fear. The low-trait group was sensitive for happiness in the control condition. The mood-manipulated group had increased sensitivity for anger expressions, and trait anxiety did not moderate these effects. Interpretations of the results related to the classification of fearful expressions are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Emotion theorists assume certain facial displays to convey information about the expresser's emotional state. In contrast, behavioral ecologists assume them to indicate behavioral intentions or action requests. To test these contrasting positions, over 2,000 online participants were presented with facial expressions and asked what they revealed--feeling states, behavioral intentions, or action requests. The majority of the observers chose feeling states as the message of facial expressions of disgust, fear, sadness, happiness, and surprise, supporting the emotions view. Only the anger display tended to elicit more choices of behavioral intention or action request, partially supporting the behavioral ecology view. The results support the view that facial expressions communicate emotions, with emotions being multicomponential phenomena that comprise feelings, intentions, and wishes. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Two studies test the assertion that anger, sadness, fear, pride, and happiness are typically narrated in different ways. Everyday events eliciting these 5 emotions were narrated by young women (Study 1) and 5- and 8-year-old girls (Study 2). Negative narratives were expected to engender more effort to process the event: to be longer, more grammatically complex, more likely to have a complication section, and to use more specific emotion labels rather than global evaluations. Narratives of Hogan’s (2003) juncture emotions, anger and fear, were expected to focus more on action and to contain more of the core narrative sections of orientation, complication, and resolution than narratives of the outcome emotions, sadness and happiness. Hypotheses were confirmed for adults except for syntactic complexity, whereas children showed only some of these differences. Hogan’s theory that juncture emotions are restricted to the complication section was not confirmed. Finally, in adults, indirect speech was more frequent in anger narratives and internal monologue in fear narratives. It is concluded that different emotions should be studied in how they are narrated, and that narratives should be analyzed according to qualitatively different emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Three experiments investigated the interpersonal effects of anger and happiness in negotiations. In the course of a computer-mediated negotiation, participants received information about the emotional state (anger, happiness, or none) of their opponent. Consistent with a strategic-choice perspective, Experiment 1 showed that participants conceded more to an angry opponent than to a happy one. Experiment 2 showed that this effect was caused by tracking--participants used the emotion information to infer the other's limit, and they adjusted their demands accordingly. However, this effect was absent when the other made large concessions. Experiment 3 examined the interplay between experienced and communicated emotion and showed that angry communications (unlike happy ones) induced fear and thereby mitigated the effect of the opponent's experienced emotion. These results suggest that negotiators are especially influenced by their opponent's emotions when they are motivated and able to consider them. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
This experiment examined the effects of two discrete negative emotions, fear and anger, on selective attention. A within-subjects design was used, and all participants (N = 98) experienced the control, anger, and fear conditions. During each condition, participants viewed a film clip eliciting the target emotion and subsequently completed a flanker task and emotion report. Selective attention costs were assessed by comparing reaction times (RTs) on congruent (baseline) trials with RTs on incongruent trials. There was a significant interaction between emotion condition (control, anger, fear) and flanker type (congruent, incongruent). Contrasts further revealed a significant interaction between emotion and flanker type when comparing RTs in the control and fear conditions, and a marginally significant interaction when comparing RTs in the control and anger conditions. This indicates that selective attention costs were significantly lower in the fear condition than in the control condition and marginally lower in the anger condition than in the control condition. Further analysis of participants reporting heightened anger in the anger condition revealed significantly lower selective attention costs during anger compared with a control state. These findings support the general prediction that high-arousal negative emotional states inhibit processing of nontarget information and enhance selective attention. This study is the first to show an enhancing effect of anger on selective attention. It also offers convergent evidence to studies that have previously shown an influence of fear on attentional focus using the global-local paradigm. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
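A minimal sketch, not taken from the study above, of the selective attention cost it describes: the difference between mean incongruent and mean congruent reaction times, per participant and emotion condition. The trial layout, example values, and function name are illustrative assumptions only.

```python
# Sketch: flanker-task selective attention cost per participant and condition,
# assuming each trial is recorded as (participant, condition, flanker_type, rt_ms).
from collections import defaultdict
from statistics import mean

trials = [
    # (participant, condition, flanker_type, rt_ms) -- illustrative values only
    (1, "fear", "congruent", 412.0),
    (1, "fear", "incongruent", 455.0),
    (1, "control", "congruent", 420.0),
    (1, "control", "incongruent", 498.0),
]

def selective_attention_costs(trials):
    """Return {(participant, condition): mean incongruent RT - mean congruent RT}."""
    rts = defaultdict(list)
    for participant, condition, flanker_type, rt in trials:
        rts[(participant, condition, flanker_type)].append(rt)
    costs = {}
    for (participant, condition, flanker_type), values in rts.items():
        if flanker_type != "incongruent":
            continue
        congruent = rts.get((participant, condition, "congruent"))
        if congruent:
            costs[(participant, condition)] = mean(values) - mean(congruent)
    return costs

print(selective_attention_costs(trials))
# e.g. {(1, 'fear'): 43.0, (1, 'control'): 78.0} -- a lower cost in fear than in
# control, the direction of difference reported in the abstract.
```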

15.
Two studies examined the hypothesis that geometric patterns in the facial expressions of anger and happiness provide information that permits observers to recognize the meaning of threat and warmth. A 1st study sought to isolate the configural properties by examining whether large-scale body movements encode affect-related meanings in similar ways. Results indicated that diagonal and angular body patterns convey threat, whereas round body patterns convey warmth. In a 2nd study, a set of 3 experiments using models of simple geometric patterns revealed that acute angles with downward-pointing vertices conveyed the meaning of threat and that roundedness conveyed the meaning of warmth. Human facial features exhibit these same geometric properties in displays of anger and happiness. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
4-yr-olds viewed slides of children designed to elicit emotions of fear, sadness, anger, and happiness in the Ss. Cooperative behavior was then assessed by having Ss play a game in dyads and by observing their behavior in nursery school. No differences in total empathy or in individual emotions were obtained between cooperative and noncooperative children. Girls obtained higher empathy scores than boys. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Three experiments examined the impact of incidental emotions on implicit intergroup evaluations. Experiment 1 demonstrated that for unknown social groups, two negative emotions that are broadly applicable to intergroup conflict (anger and disgust) both created implicit bias where none had existed before. However, for known groups about which perceivers had prior knowledge, emotions increased implicit prejudice only if the induced emotion was applicable to the outgroup stereotype. Disgust increased bias against disgust-relevant groups (e.g., homosexuals) but anger did not (Experiment 2); anger increased bias against anger-relevant groups (e.g., Arabs) but disgust did not (Experiment 3). Consistent with functional theories of emotion, these findings suggest that negative intergroup emotions signal specific types of threat. If the emotion-specific threat is applicable to prior expectations of a group, the emotion ratchets up implicit prejudice toward that group. However, if the emotion-specific threat is not applicable to the target group, evaluations remain unchanged. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Investigated the degree to which 4–5 yr olds (n = 48) can enact expressions of emotion recognizable by peers and adults; the study also examined whether accuracy of recognition was a function of age and whether the expression was posed or spontaneous. Adults (n = 103) were much more accurate than children in recognizing neutral states, slightly more accurate in recognizing happiness and anger, and equally accurate in recognizing sadness. Children's spontaneous displays of happiness were more recognizable than posed displays, but for other emotions there was no difference between the recognizability of posed and spontaneous expressions. Children were highly accurate in identifying the facial expressions of happiness, sadness, and anger displayed by their peers. Sex and ethnicity of the child whose emotion was displayed interacted to influence only adults' recognition of anger. Results are discussed in terms of the social learning and cognitive developmental factors influencing (a) adults' and children's decoding (recognition) of emotional expressions in young children and (b) encoding (posing) of emotional expressions by young children. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Studies of emotion signaling inform claims about the taxonomic structure, evolutionary origins, and physiological correlates of emotions. Emotion vocalization research has tended to focus on a limited set of emotions: anger, disgust, fear, sadness, surprise, happiness, and, for the voice, also tenderness. Here, we examine how well brief vocal bursts can communicate 22 different emotions: 9 negative (Study 1) and 13 positive (Study 2), and whether prototypical vocal bursts convey emotions more reliably than heterogeneous vocal bursts (Study 3). Results show that vocal bursts communicate emotions like anger, fear, and sadness, as well as seldom-studied states like awe, compassion, interest, and embarrassment. Ancillary analyses reveal family-wise patterns of vocal burst expression. Errors in classification were more common within emotion families (e.g., ‘self-conscious,’ ‘pro-social’) than between emotion families. The three studies reported highlight the voice as a rich modality for emotion display that can inform fundamental constructs about emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
