Similar Literature (20 records)
1.
Examined whether spontaneous facial expressions provide observers with sufficient information to distinguish accurately which of 7 affective states (6 emotional and 1 neutral) is being experienced by another person. Six undergraduate senders' facial expressions were covertly videotaped as they watched emotionally loaded slides. After each slide, senders nominated the emotion term that best described their affective reaction and also rated the pleasantness and strength of that reaction. Similar nominations of emotion terms and ratings were later made by 53 undergraduate receivers who viewed the senders' videotaped facial expressions. The central measure of communication accuracy was the match between senders' and receivers' emotion nominations. Overall accuracy was significantly greater than chance, although it was not impressive in absolute terms. Only happy, angry, and disgusted expressions were recognized at above-chance rates, whereas surprised expressions were recognized at rates significantly worse than chance. Female Ss were significantly better senders than were male Ss. Although neither sex was found to be better at receiving facial expressions, female Ss were better receivers of female senders' expressions than of male senders' expressions. Female senders' neutral and surprised expressions were more accurately recognized than were those of male senders. The only sex difference found for decoding emotions was a tendency for male Ss to be more accurate at recognizing anger. (25 ref) (PsycINFO Database Record (c) 2011 APA, all rights reserved)
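A minimal sketch of the scoring logic this abstract describes: with 7 response options, chance-level matching is 1/7, and the overall hit rate can be tested against that rate with an exact binomial test. All names and data below are hypothetical illustrations, not the study's materials (Python):

    # Hypothetical illustration of sender-receiver matching accuracy vs. chance.
    from math import comb

    LABELS = ["happy", "sad", "angry", "disgusted", "surprised", "afraid", "neutral"]
    CHANCE = 1 / len(LABELS)  # 7 response options -> chance rate of 1/7

    # Invented nominations for a handful of trials (not the study's data).
    sender   = ["happy", "angry", "disgusted", "surprised", "sad"]
    receiver = ["happy", "angry", "disgusted", "afraid", "sad"]

    hits = sum(s == r for s, r in zip(sender, receiver))
    n = len(sender)

    def binom_p_at_least(k, n, p):
        """Exact one-sided binomial probability P(X >= k) under chance rate p."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(f"accuracy = {hits / n:.2f}, chance = {CHANCE:.2f}")
    print(f"one-sided p (accuracy > chance) = {binom_p_at_least(hits, n, CHANCE):.4f}")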

2.
50 undergraduate receivers viewed videotapes of the facial/gestural responses of male and female senders to emotionally loaded slides and indicated via an event recorder when they felt that meaningful events occurred. Substantial agreement on the location of meaningful events was demonstrated, with strikingly different unitization patterns emerging for female vs male senders; females showed more overall nonverbal and meaningful behaviors, and meaningful behavior was more likely to involve facial expressions among females than among males. Receivers with previous experience with the videotape marked more points as meaningful than did less experienced receivers, and measures of the quality of segmentation were related to receiving ability among female receivers but not among males. Events in the slide period, when the sender viewed the slide silently, were related to the sending accuracy of the sequence, whereas events in the talk period, when the sender described his/her response to the slide, were not. This was the case even though more activity and facial expressions occurred during the talk period. (35 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Videotapes of spontaneous facial and gestural reactions to affective slides were segmented by university student observers using a group adaptation of D. Newtson's (1976) unitization technique. In Exp I, 46 females and 35 males segmented the expressions of a variety of children and adults; in Exp II, 50 males segmented the expressions of high- vs low-expressive male and female senders. Results demonstrate that the unitization technique applied to emotion expression yields reliable patterns of segmentation that may be used to investigate the relationships of communication accuracy to both the nature of the sender's behavior and the attentional patterns of the receiver. In particular, differences in segmentation results for males and females suggest that the technique may allow the detailed examination of process-level gender differences in nonverbal sending accuracy and receiving ability. (13 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
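The abstracts above only name the group unitization technique. As a toy sketch, agreement on the location of "meaningful events" can be quantified by binning observers' button presses in time and flagging bins that a threshold fraction of observers marked. The bin width, consensus threshold, and all press times below are assumptions for illustration, not Newtson's procedure or these studies' values (Python):

    # Toy sketch of group unitization scoring: presses binned in time,
    # bins marked by enough observers treated as consensual breakpoints.
    from collections import Counter

    BIN_WIDTH = 1.0   # seconds per bin (assumed)
    THRESHOLD = 0.5   # fraction of observers required for consensus (assumed)

    observer_presses = [            # invented press times (s), one list per observer
        [2.1, 5.8, 9.4],
        [2.3, 6.1, 12.0],
        [2.0, 5.9, 9.6, 12.2],
    ]

    votes = Counter()
    for presses in observer_presses:
        for b in {int(t // BIN_WIDTH) for t in presses}:  # one vote per observer per bin
            votes[b] += 1

    n_obs = len(observer_presses)
    consensus = sorted(b for b, v in votes.items() if v / n_obs >= THRESHOLD)
    print("consensual breakpoint bins (s):",
          [(b * BIN_WIDTH, (b + 1) * BIN_WIDTH) for b in consensus])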

5.
The relations of parents' warmth, emotional expressivity, and discussion of emotion to 2nd–5th graders' regulation of emotional expressivity, externalizing problem behaviors, and expressivity were examined. Parents' and children's facial expressions to evocative slides were observed, as was parents' discussion of the slides, and parents and teachers provided information on children's regulation of expressivity and problem behavior. Analyses supported the hypothesis that the effect of parental variables on children's problem behavior was at least partly indirect, operating through children's regulation of emotion. Children's low negative (versus positive) facial expressivity to negative slides was associated with problem behavior for boys. A reversed model did not support the possibility that children's functioning had causal effects on parenting. The findings suggest that parents' emotion-related behaviors are linked to children's regulation of expressivity and externalizing behaviors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Studied the effect of maternal facial expressions of emotion on 108 12-mo-old infants in 4 studies. The deep side of a visual cliff was adjusted to a height that produced no clear avoidance and much referencing of the mother. In Study 1, 19 Ss viewed a facial expression of joy, while 17 Ss viewed one of fear. In Study 2, 15 Ss viewed interest, while 18 Ss viewed anger. In Study 3, 19 Ss viewed sadness. In Study 4, 23 Ss were used to determine whether the expressions influenced Ss' evaluation of an ambiguous situation or whether they were effective in controlling behavior merely because of their discrepancy or unexpectedness. Results show that Ss used facial expressions to disambiguate situations. If a mother posed joy or interest while S referenced, most Ss crossed the deep side. If a mother posed fear or anger, few Ss crossed. In the absence of any depth whatsoever, few Ss referenced the mother, and those who did so while the mother was posing fear hesitated but crossed nonetheless. It is suggested that facial expressions regulate behavior most clearly in contexts of uncertainty. (17 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Studied the extent to which communication channel affects judgments of the type and authenticity of emotions. 80 university students (mean age 21.5 yrs) were presented with short audio, video, and audiovideo excerpts of actors expressing specific emotions. In some cases, the emotion was actually experienced by the actor; in other cases, the emotion was simulated. Ss were distributed over 8 communication channel conditions (i.e., facial, audio, filtered audio, gestural + facial, facial + filtered audio, facial + audio, gestural + facial + filtered audio, and gestural + facial + audio) and asked to judge the emotional category (i.e., happiness, fear, anger, surprise, and sadness) and the authenticity of the emotion. The accuracy of the judgments was analyzed in relation to the type of channel and the type of emotion. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The contributions to the recognition of emotional signals of (a) experience and learning versus (b) internal predispositions are difficult to investigate because children are virtually always exposed to complex emotional experiences from birth. The recognition of emotion among physically abused and physically neglected preschoolers was assessed in order to examine the effects of atypical experience on emotional development. In Experiment 1, children matched a facial expression to an emotional situation. Neglected children had more difficulty discriminating emotional expressions than did control or physically abused children. Physically abused children displayed a response bias for angry facial expressions. In Experiment 2, children rated the similarity of facial expressions. Control children viewed discrete emotions as dissimilar, neglected children saw fewer distinctions between emotions, and physically abused children showed the most variance across emotions. These results suggest that to the extent that children's experience with the world varies, so too will their interpretation and understanding of emotional signals. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
It has been proposed that self-face representations are involved in interpreting facial emotions of others. We experimentally primed participants' self-face representations. In Study 1, we assessed eye tracking patterns and performance on a facial emotion discrimination task, and in Study 2, we assessed emotion ratings between self and nonself groups. Results show that experimental priming of self-face representations increases visual exploration of faces, facilitates the speed of facial expression processing, and increases the emotional distance between expressions. These findings suggest that the ability to interpret facial expressions of others is intimately associated with the representations we have of our own faces. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

10.
Dominance and submission constitute fundamentally different social interaction strategies that may be enacted most effectively to the extent that the emotions of others are relatively ignored (dominance) versus noticed (submission). On the basis of such considerations, we hypothesized a systematic relationship between chronic tendencies toward high versus low levels of interpersonal dominance and emotion decoding accuracy in objective tasks. In two studies (total N = 232), interpersonally dominant individuals exhibited poorer levels of emotion recognition in response to audio and video clips (Study 1) and facial expressions of emotion (Study 2). The results provide a novel perspective on interpersonal dominance, suggest its strategic nature (Study 2), and are discussed in relation to Fiske's (1993) social–cognitive theory of power. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
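The reported association is directional (higher dominance, lower decoding accuracy) and could be summarized, in its simplest form, as a Pearson correlation between a dominance score and decoding accuracy. The scores below are invented for illustration, not the studies' data (Python; statistics.correlation requires 3.10+):

    # Invented illustration of a negative dominance-accuracy association.
    from statistics import correlation  # Python 3.10+

    dominance = [1.2, 2.5, 3.1, 3.8, 4.6, 5.0]        # hypothetical trait scores
    accuracy  = [0.82, 0.79, 0.71, 0.69, 0.62, 0.58]  # hypothetical decoding accuracy
    print(f"Pearson r = {correlation(dominance, accuracy):.2f}")  # negative, as reported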

11.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
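The abstract's claim that emotions share multiple facial actions can be made concrete by representing each emotion's expression as a set of FACS action units (AUs) and computing pairwise overlap. The AU sets below are illustrative placeholders, not this study's codings (Python):

    # Toy sketch: expressions as sets of FACS action units (AUs), with
    # Jaccard overlap as a crude index of shared facial actions between
    # emotions. AU sets are placeholders, not this study's data.
    from itertools import combinations

    expressions = {
        "anger":   {4, 5, 7, 23},
        "fear":    {1, 2, 4, 5, 20},
        "sadness": {1, 4, 15},
        "joy":     {6, 12},
        "relief":  {6, 12, 43},  # hypothetical: shares AUs with joy
    }

    def jaccard(a, b):
        """Shared AUs as a fraction of all AUs used by either expression."""
        return len(a & b) / len(a | b)

    for (e1, a1), (e2, a2) in combinations(expressions.items(), 2):
        print(f"{e1:>7} vs {e2:<7}: overlap = {jaccard(a1, a2):.2f}")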

13.
In a pilot study, 16 undergraduates were exposed briefly to slides and tones that were mildly to moderately evocative of positive and negative affect. Facial electromyographic (EMG) activity differentiated both the valence and intensity of the affective reaction. Moreover, 8 independent judges were unable to determine from viewing videotapes of the Ss' facial displays whether a positive or negative stimulus had been presented or whether a mildly or moderately intense stimulus had been presented. In the main experiment, 28 Ss briefly viewed slides of scenes that were mildly to moderately evocative of positive and negative affect. Again, EMG activity over the brow, eye, and cheek muscle regions differentiated the pleasantness and intensity of Ss' affective reactions to the visual stimuli even though visual inspection of the videotapes again indicated that expressions of emotion were not apparent. Results suggest that gradients of EMG activity over the muscles of facial expression can provide objective and continuous probes of affective processes that are too subtle or fleeting to evoke expressions observable under normal conditions of social interaction. (76 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
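As a sketch of the kind of gradient this abstract describes (brow activity tracking negative affect, cheek activity tracking positive affect), mean rectified EMG can be tabulated per muscle site and slide valence. The numbers below are fabricated for illustration, not the study's recordings (Python):

    # Fabricated illustration of site-by-valence EMG gradients.
    from statistics import mean

    # (muscle site, slide valence) -> per-trial EMG amplitudes (arbitrary units)
    trials = {
        ("brow",  "negative"): [4.2, 3.9, 4.5],
        ("brow",  "positive"): [1.1, 1.3, 0.9],
        ("cheek", "negative"): [1.0, 1.2, 0.8],
        ("cheek", "positive"): [3.8, 4.1, 3.6],
    }

    for (site, valence), amps in sorted(trials.items()):
        print(f"{site:>5} / {valence:<8}: mean EMG = {mean(amps):.2f}")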

14.
Two studies assessed recognition memory for interpersonal traits that subjects had rated according to either private self-reference (Study 1) or public self-reference (Study 2). Both studies also administered the Self-Consciousness Scale, which permitted a dual classification of subjects according to private self-consciousness (high and low) and public self-consciousness (high and low). Study 1 revealed a private false alarms effect (FAE), the strength of which was moderated by private self-consciousness, whereas Study 2 revealed a public FAE, the strength of which was moderated by public self-consciousness. From the convergent and discriminant evidence, two hypotheses received support: (a) individuals articulate both private and public components of the self-schema, and (b) private self-consciousness predicts the extent to which individuals articulate the private component, whereas public self-consciousness predicts the extent to which individuals articulate the public component. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

16.
Amygdala damage can result in impairments in evaluating facial expressions largely specific to fear. In contrast, right-hemisphere cortical lesions result in a more global deficit in facial emotion evaluation. This study addressed these 2 contrasting findings by investigating amygdala and adjacent cortical contributions to the evaluation of facial emotion in 12 patients with right and 11 patients with left unilateral anteromedial temporal lobectomy (RTL and LTL, respectively) and 23 normal controls. RTL but not LTL patients revealed impaired intensity ratings that included but were not exclusive to fear, with the most severe deficits confined to expressions related to affective states of withdrawal-avoidance. This suggests that affective hemispheric specializations in cortical function may extend to subcortical limbic regions. In addition, the right amygdala and adjacent cortex may be part of a neural circuit representing facial expressions of withdrawal. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
In 4 experiments, 48 normal and 48 emotionally maladjusted boys in 2 age groups (7–8 and 10–11 yrs) were questioned about the link between emotion and memory, using facial drawings depicting emotional expressions. Ss in all 4 groups knew (a) that emotion gradually declines in intensity once the episode provoking the emotion is over, (b) that variation between people in the intensity of their emotional reaction to an episode will persist despite any decline in intensity over time, and (c) that an episode will be more or less memorable depending on whether or not it arouses emotion. The relative sophistication of Ss' knowledge about links between emotion and memory is contrasted with their ignorance regarding the voluntary control strategies that can be brought to bear on the display and experience of emotion. (11 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on and extends this literature. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion–identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Two experiments were conducted in which participants looked at photographs (Experiment 1, n = 129) or slides (Experiment 2, n = 90) of people engaging in positive or negative facial expressions. Participants attempted to communicate these facial expressions as accurately as they could to a video camera while viewing themselves in a mirror or without viewing themselves in a mirror. Participants in a control group maintained neutral facial expressions. Participants experienced increased positive moods when they engaged in positive facial expressions and decreased positive moods when they engaged in negative facial expressions. These effects were enhanced when participants viewed themselves in a mirror. The effects of facial expressions on positive affect were stronger for participants with high private self-consciousness. Results were integrated with research identifying individuals who are responsive to self-produced versus situational cues and with theory and research on self-awareness. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Compared the affective responsiveness of dieters and nondieters. 47 male college students rated the emotional impact of projected slides in a situation similar to that used by P. Pliner et al (see record 1974-27296-001) with obese and normal Ss. The present findings show that dieters, like the obese, were more extreme emotional responders. When Ss were given an internal source of arousal (i.e., caffeine), nondieters became more emotional and dieters became less emotional. These results are discussed in terms of S. Schachter's (1971) "externality" model of obesity and S. Schachter and J. E. Singer's "external–internal" theory of emotion. (23 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
