Similar Documents
20 similar documents found.
1.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
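To make the comparison against adult prototypes concrete: a coded face can be checked against an emotion's prototype as a set-membership test over FACS action units (AUs). The sketch below is a toy illustration, not the actual Ekman and Friesen criteria; the prototype AU sets are rough approximations.

```python
# Toy check of whether a coded set of FACS action units matches an adult
# prototype for an emotion. These prototype AU sets are rough illustrations,
# NOT the actual Ekman & Friesen scoring criteria.
ADULT_PROTOTYPES = {
    "anger":   {4, 5, 7, 23},
    "fear":    {1, 2, 4, 5, 20, 26},
    "sadness": {1, 4, 15},
    "disgust": {9, 15, 16},
}

def matches_prototype(coded_aus, emotion, prototypes=ADULT_PROTOTYPES):
    """True if every AU required by the prototype appears in the coded face."""
    return prototypes[emotion].issubset(coded_aus)

# Example: a hypothetical infant face coded with AUs 4 and 15 only.
print(matches_prototype({4, 15}, "sadness"))  # False: AU 1 is missing
```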

2.
Five studies investigated the young infant's ability to produce identifiable emotion expressions as defined in differential emotions theory. Trained judges applied emotion-specific criteria in selecting expression stimuli from videotape recordings of 54 1–9 mo old infants' responses to a variety of incentive events, ranging from playful interactions to the pain of inoculations. Four samples of untrained Ss (130 undergraduates and 62 female health service professionals) confirmed the social validity of infants' emotion expressions by reliably identifying expressions of interest, joy, surprise, sadness, anger, disgust, contempt, and fear. Brief training resulted in significant increases in the accuracy of discrimination of infants' negative emotion expressions for low-accuracy Ss. Construct validity for the 8 emotion expressions identified by untrained Ss and for a consistent pattern of facial responses to unanticipated pain was provided by expression identifications derived from an objective, theoretically structured, anatomically based facial movement coding system. (21 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
In this set of studies, we examine the perceptual similarities between emotions that share either a valence or a motivational direction. Determination is a positive approach-related emotion, whereas anger is a negative approach-related emotion. Thus, determination and anger share a motivational direction but are opposite in valence. An implemental mind-set has previously been shown to produce high-approach-motivated positive affect. Thus, in Study 1, participants were asked to freely report the strongest emotion they experienced during an implemental mind-set. The most common emotion reported was determination. On the basis of this result, we compared the facial expression of determination with that of anger. In Study 2, naive judges were asked to identify photographs of facial expressions intended to express determination, along with photographs intended to express basic emotions (joy, anger, sadness, fear, disgust, neutral). Correct identifications of intended determination expressions were correlated with misidentifications of the expressions as anger but not with misidentifications as any other emotion. This suggests that determination, a high-approach-motivated positive affect, is perceived as similar to anger. In Study 3, naive judges quantified the intensity of joy, anger, and determination expressed in photographs. The intensity of perceived determination was directly correlated with the intensity of perceived anger (a high-approach-motivated negative affect) and was inversely correlated with the intensity of perceived joy (a low-approach-motivated positive affect). These results demonstrate perceptual similarity between emotions that share a motivational direction but differ in valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
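The correlational logic of Study 2 can be sketched as follows, assuming hypothetical per-photograph identification rates; the data and variable names are illustrative, not from the original study.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-photograph rates (one row per determination photo):
# proportion of judges choosing "determination" (correct) and each misidentification.
rng = np.random.default_rng(0)
correct_determination = rng.uniform(0.3, 0.9, size=20)
# Simulate anger misidentifications that covary with correct identifications,
# and sadness misidentifications that do not.
misid_anger = 0.5 * correct_determination + rng.normal(0, 0.05, size=20)
misid_sadness = rng.uniform(0.0, 0.2, size=20)

r_anger, p_anger = pearsonr(correct_determination, misid_anger)
r_sadness, p_sadness = pearsonr(correct_determination, misid_sadness)
print(f"determination vs. anger misidentification:   r = {r_anger:.2f}, p = {p_anger:.3f}")
print(f"determination vs. sadness misidentification: r = {r_sadness:.2f}, p = {p_sadness:.3f}")
```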

4.
Gaze direction influences younger adults' perception of emotional expressions, with direct gaze enhancing the perception of anger and joy and averted gaze enhancing the perception of fear. Age-related declines in emotion recognition and eye-gaze processing have been reported, indicating that there may be age-related changes in the ability to integrate these facial cues. As there is evidence of a positivity bias with age, age-related difficulties integrating these cues may be greatest for negative emotions. The present research investigated age differences in the extent to which gaze direction influenced explicit perception (e.g., anger, fear and joy; Study 1) and social judgments (e.g., of approachability; Study 2) of emotion faces. Gaze direction did not influence the perception of fear in either age group. In both studies, age differences were found in the extent to which gaze direction influenced judgments of angry and joyful faces, with older adults showing less integration of gaze and emotion cues than younger adults. Age differences were greatest when interpreting angry expressions. Implications of these findings for older adults' social functioning are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Studied differences between dimensional and categorical judgments of static and dynamic spontaneous facial expressions of emotion. In the 1st part of the study, 25 university students were presented with either static or dynamic facial expressions of emotions (i.e., joy, fear, anger, surprise, disgust, and sadness) and asked to evaluate the similarity of 21 pairs of stimuli on a 7-point scale. Results were analyzed using a multidimensional scaling procedure. In the 2nd part of the study, Ss were asked to categorize the expressed emotions according to their intensity. Differences in the categorization of static and dynamic stimuli were analyzed. Results from the similarity rating task and the categorization task were compared. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
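The similarity-rating analysis follows a standard multidimensional scaling (MDS) workflow: symmetric pairwise ratings are converted to dissimilarities and embedded in a low-dimensional space. A minimal sketch, with a fabricated ratings matrix in place of the study's data:

```python
import numpy as np
from sklearn.manifold import MDS

emotions = ["joy", "fear", "anger", "surprise", "disgust", "sadness"]
n = len(emotions)

# Hypothetical mean similarity ratings on a 7-point scale (7 = identical).
rng = np.random.default_rng(1)
sim = rng.uniform(1, 6, size=(n, n))
sim = (sim + sim.T) / 2          # ratings of a pair are symmetric
np.fill_diagonal(sim, 7.0)       # each expression is maximally similar to itself

# Convert similarities to dissimilarities before scaling.
dissim = 7.0 - sim

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)
for label, (x, y) in zip(emotions, coords):
    print(f"{label:>8}: ({x:+.2f}, {y:+.2f})")
```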

7.
40 undergraduates viewed videotaped excerpts of happiness/reassurance, anger/threat, and fear/evasion expressive displays by US President Ronald Reagan. Within each display condition 1 excerpt was presented in image-only and 1 in sound-plus-image format. Emotional reactions were assessed by facial electromyography (EMG) from the brow and cheek regions, skin resistance, and heart rate. Following each excerpt, Ss also verbally reported the intensity of 8 emotions, including joy, interest, anger, and fear. Findings indicate that self-reported emotions were influenced strongly by the expressive displays, by prior attitude toward Reagan, and by media condition. Facial EMG indicated smiling during happiness/reassurance displays and frowning during anger/threat and fear/evasion displays, especially during image-only presentations. Display effects were also found for skin resistance responses when the media conditions were combined and for heart rate changes in the sound-plus-image condition. Results indicate that expressive displays had a direct emotional impact on viewers and that prior attitudes influenced retrospective self-reports of emotion. (46 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
This study examined facial expressions in relation to cognition in infants aged 2–8 months. Eighty infants, divided equally among 4 age groups (2, 4, 6, and 8 months), participated. Forty-eight Ss received an audiovisual stimulus contingent on arm movement, and 32 infants did not control the stimulus. Infant facial expressions during learning and extinction were coded using the Maximally Discriminative Facial Movement Coding System (MAX). Infants in the contingent group expressed greater interest and joy during learning and greater anger during extinction. There was a high concordance between arm pulling and the expression of anger during extinction, indicating that a brief exposure to extinction produces frustration-like changes in emotional responsivity. Individual differences existed in infant responses to frustration during extinction. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
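The reported concordance between arm pulling and anger during extinction can be illustrated as an interval-by-interval agreement computation on two binary codes; the codes below are fabricated for illustration.

```python
import numpy as np

# Hypothetical per-interval codes during extinction for one infant:
# arm-pull rate above the learning-phase median (1/0) and MAX-coded anger (1/0).
arm_high = np.array([1, 1, 0, 1, 1, 0, 1, 1, 0, 1])
anger    = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1])

concordance = (arm_high == anger).mean()
phi = np.corrcoef(arm_high, anger)[0, 1]  # phi coefficient for two binary codes
print(f"interval-level concordance: {concordance:.2f}, phi = {phi:.2f}")
```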

9.
Two studies provided direct support for a recently proposed dialect theory of communicating emotion, positing that expressive displays show cultural variations similar to linguistic dialects, thereby decreasing accurate recognition by out-group members. In Study 1, 60 participants from Quebec and Gabon posed facial expressions. Dialects, in the form of activating different muscles for the same expressions, emerged most clearly for serenity, shame, and contempt and also for anger, sadness, surprise, and happiness, but not for fear, disgust, or embarrassment. In Study 2, Quebecois and Gabonese participants judged these stimuli and stimuli standardized to erase cultural dialects. As predicted, an in-group advantage emerged for nonstandardized expressions only and most strongly for expressions with greater regional dialects, according to Study 1. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Four studies examined aspects of the differential emotions theory (DET) hypothesis of expressive behavior development. In Study 1, facial-expressive movements of 108 2.5–9-mo-old infants were video recorded in positive and negative mother–infant interactions (conditions). As expected, Max-specified full-face and partial expressions of interest, joy, sadness, and anger were morphologically stable between the 2 ages. Studies 1 and 2 confirmed predicted differential responding to mother sadness and anger expressions and to composite positive and negative conditions. Discrete negative expressions exceeded negative blends, and the amount of both expression types remained stable across ages. Studies 3 and 4 provided varying degrees of support for the social validity of Max-specified infant negative affect expressions. Conclusions include revisions and clarifications of DET. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
The current series of studies provides converging evidence that facial expressions of fear and anger may have co-evolved to mimic mature and babyish faces in order to enhance their communicative signal. In Studies 1 and 2, fearful and angry facial expressions were manipulated to have enhanced babyish features (larger eyes) or enhanced mature features (smaller eyes) and in the context of a speeded categorization task in Study 1 and a visual noise paradigm in Study 2, results indicated that larger eyes facilitated the recognition of fearful facial expressions, while smaller eyes facilitated the recognition of angry facial expressions. Study 3 manipulated facial roundness, a stable structure that does not vary systematically with expressions, and found that congruency between maturity and expression (narrow face-anger; round face-fear) facilitated expression recognition accuracy. Results are discussed as representing a broad co-evolutionary relationship between facial maturity and fearful and angry facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
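Quantifying an RFR typically means epoching the rectified EMG around stimulus onset and baseline-correcting it. A minimal sketch under assumed parameters (1000 Hz sampling, 500 ms baseline, 1000 ms response window), with simulated data:

```python
import numpy as np

FS = 1000  # sampling rate in Hz (assumed)

def rfr_amplitude(emg, onset_idx, fs=FS, baseline_ms=500, window_ms=1000):
    """Baseline-corrected mean rectified EMG in the first `window_ms`
    after stimulus onset -- one common way to quantify rapid facial reactions."""
    baseline = emg[onset_idx - int(baseline_ms / 1000 * fs):onset_idx]
    window = emg[onset_idx:onset_idx + int(window_ms / 1000 * fs)]
    return np.abs(window).mean() - np.abs(baseline).mean()

# Hypothetical frontalis recording with a response ~300 ms after onset.
rng = np.random.default_rng(2)
emg = rng.normal(0, 1.0, 5000)
onset = 2000
emg[onset + 300:onset + 800] += 3.0  # simulated fear-related frontalis burst
print(f"frontalis RFR amplitude: {rfr_amplitude(emg, onset):.2f} (a.u.)")
```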

13.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

15.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
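Channel-by-emotion decoding accuracy of the kind reported here reduces to tabulating hit rates over (channel, intended emotion) cells. A small sketch with fabricated trial-level judgments:

```python
from collections import defaultdict

# Hypothetical trial-level judgments: (channel, intended emotion, chosen emotion).
trials = [
    ("vocalization", "anger", "anger"),
    ("vocalization", "anger", "disgust"),
    ("prosody",      "anger", "anger"),
    ("face",         "joy",   "joy"),
    ("face",         "joy",   "joy"),
    ("prosody",      "joy",   "neutral"),
]

hits, counts = defaultdict(int), defaultdict(int)
for channel, intended, chosen in trials:
    counts[(channel, intended)] += 1
    hits[(channel, intended)] += int(chosen == intended)

# Accuracy per (channel, emotion) cell.
for key in sorted(counts):
    channel, emotion = key
    print(f"{channel:>12} / {emotion:<6}: {hits[key] / counts[key]:.2f}")
```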

16.
This study examined electroencephalogram (EEG) asymmetries during the presence of discrete facial signs of emotion. Thirty-five 10-month-old infants were tested in a standard stranger- and mother-approach paradigm that included a brief separation from their mother. Infant facial expression was videotaped, and brain electrical activity from left and right frontal and parietal regions was recorded. The videotapes were coded with two different discrete facial coding systems. Artifact-free periods of EEG were extracted that were coincident with the expression of the emotions of joy, anger, and sadness. The data revealed different patterns of EEG asymmetry depending on the type of facial expression and vocal expression of affect that was observed. Expressions of joy that involved facial actions of both zygomatic and orbicularis oculi were seen more often in response to mother approach, whereas smiles that did not involve the action of orbicularis oculi were seen more often in response to approach of the stranger. The former type of smile was associated with relative left frontal activation, whereas the latter type was associated with right frontal activation. Facial expressions of anger and sadness exhibited in the absence of crying were associated with left frontal activation, whereas these same facial expressions during crying were associated with right frontal activation. These data underscore the usefulness of EEG measures of hemispheric activation in differentiating among emotional states associated with differences in facial and vocalic expressivity. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
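A common index in this EEG-asymmetry literature is the difference of log band power at homologous right and left sites; because alpha-range power is inversely related to activation, positive values indicate relatively greater left-hemisphere activation. The band edges (6–9 Hz, an infant alpha-range assumption), site labels, and sampling rate below are all assumptions, and the data are simulated.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal, fs, lo, hi):
    """Mean spectral power between lo and hi Hz via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def asymmetry_index(left, right, fs=FS, band=(6.0, 9.0)):
    """ln(right) - ln(left) band power: positive values indicate relatively
    greater left-hemisphere activation (lower alpha-range power on the left)."""
    return np.log(band_power(right, fs, *band)) - np.log(band_power(left, fs, *band))

# Hypothetical 10-s epochs from left (F3) and right (F4) frontal sites.
rng = np.random.default_rng(3)
f3 = rng.normal(0, 1.0, FS * 10)
f4 = rng.normal(0, 1.2, FS * 10)
print(f"frontal asymmetry: {asymmetry_index(f3, f4):+.3f}")
```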

17.
Previous studies have demonstrated that 1-yr-old infants look toward their mothers' facial expressions and use the emotional information conveyed. In this study, 46 1-yr-olds were confronted with an unusual toy in a context where an experimenter familiar to the infants posed either happy or fearful expressions and where their mothers were present but did not provide facial signals. Results indicate that most of the Ss (83%) referenced the familiarized stranger. Once the adult's facial signals were noted, the S's instrumental behaviors and expressive responses to the toy were influenced in the direction of the affective valence of the adult's expression. It is suggested that infants may be influenced by the emotional expressions of a much broader group of adults than has previously been recognized. (22 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Preschool children, 2 to 5 years of age, and adults posed the six facial expressions of happiness, surprise, anger, fear, sadness, and disgust before a videotape camera. Their poses were scored subsequently using the MAX system. The number of poses that included all components of the target expression (complete expressions) as well as the frequency of those that included only some of the components of the target expressions (partial expressions) were analyzed. Results indicated that 2-year-olds as a group failed to pose any face. Three-year-olds were a transitional group, posing happiness and surprise expressions but none of the remaining faces to any degree. Four- and 5-year-olds were similar to one another and differed from adults only on surprise and anger expressions. Adults were able to pose both these expressions. No group, including adults, posed fear and disgust well. Posing of happiness showed no change after 3 years of age. Consistent differences between partial and complete poses were observed particularly for the negative expressions of sadness, fear, and disgust. Implications of these results for socialization theories of emotion are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
A total of 74 Ss were induced to adopt expressions of fear, anger, disgust, and sadness in Experiment 1. Each expression significantly increased feelings of its particular emotion compared with at least two of the others, a result that cannot be explained by a single dimension. Postures should play the same role in emotional experience as facial expressions. However, the demonstrated effects of postures (Riskind, 1984) could also represent a single dimension of variation. In Experiment 2, subjects were induced to adopt postures characteristic of fear, anger, and sadness. Again, the effects were specific to the postures. These two studies indicate that emotional behavior produces changes in feelings that specifically match the behavior. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Potentiation of startle has been demonstrated in experimentally produced aversive emotional states, and clinical reports suggest that potentiated startle may be associated with fear or anxiety. To test the generalizability of startle potentiation across a variety of emotional states as well as its sensitivity to individual differences in fearfulness, the acoustic startle response of 17 high- and 15 low-fear adult Ss was assessed during fear, anger, joy, sadness, pleasant relaxation, and neutral imagery. Startle responses were larger in all aversive affective states than during pleasant imagery. This effect was enhanced among high-fear Ss, although follow-up testing indicated that other affective individual differences (depression and anger) may also be related to increased potentiation of startle in negative affect. Startle latency was reduced during high- rather than low-arousal imagery but was unaffected by emotional valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
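Startle potentiation is conventionally scored as a within-subject difference in blink magnitude between aversive and pleasant imagery conditions. A sketch with fabricated per-subject means; the condition labels follow the abstract:

```python
import numpy as np

# Hypothetical per-subject mean blink magnitudes (µV) by imagery condition.
conditions = ["fear", "anger", "sadness", "joy", "pleasant_relaxation", "neutral"]
rng = np.random.default_rng(4)
blink = {c: rng.normal(48 if c in ("fear", "anger", "sadness") else 40, 5, size=32)
         for c in conditions}

# Within-subject potentiation score: aversive minus pleasant imagery.
aversive = np.mean([blink["fear"], blink["anger"], blink["sadness"]], axis=0)
pleasant = np.mean([blink["joy"], blink["pleasant_relaxation"]], axis=0)
potentiation = aversive - pleasant
print(f"mean startle potentiation: {potentiation.mean():+.1f} µV "
      f"(positive = larger blinks during aversive imagery)")
```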
