Similar Articles
20 similar articles found (search time: 15 ms)
1.
The authors examined whether facial expressions of emotion would predict changes in heart function. One hundred fifteen male patients with coronary artery disease underwent the Type A Structured Interview, during which time measures of transient myocardial ischemia (wall motion abnormality and left ventricular ejection fraction) were obtained. Facial behavior exhibited during the ischemia measurement period was videotaped and later coded by using the Facial Action Coding System (P. Ekman & W. V. Friesen, 1978). Those participants who exhibited ischemia showed significantly more anger expressions and nonenjoyment smiles than nonischemics. Cook–Medley Hostility scores did not vary with ischemic status. The findings have implications for understanding how anger and hostility differentially influence coronary heart disease risk. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Twenty abused and 20 nonabused pairs of children (3 to 7 years of age) and their mothers participated in a facial expression posing task and a facial expression recognition task. The expressions produced by subjects were judged on emotion content by naive raters and were coded using Friesen and Ekman's (1984) Emotion Facial Action Coding System (EMFACS). Data analysis indicated that abused children and their mothers pose less recognizable expressions than nonabused children and mothers. Although abused children were less accurate than nonabused children in recognizing emotional expressions, there was no difference in recognition accuracy between the two groups of mothers. A significant correlation between mothers' posing scores and children's recognition scores was also obtained. These results suggest that abused children may not observe easily interpreted voluntary displays of emotion by their mothers as often as nonabused children. This may partially explain the difference in recognition (and production) abilities of abused and nonabused children. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
The common assumption that emotional expression mediates the course of bereavement is tested. Competing hypotheses about the direction of mediation were formulated from the grief work and social-functional accounts of emotional expression. Facial expressions of emotion in conjugally bereaved adults were coded at 6 months post-loss as they described their relationship with the deceased; grief and perceived health were measured at 6, 14, and 25 months. Facial expressions of negative emotion, in particular anger, predicted increased grief at 14 months and poorer perceived health through 25 months. Facial expressions of positive emotion predicted decreased grief through 25 months and showed a positive but nonsignificant relation to perceived health. Predictive relations between negative and positive emotional expression persisted when initial levels of self-reported emotion, grief, and health were statistically controlled, demonstrating the mediating role of facial expressions of emotion in adjustment to conjugal loss. Theoretical and clinical implications are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Five studies investigated the young infant's ability to produce identifiable emotion expressions as defined in differential emotions theory. Trained judges applied emotion-specific criteria in selecting expression stimuli from videotape recordings of 54 1–9 mo old infants' responses to a variety of incentive events, ranging from playful interactions to the pain of inoculations. Four samples of untrained Ss (130 undergraduates and 62 female health service professionals) confirmed the social validity of infants' emotion expressions by reliably identifying expressions of interest, joy, surprise, sadness, anger, disgust, contempt, and fear. Brief training resulted in significant increases in the accuracy of discrimination of infants' negative emotion expressions for low-accuracy Ss. Construct validity for the 8 emotion expressions identified by untrained Ss and for a consistent pattern of facial responses to unanticipated pain was provided by expression identifications derived from an objective, theoretically structured, anatomically based facial movement coding system. (21 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
We report two studies validating a new standardized set of filmed emotion expressions, the Amsterdam Dynamic Facial Expression Set (ADFES). The ADFES is distinct from existing datasets in that it includes a face-forward version and two different head-turning versions (faces turning toward and away from viewers), North-European as well as Mediterranean models (male and female), and nine discrete emotions (joy, anger, fear, sadness, surprise, disgust, contempt, pride, and embarrassment). Study 1 showed that the ADFES received excellent recognition scores. Recognition was affected by social categorization of the model: displays of North-European models were better recognized by Dutch participants, suggesting an ingroup advantage. Head-turning did not affect recognition accuracy. Study 2 showed that participants more strongly perceived themselves to be the cause of the other's emotion when the model's face turned toward the respondents. The ADFES provides new avenues for research on emotion expression and is available for researchers upon request. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

7.
Facial expression and emotional stimuli were varied orthogonally in a 3 × 4 factorial design to test whether facial expression is necessary or sufficient to influence emotional experience. 123 undergraduates watched a film eliciting fear, sadness, or no emotion while holding their facial muscles in the position characteristic of fear or sadness or in an effortful but nonemotional grimace; those in a 4th group received no facial instructions. The Ss believed that the study concerned subliminal perception and that the facial positions were necessary to prevent physiological recording artifacts. The films had powerful effects on reported emotions, the facial expressions none. Correlations between facial expression and reported emotion were zero. Sad and fearful Ss showed distinctive patterns of physiological arousal. Facial expression also tended to affect physiological responses in a manner consistent with an effort hypothesis. (33 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
OBJECTIVE: We examined the relationship between hostility and mononuclear leukocyte (MNL) beta-adrenergic receptor function in a sample of young healthy males. METHOD: Thirty subjects were selected for having scored above 20 (N = 11) and below 14 (N = 19) on the Cook-Medley Hostility (Ho) scale. MNL beta-adrenergic receptor function was characterized in terms of receptor density (Bmax) and ligand-binding affinity (Kd) in homogenized cells, and intracellular cyclic adenosine monophosphate (cAMP) responses to saline, isoproterenol, and forskolin in whole cells. Subjects also completed the Multidimensional Anger Inventory (MAI), which assesses dimensions of anger. RESULTS: Relative to men with low Ho scores, men with Ho scores above 20 showed lower receptor number and greater forskolin-stimulated cAMP. Moreover, high hostile men reported a greater frequency of anger, longer duration of anger, more frequent brooding, and a hostile outlook. CONCLUSIONS: These data indicate that adrenergic receptor down-regulation is associated with hostility. This association may be linked to hostile persons' propensity for excessive and prolonged neuroendocrine responses to either psychological stressors or the experience of chronic stress associated with frequent and prolonged bouts of anger.

9.
Facial expressions are crucial to human social communication, but the extent to which they are innate and universal versus learned and culture dependent is a subject of debate. Two studies explored the effect of culture and learning on facial expression understanding. In Experiment 1, Japanese and U.S. participants interpreted facial expressions of emotion. Each group was better than the other at classifying facial expressions posed by members of the same culture. In Experiment 2, this reciprocal in-group advantage was reproduced by a neurocomputational model trained in either a Japanese cultural context or an American cultural context. The model demonstrates how each of us, interacting with others in a particular cultural context, learns to recognize a culture-specific facial expression dialect. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
The facial expressions of 28 13-mo-old middle-class children were videotaped during the 3-min separation episode of the Ainsworth strange-situation procedure (ASSP). Facial behavior was analyzed to determine the patterns of emotional expressions during separation and to assess the relations between these patterns and types of attachment as assessed by the ASSP. Findings reveal that anger was the dominant negative emotion expressed by the majority of Ss in each of 3 ad hoc groups determined by level of negative emotion. Some high-negative emotion expressers displayed predominantly anger and others mainly sadness. Patterns of emotion expression varied with type of attachment; Ss who showed an insecure-resistant attachment pattern displayed less interest and more sadness than Ss in the securely attached groups. The proportion of time anger was expressed did not differ significantly with type of attachment. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

12.
16 children were videotaped at 13 and 18 mo of age during the strange-situation procedure (M. D. Ainsworth et al, 1978). Facial expressions (interest, anger, sadness, and emotion blends) during the 2nd separation episode were coded using a system for identifying affect expressions by holistic judgments (Affex) developed by the 2nd author and colleagues (1980). Results show significant continuities in proportion of interest expressions, anger, emotion blends, and frequency of expression changes. The major developmental change was seen in an age × emotion interaction, showing an increase in the use of facial expression blends or combinations from 13 to 18 mo. Results support the belief that patterns of emotion reflect early, persistent individual differences; they also reflect a developmental trend toward increasing complexity of emotional responses. (16 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Describes and compares normative emotional responses in solitary play, and determines developmental changes associated with expression of emotion in play. A developmental perspective on emotions is described. The results from 3 studies that examined the expression of emotion (facial expressions) during infants' and children's (aged 6 mo–5 yrs) solitary play are discussed as a foundation from which to consider the functions of emotion in play therapy. Facial expressions of emotions were assessed using the System for Identifying Affect Expressions by Holistic Judgments (C. E. Izard et al, 1983). Exploration and play were measured using a standardized scale developed by J. Belsky and R. K. Most (1981). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Reports an error in "Facial expressions of emotion influence memory for facial identity in an automatic way" by Arnaud D'Argembeau and Martial Van der Linden (Emotion, 2007[Aug], Vol 7[3], 507-515). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum. (The following abstract of the original article appeared in record 2007-11660-005.) Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

15.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

16.
The facial expressions of adults with Down's syndrome (DS; n = 15) as they watched happy, sad, and neutral videotapes were compared with those of a healthy age-matched control group (n = 20). Facial movements were analyzed with the Facial Action Coding System (P. E. Ekman & W. V. Friesen, 1978). While watching happy stimuli, the 10 DS adults who were able to appropriately rate their reactions smiled with a cheek raise as frequently as control adults, suggesting that the expression of positive affect in these individuals is normal. Contrary to predictions, however, the DS group exhibited fewer smiles without cheek raises than did control adults and were more likely not to smile. Neither group showed prototypic sad facial expressions in response to sad stimuli. Independent of emotion, DS participants made more facial movements, including more tongue shows, than did control participants. Differences in facial expression in DS adults may confuse others' interpretations of their emotional responses and may be important for understanding the development of abnormal emotional processes. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Three studies assessed whether the combined traits of hostility and defensiveness identify a group of hostile individuals with functionally severe coronary artery disease (CAD). CAD patients completed the Cook-Medley Hostility Inventory (Ho) and the Marlowe-Crowne Social Desirability Scale (MC). Patients were classified into 4 groups: defensive hostile (DH: high Ho, high MC), low hostile (LH: low Ho, low MC), high hostile (HH: high Ho, low MC), and defensive (Def: low Ho, high MC). In comparison to HH, LH, and Def CAD patients, DH patients demonstrate the greatest perfusion defects as measured by exercise thallium scintigraphy; DH patients exhibit the most frequent ischemic episodes during ambulatory electrocardiographic monitoring; and in a laboratory study, DH patients exhibit the most severe mental stress-induced ischemia assessed by echocardiography. Thus, the combination of high hostility and high defensiveness is associated with more functionally severe CAD and may predispose CAD patients to a more adverse prognosis. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
The Chimpanzee Facial Action Coding System (ChimpFACS) is an objective, standardized observational tool for measuring facial movement in chimpanzees based on the well-known human Facial Action Coding System (FACS; P. Ekman & W. V. Friesen, 1978). This tool enables direct structural comparisons of facial expressions between humans and chimpanzees in terms of their common underlying musculature. Here the authors provide data on the first application of the ChimpFACS to validate existing categories of chimpanzee facial expressions using discriminant function analyses. The ChimpFACS validated most existing expression categories (6 of 9) and, where the predicted group memberships were poor, the authors discuss potential problems with ChimpFACS and/or existing categorizations. The authors also report the prototypical movement configurations associated with these 6 expression categories. For all expressions, unique combinations of muscle movements were identified, and these are illustrated as peak intensity prototypical expression configurations. Finally, the authors suggest a potential homology between these prototypical chimpanzee expressions and human expressions based on structural similarities. These results contribute to our understanding of the evolution of emotional communication by suggesting several structural homologies between the facial expressions of chimpanzees and humans and facilitating future research. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Behavioral differences may clarify the link between hostility and health. This study examined facial expression. Seventy-two low- and high-hostile undergraduates underwent the Type A Structured Interview (SI) and a test of social anxiety. Facial behavior was measured with the Facial Action Coding System. Low-hostile participants displayed non-Duchenne smiles more frequently than high-hostile participants during the SI. There were no group differences in the expression of disgust. The results identify differences in the nonverbal behavior of hostile people. Restricted use of non-Duchenne smiles may reflect limited use of appeasement, contributing to uncomfortable interpersonal relations and limited social support. The findings are consistent with a behavioral ecology perspective and suggest that social regulation may be as important as negative affect in determining the consequences of hostility. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
