Similar Articles
20 similar articles found (search time: 15 ms)
1.
Research has demonstrated that left-prefrontal cortical activity is associated with positive affect, or approach motivation, and that right-prefrontal cortical activity is associated with negative affect, or withdrawal motivation. In past research, emotional valence (positive–negative) has been confounded with motivational direction (approach–withdrawal), such that, for instance, the only emotions examined were both positive and approach related. Recent research has demonstrated that trait anger, a negative but approach-related emotion, is associated with increased left-prefrontal and decreased right-prefrontal activity, suggesting that prefrontal asymmetrical activity is associated with motivational direction and not emotional valence. The present experiment tested whether state-induced anger is associated with relative left-prefrontal activity and whether this prefrontal activity is also associated with aggression. Results supported these hypotheses. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
We examined relationships among individual differences in trait emotions and the emotion-modulated startle-eyeblink response. In particular, we examined the extent to which trait anger, which is negative in valence, would be associated with a pattern of approach motivation in startle eyeblink responses to appetitive stimuli. Self-reported trait emotions were compared with emotion-modulated startle eyeblink responses to auditory probes during appetitive, aversive, and neutral pictures. Results revealed that trait anger, enjoyment, and surprise were each associated with greater blink inhibition to appetitive pictures, indicating an approach motivational response. No other trait emotions were associated with startle eyeblink responses to appetitive or aversive pictures. These results support the idea that trait anger, although experienced as a negative emotion, is associated with an approach-related motivational response to appetitive stimuli at basic, reflexive levels of processing. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

4.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

5.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
Gaze direction influences younger adults' perception of emotional expressions, with direct gaze enhancing the perception of anger and joy, while averted gaze enhances the perception of fear. Age-related declines in emotion recognition and eye-gaze processing have been reported, indicating that there may be age-related changes in the ability to integrate these facial cues. As there is evidence of a positivity bias with age, age-related difficulties integrating these cues may be greatest for negative emotions. The present research investigated age differences in the extent to which gaze direction influenced explicit perception (e.g., anger, fear and joy; Study 1) and social judgments (e.g., of approachability; Study 2) of emotion faces. Gaze direction did not influence the perception of fear in either age group. In both studies, age differences were found in the extent to which gaze direction influenced judgments of angry and joyful faces, with older adults showing less integration of gaze and emotion cues than younger adults. Age differences were greatest when interpreting angry expressions. Implications of these findings for older adults' social functioning are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The anterior regions of the left and right cerebral hemispheres have been posited to be specialized for expression and experience of approach and withdrawal processes, respectively. Much of the evidence supporting this hypothesis has been obtained by use of the anterior asymmetry in electroencephalographic alpha activity. In most of this research, however, motivational direction has been confounded with affective valence such that, for instance, approach motivation relates positively with positive affect. In the present research, we tested the hypothesis that dispositional anger, an approach-related motivational tendency with negative valence, would be associated with greater left- than right-anterior activity. Results supported the hypothesis, suggesting that the anterior asymmetry varies as a function of motivational direction rather than affective valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
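The asymmetry measure referenced in this abstract is typically derived from resting EEG alpha power over homologous frontal sites. Below is a minimal sketch of that general method, not the authors' pipeline: the channel pair (F3/F4), sampling rate, and 8–13 Hz alpha band are assumptions, and alpha power is treated as inversely related to cortical activity, so a positive score indexes relatively greater left-frontal activity.

```python
# Minimal sketch of a frontal EEG alpha asymmetry score: integrate
# alpha-band power at left (F3) and right (F4) frontal sites, then take
# ln(right) - ln(left). Channel pair, sampling rate, and band limits
# are assumptions for illustration.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

def alpha_power(eeg: np.ndarray, fs: float, band=(8.0, 13.0)) -> float:
    """Alpha-band power from Welch's power spectral density estimate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(trapezoid(psd[mask], freqs[mask]))

def frontal_asymmetry(f3: np.ndarray, f4: np.ndarray, fs: float) -> float:
    """ln(F4 alpha power) - ln(F3 alpha power); positive = relative left-frontal activity."""
    return float(np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs)))

# Toy usage with 2 s of synthetic data at an assumed 256 Hz sampling rate.
fs = 256.0
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal(512), rng.standard_normal(512)
print(frontal_asymmetry(f3, f4, fs))
```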

8.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
We report two studies validating a new standardized set of filmed emotion expressions, the Amsterdam Dynamic Facial Expression Set (ADFES). The ADFES is distinct from existing datasets in that it includes a face-forward version and two different head-turning versions (faces turning toward and away from viewers), North-European as well as Mediterranean models (male and female), and nine discrete emotions (joy, anger, fear, sadness, surprise, disgust, contempt, pride, and embarrassment). Study 1 showed that the ADFES received excellent recognition scores. Recognition was affected by social categorization of the model: displays of North-European models were better recognized by Dutch participants, suggesting an ingroup advantage. Head-turning did not affect recognition accuracy. Study 2 showed that participants more strongly perceived themselves to be the cause of the other's emotion when the model's face turned toward the respondents. The ADFES provides new avenues for research on emotion expression and is available for researchers upon request. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

10.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Five studies investigated the young infant's ability to produce identifiable emotion expressions as defined in differential emotions theory. Trained judges applied emotion-specific criteria in selecting expression stimuli from videotape recordings of 54 1–9 mo old infants' responses to a variety of incentive events, ranging from playful interactions to the pain of inoculations. Four samples of untrained Ss (130 undergraduates and 62 female health service professionals) confirmed the social validity of infants' emotion expressions by reliably identifying expressions of interest, joy, surprise, sadness, anger, disgust, contempt, and fear. Brief training resulted in significant increases in the accuracy of discrimination of infants' negative emotion expressions for low-accuracy Ss. Construct validity for the 8 emotion expressions identified by untrained Ss and for a consistent pattern of facial responses to unanticipated pain was provided by expression identifications derived from an objective, theoretically structured, anatomically based facial movement coding system. (21 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Study 1 established either a deliberative mind-set by having Ss contemplate a personal change decision or an implemental mind-set by having Ss plan the execution of an intended personal project. Ss were subsequently requested to continue the beginnings of 3 fairy tales, each describing a main character with a decisional conflict. Analysis revealed that deliberative mind-set Ss ascribed more deliberative and fewer implementational efforts to the main characters than implemental mind-set Ss did. In Study 2, Ss were asked to choose between different test materials. Either before or after making their decision, Ss were given information on deliberative and implementational thoughts unrelated to their task at hand. When asked to recall these thoughts, predecisional Ss recalled more deliberative and fewer implementational thoughts, whereas for postdecisional Ss the reverse was true. These findings suggest that deliberative and implemental mind-sets tune thought production and information processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Four studies examined aspects of the differential emotions theory (DET) hypothesis of expressive behavior development. In Study 1, facial-expressive movements of 108 2.5–9-mo-old infants were video recorded in positive and negative mother–infant interactions (conditions). As expected, Max-specified full-face and partial expressions of interest, joy, sadness, and anger were morphologically stable between the 2 ages. Studies 1 and 2 confirmed predicted differential responding to mother sadness and anger expressions and to composite positive and negative conditions. Discrete negative expressions exceeded negative blends, and the amount of both expression types remained stable across ages. Studies 3 and 4 provided varying degrees of support for the social validity of Max-specified infant negative affect expressions. Conclusions include revisions and clarifications of DET. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
An emotional experience can last anywhere from a couple of seconds to several hours or even longer. In the present study, we examine the extent to which covert intrapersonal actions (cognitions both related and unrelated to the emotion-eliciting stimulus) as well as overt interpersonal actions (social sharing) account for this variability in emotion duration. Participants were asked to report the duration of their anger, sadness, joy, and gratitude episodes daily for five days. Furthermore, information was collected with regard to their cognitions during the episodes and their social sharing behavior. Discrete-time survival analyses revealed that for three of the four emotions under study, stimulus-related cognitions with the same valence as the emotion led to a prolongation of the episode; in contrast, both stimulus-related and stimulus-unrelated cognitions with a valence opposite to the emotion led to a shortening. Finally, for the four emotions under study, social sharing was associated with a prolongation. The findings are discussed in terms of a possible process basis underlying the time dynamics of negative as well as positive emotions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
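The discrete-time survival analysis named in this abstract can be cast as a logistic regression on person-period data: each episode contributes one row per time interval survived, with a binary flag for whether it ended in that interval. The sketch below illustrates the general technique with hypothetical data and covariate names; it is not the authors' actual model.

```python
# Minimal sketch of a discrete-time survival model on person-period
# data (hypothetical data and covariate names, not the authors'
# analysis). 'ended' flags whether the episode terminated in that
# interval; logistic regression relates covariates to the per-interval
# hazard of the episode ending.
import numpy as np
import statsmodels.api as sm

# Columns: interval index, same-valence cognition present (0/1), ended (0/1).
person_periods = np.array([
    [1, 1, 0],
    [2, 1, 0],
    [3, 1, 1],  # episode with same-valence cognitions survives longer
    [1, 0, 1],  # episode without them ends in the first interval
    [1, 0, 0],
    [2, 0, 1],
], dtype=float)

X = sm.add_constant(person_periods[:, :2])  # intercept, interval, cognition
y = person_periods[:, 2]
model = sm.Logit(y, X).fit(disp=0)

# A negative coefficient on the cognition covariate means a lower
# per-interval hazard of ending, i.e., a prolonged episode.
print(model.params)
```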

15.
Investigated the co-occurrence in experience of various emotions, placing the focus on positive vs negative affect. In Study 1, 72 college students read stories designed to produce varying levels of either positive or negative affect and then rated their level of both types of affect. In Study 2, 42 undergraduates rated their feelings during emotional times in everyday life for a period of 6 wks. Results show that emotions of the same hedonic valence (e.g., fear and anger) tend to co-occur. Results also show that positive and negative affect do not occur together at high levels of intensity. It is suggested that these 2 facts about the relation of positive and negative affect are probably responsible for the bipolarity that is often found between them. These findings represent a challenge to those who postulate that there are unrelated discrete emotions and that the terms positive affect and negative affect do not describe meaningful clusters of emotions. Findings suggest that if one type of affect is of low intensity, the other type can be at any level from low to high. Therefore, a truly inverse and linear relation does not characterize positive and negative affect. This finding represents a challenge to most structural models of emotion. It appears that mutual exclusion only at high levels of intensity characterizes the relation between positive and negative affect. (2 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Emotional expressions influence social judgments of personality traits. The goal of the present research was to show that it is of interest to assess the impact of neutral expressions in this context. In 2 studies using different methodologies, the authors found that participants perceived men who expressed neutral and angry emotions as higher in dominance when compared with men expressing sadness or shame. Study 1 showed that this is also true for men expressing happiness. In contrast, women expressing either anger or happiness were perceived as higher in dominance than were women showing a neutral expression who were rated as less dominant. However, sadness expressions by both men and women clearly decreased the extent to which they were perceived as dominant, and a trend in this direction emerged for shame expressions by men in Study 2. Thus, neutral expressions seem to be perceived as a sign of dominance in men but not in women. The present findings extend our understanding of the way different emotional expressions affect perceived dominance and the signal function of neutral expressions, which in the past have often been ignored. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
[Correction Notice: An erratum for this article was reported in Vol 9(4) of Emotion (see record 2009-11528-009). In this article a symbol was incorrectly omitted from Figure 1, part C. To see the complete article with the corrected figure, please go to http://dx.doi.org/10.1037/a0014681.] People make trait inferences based on facial appearance despite little evidence that these inferences accurately reflect personality. The authors tested the hypothesis that these inferences are driven in part by structural resemblance to emotional expressions. The authors first had participants judge emotionally neutral faces on a set of trait dimensions. The authors then submitted the face images to a Bayesian network classifier trained to detect emotional expressions. By using a classifier, the authors can show that neutral faces perceived to possess various personality traits contain objective resemblance to emotional expression. In general, neutral faces that are perceived to have positive valence resemble happiness, faces that are perceived to have negative valence resemble disgust and fear, and faces that are perceived to be threatening resemble anger. These results support the idea that trait inferences are in part the result of an overgeneralization of emotion recognition systems. Under this hypothesis, emotion recognition systems, which typically extract accurate information about a person's emotional state, are engaged during the perception of neutral faces that bear subtle resemblance to emotional expressions. These emotions could then be misattributed as traits. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
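The classification step described here can be illustrated with a stand-in model. The authors used a Bayesian network classifier; the sketch below substitutes a Gaussian naive Bayes classifier over synthetic facial feature vectors (an assumption, not their pipeline), which preserves the logic: train on posed emotional expressions, then read each neutral face's posterior probabilities as graded resemblance-to-emotion scores.

```python
# Minimal sketch of the train-on-expressions / score-neutral-faces
# logic with a stand-in Gaussian naive Bayes classifier. All data and
# feature names are hypothetical.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
emotions = ["happiness", "disgust", "fear", "anger"]

# Hypothetical training set: 40 posed faces x 10 facial features per emotion.
X_train = np.vstack([rng.normal(loc=i, scale=1.0, size=(40, 10))
                     for i in range(len(emotions))])
y_train = np.repeat(np.arange(len(emotions)), 40)
clf = GaussianNB().fit(X_train, y_train)

# "Neutral" faces near the grand mean: their posterior probabilities
# quantify graded, objective resemblance to each emotional expression.
neutral_faces = rng.normal(loc=1.5, scale=0.5, size=(3, 10))
for probs in clf.predict_proba(neutral_faces):
    print({emo: round(float(p), 3) for emo, p in zip(emotions, probs)})
```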

18.
Current brain models of emotion processing hypothesize that positive (or approach-related) emotions are lateralized towards the left hemisphere, whereas negative (or withdrawal-related) emotions are lateralized towards the right hemisphere. Brain imaging studies, however, have so far failed to document such hemispheric lateralization. In a functional magnetic resonance imaging (fMRI) study, 14 female subjects viewed alternating blocks of emotionally valenced positive and negative pictures. When the experience of valence was equated for arousal, overall brain reactivity was lateralized towards the left hemisphere for positive pictures and towards the right hemisphere for negative pictures. This study provides direct support for the valence hypothesis, under conditions of equivalent arousal, by means of functional brain imaging.
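Hemispheric lateralization of reactivity, as reported in this abstract, is often summarized with a laterality index, LI = (L - R) / (L + R). The sketch below is an assumption about how such an index might be computed from summed activation values, not the authors' method.

```python
# Minimal sketch of a hemispheric laterality index:
# LI = (L - R) / (L + R) over summed activation, so LI > 0 means
# left-lateralized reactivity and LI < 0 means right-lateralized.
import numpy as np

def laterality_index(left: np.ndarray, right: np.ndarray) -> float:
    l, r = float(np.sum(left)), float(np.sum(right))
    return (l - r) / (l + r)

# Toy values: positive pictures eliciting more left-hemisphere reactivity.
print(laterality_index(np.array([4.0, 5.5]), np.array([3.0, 4.5])))
```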

19.
The common assumption that emotional expression mediates the course of bereavement is tested. Competing hypotheses about the direction of mediation were formulated from the grief work and social-functional accounts of emotional expression. Facial expressions of emotion in conjugally bereaved adults were coded at 6 months post-loss as they described their relationship with the deceased; grief and perceived health were measured at 6, 14, and 25 months. Facial expressions of negative emotion, in particular anger, predicted increased grief at 14 months and poorer perceived health through 25 months. Facial expressions of positive emotion predicted decreased grief through 25 months and a positive but nonsignificant relation to perceived health. Predictive relations between negative and positive emotional expression persisted when initial levels of self-reported emotion, grief, and health were statistically controlled, demonstrating the mediating role of facial expressions of emotion in adjustment to conjugal loss. Theoretical and clinical implications are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
On separate visits to the laboratory, 36 nine-month-old infants (18 boys and 18 girls) watched their mothers express joy or sadness, facially and vocally, during a 2-min emotion-induction period. After the induction period, mothers continued to express joy or sadness while their infants played with four sets of toys. Infant emotion expressions were analyzed using the Max (Izard, 1979a) and Affex (Izard, Dougherty, & Hembree, 1983) coding systems, and infant play behavior was coded with a system developed by Belsky and Most (1981). The amount of time that the infants looked at their mothers was also measured. Findings were generally consistent with differential emotions theory (Izard, 1979b). The infants expressed more joy and looked longer at their mothers during the joy condition and they showed more sadness, anger, and gaze aversion during the sadness condition. The infants engaged in more play behavior in the joy condition than in the sadness condition. Regression analyses revealed several significant relations between infants' gaze behavior, emotion expressions, and play behavior. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
