Similar Documents
20 similar documents were retrieved.
1.
We investigated age differences in biased recognition of happy, neutral, or angry faces in 4 experiments. Experiment 1 revealed increased true and false recognition for happy faces in older adults, which persisted even when changing each face’s emotional expression from study to test in Experiment 2. In Experiment 3, we examined the influence of reduced memory capacity on the positivity-induced recognition bias: patients with amnestic mild cognitive impairment showed no emotion-induced memory enhancement but retained the recognition bias for positive faces, compared with older adults with normal memory performance. In Experiment 4, we used semantic differentials to measure the connotations of happy and angry faces. Younger and older participants regarded happy faces as more familiar than angry faces, but the older group showed a larger recognition bias for happy faces. This finding indicates that older adults use a gist-based memory strategy based on a semantic association between positive emotion and familiarity. Moreover, older adults’ judgments of valence were more positive for both angry and happy faces, supporting the hypothesis of socioemotional selectivity. We propose that the positivity-induced recognition bias might be based on fluency, which in turn is based on both positivity-oriented emotional goals and on preexisting semantic associations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented either with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features, especially the smiling mouth, is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
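The saliency analysis in the abstract above can be illustrated with a brief sketch. The specific saliency model the authors used is not named here, so the following is only a generic, minimal illustration (spectral-residual saliency computed with numpy/scipy); the function name, parameters, and the region-comparison example are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_residual_saliency(gray_image, blur_sigma=2.5):
    """Minimal spectral-residual saliency map for a 2-D grayscale image.

    Returns a map of the same shape, scaled to [0, 1]; higher values mark
    image regions that stand out from the global image statistics.
    """
    spectrum = np.fft.fft2(gray_image)
    log_amplitude = np.log1p(np.abs(spectrum))
    phase = np.angle(spectrum)

    # The "spectral residual" is the log-amplitude spectrum minus its local
    # average; it isolates statistically unexpected structure in the image.
    residual = log_amplitude - uniform_filter(log_amplitude, size=3)

    # Reconstruct with the original phase, square, and smooth.
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    saliency = gaussian_filter(saliency, sigma=blur_sigma)
    saliency -= saliency.min()
    return saliency / (saliency.max() + 1e-12)

# Hypothetical usage: compare mean saliency in a mouth region vs. an eye
# region of a face image loaded as a 2-D float array named `face`.
# sal = spectral_residual_saliency(face)
# print(sal[mouth_slice].mean(), sal[eye_slice].mean())
```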

4.
The authors used connectionist modeling to extend previous research on emotion overgeneralization effects. Study 1 demonstrated that neutral expression male faces objectively resemble angry expressions more than female faces do, female faces objectively resemble surprise expressions more than male faces do, White faces objectively resemble angry expressions more than Black or Korean faces do, and Black faces objectively resemble happy and surprise expressions more than White faces do. Study 2 demonstrated that objective resemblance to emotion expressions influences trait impressions even when statistically controlling possible confounding influences of attractiveness and babyfaceness. It further demonstrated that emotion overgeneralization is moderated by face race and that racial differences in emotion resemblance contribute to White perceivers’ stereotypes of Blacks and Asians. These results suggest that intergroup relations may be strained not only by cultural stereotypes but also by adaptive responses to emotion expressions that are overgeneralized to groups whose faces subtly resemble particular emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
We investigated the effect of subliminally presented happy or angry faces on evaluative judgments when the facial muscles of participants were free to mimic or were blocked. We hypothesized and showed that subliminally presented happy expressions lead to more positive judgments of cartoons compared to angry expressions only when facial muscles were not blocked. These results reveal the influence of socially driven embodied processes on affective judgments and also have potential implications for phenomena such as emotional contagion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

7.
A new model of mental representation is applied to social cognition: the attractor field model. Using the model, the authors predicted and found a perceptual advantage but a memory disadvantage for faces displaying evaluatively congruent expressions. In Experiment 1, participants completed a same/different perceptual discrimination task involving morphed pairs of angry-to-happy Black and White faces. Pairs of faces displaying evaluatively incongruent expressions (i.e., happy Black, angry White) were more likely to be labeled as similar and were less likely to be accurately discriminated from one another than faces displaying evaluatively congruent expressions (i.e., angry Black, happy White). Experiment 2 replicated this finding and showed that objective discriminability of stimuli moderated the impact of attractor field effects on perceptual discrimination accuracy. In Experiment 3, participants completed a recognition task for angry and happy Black and White faces. Consistent with the attractor field model, memory accuracy was better for faces displaying evaluatively incongruent expressions. Theoretical and practical implications of these findings are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

9.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

10.
The authors previously reported that normal subjects are better at discriminating happy from neutral faces when the happy face is located to the viewer's right of the neutral face; conversely, discrimination of sad from neutral faces is better when the sad face is shown to the left, supporting a role for the left hemisphere in processing positive valence and for the right hemisphere in processing negative valence. Here, the authors extend this same task to subjects with unilateral cerebral damage (31 right, 28 left). Subjects with right damage performed worse when discriminating sad faces shown on the left, consistent with the prior findings. However, subjects with either left or right damage actually performed superior to normal controls when discriminating happy faces shown on the left. The authors suggest that perception of negative valence relies preferentially on the right hemisphere, whereas perception of positive valence relies on both left and right hemispheres. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
This study investigated the identification of facial expressions of emotion in currently nondepressed participants who had a history of recurrent depressive episodes (recurrent major depression; RMD) and never-depressed control participants (CTL). Following a negative mood induction, participants were presented with faces whose expressions slowly changed from neutral to full intensity. Identification of facial expressions was measured by the intensity of the expression at which participants could accurately identify whether faces expressed happiness, sadness, or anger. There were no group differences in the identification of sad or angry expressions. Compared with CTL participants, however, RMD participants required significantly greater emotional intensity in the faces to correctly identify happy expressions. These results indicate that biases in the processing of emotional facial expressions are evident even after individuals have recovered from a depressive episode. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
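As a rough illustration of how an identification-intensity threshold like the one described above can be quantified, the sketch below fits a logistic psychometric function to correct/incorrect identifications across morph intensities and reads off the 50% point. This is a generic approach under assumed variable names and data, not the procedure reported in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(intensity, threshold, slope):
    """Probability of correctly identifying the expression at a given
    morph intensity (0 = neutral, 100 = full-intensity expression)."""
    return 1.0 / (1.0 + np.exp(-slope * (intensity - threshold)))

def identification_threshold(intensities, correct):
    """Fit a logistic psychometric function and return the intensity at
    which identification accuracy reaches 50%.

    intensities: array of morph intensities shown on each trial.
    correct: array of 0/1 outcomes (1 = expression correctly identified).
    """
    p0 = [50.0, 0.1]  # starting guesses: mid-range threshold, shallow slope
    (threshold, _slope), _ = curve_fit(logistic, intensities, correct,
                                       p0=p0, maxfev=10000)
    return threshold

# Hypothetical usage with assumed trial data:
# threshold = identification_threshold(trial_intensities, trial_correct)
```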

12.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Signal detection analyses showed that the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
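The signal detection analyses mentioned above rest on standard formulas for sensitivity (d') and response bias (criterion c) computed from hit and false-alarm rates. The sketch below is one conventional way to compute them (with a log-linear correction for extreme rates); the counts in the usage comment are made up for illustration.

```python
from scipy.stats import norm

def signal_detection(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response bias (criterion c) for a recognition test.

    'Old' responses to studied faces are hits; 'old' responses to new faces
    are false alarms. The +0.5/+1 log-linear correction keeps z-scores
    finite when a rate would otherwise be exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d_prime, criterion

# Made-up counts for one participant's angry-face trials:
# print(signal_detection(hits=18, misses=2, false_alarms=4, correct_rejections=16))
```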

13.
An information-processing paradigm was used to examine attentional biases in clinically depressed participants, participants with generalized anxiety disorder (GAD), and nonpsychiatric control participants for faces expressing sadness, anger, and happiness. Faces were presented for 1,000 ms, at which point depressed participants had directed their attention selectively to depression-relevant (i.e., sad) faces. This attentional bias was specific to the emotion of sadness; the depressed participants did not exhibit attentional biases to the angry or happy faces. This bias was also specific to depression; at 1,000 ms, participants with GAD were not attending selectively to sad, happy, or anxiety-relevant (i.e., angry) faces. Implications of these findings for both the cognitive and the interpersonal functioning of depressed individuals are discussed, and directions for future research are advanced. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Two studies examined the general prediction that one's emotional expression should facilitate memory for material that matches the expression. The authors focused on specific facial expressions of surprise. In the first study, participants who were mimicking a surprised expression showed better recall for the surprising words and worse recall for neutral words, relative to those who were mimicking a neutral expression. Study 2 replicated the results of Study 1, showing that participants who mimicked a surprised expression recalled more words spoken in a surprising manner compared with those that sounded neutral or sad. Conversely, participants who mimicked sad facial expressions showed greater recall for sad than neutral or surprising words. The results provide evidence of the importance of matching the emotional valence of the recall content to the facial expression of the recaller during the memorization period. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression, participants diagnosed with social phobia, and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
In a face-in-the-crowd setting, the authors examined visual search for photographically reproduced happy, angry, and fearful target faces among neutral distractor faces in 3 separate experiments. Contrary to the hypothesis, happy targets were consistently detected more quickly and accurately than angry and fearful targets, as were directed compared with averted targets. There was no consistent effect of social anxiety. A facial emotion recognition experiment suggested that the happy search advantage could be due to the ease of processing happy faces. In the final experiment with perceptually controlled schematic faces, the authors reported more effective detection of angry than happy faces. This angry advantage was most obvious for highly socially anxious individuals when their social fear was experimentally enhanced. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Decision making is influenced by social cues, but there is little understanding of how social information interacts with other cues that determine decisions. To address this quantitatively, participants were asked to learn which of two faces was associated with a higher probability of reward. They were repeatedly presented with two faces, each with a different, unknown probability of reward, and participants attempted to maximize gains by selecting the face that was most often rewarded. Both faces had the same identity, but one face had a happy expression and the other had either an angry or a sad expression. Ideal observer models predict that the facial expressions should not affect the decision-making process. Our results, however, showed that participants had a prior disposition to select the happy face when it was paired with the angry but not the sad face and overweighted the positive outcomes associated with happy faces and underweighted positive outcomes associated with either angry or sad faces. Nevertheless, participants also integrated the feedback information. As such, their decisions were a composite of social and utilitarian factors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
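The ideal observer benchmark referred to above can be sketched as an expression-blind Bayesian learner that tracks each face's reward probability from feedback alone. The class name, priors, and the simulated probabilities below are illustrative assumptions, not the authors' model.

```python
import numpy as np

class IdealObserver:
    """Expression-blind learner for a two-face probabilistic reward task.

    Each face's unknown reward probability gets a Beta(1, 1) prior that is
    updated from observed outcomes; the observer picks the face with the
    higher posterior mean, ignoring whether it looks happy, angry, or sad.
    """

    def __init__(self, n_faces=2):
        self.wins = np.ones(n_faces)    # Beta alpha parameters
        self.losses = np.ones(n_faces)  # Beta beta parameters

    def choose(self):
        return int(np.argmax(self.wins / (self.wins + self.losses)))

    def update(self, face, rewarded):
        if rewarded:
            self.wins[face] += 1
        else:
            self.losses[face] += 1

# Illustrative simulation: the "happy" face (index 0) pays off 30% of the
# time and the other face 70%; feedback alone typically drives the observer
# to prefer face 1.
rng = np.random.default_rng(0)
observer, true_p = IdealObserver(), [0.3, 0.7]
for _ in range(200):
    face = observer.choose()
    observer.update(face, rng.random() < true_p[face])
print(observer.wins / (observer.wins + observer.losses))
```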

19.
We investigated how the emotionality of visual background context influenced perceptual ratings of faces. In two experiments, participants rated how positive or negative a face with a neutral expression (Experiment 1) or an unambiguous emotional expression (happy/angry; Experiment 2) appeared when viewed overlaid onto positive, negative, or neutral background context scenes. Faces viewed in a positive context were rated as appearing more positive than when in a neutral or negative context, and faces in negative contexts were rated more negative than when in a positive or neutral context, regardless of the emotional expression portrayed. Notably, congruency of valence in face expression and background context significantly influenced face ratings. These findings suggest that human judgments of faces are relative and significantly influenced by contextual factors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
The “face in the crowd effect” refers to the finding that threatening or angry faces are detected more efficiently among a crowd of distractor faces than happy or nonthreatening faces. Work establishing this effect has primarily utilized schematic stimuli, and efforts to extend the effect to real faces have yielded inconsistent results. The failure to consistently translate the effect from schematic to human faces raises questions about its ecological validity. The present study assessed the face in the crowd effect using a visual search paradigm that placed veridical faces, verified to exemplify prototypical emotional expressions, within heterogeneous crowds. Results confirmed that angry faces were found more quickly and accurately than happy expressions in crowds of both neutral and emotional distractors. These results are the first to extend the face in the crowd effect beyond homogeneous crowds to more ecologically valid conditions and thus provide compelling evidence for its legitimacy as a naturalistic phenomenon. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
