Similar Documents
20 similar documents found (search time: 945 ms).
1.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

2.
Previous binocular rivalry studies with younger adults have shown that emotional stimuli dominate perception over neutral stimuli. Here we investigated the effects of age on patterns of emotional dominance during binocular rivalry. Participants performed a face/house rivalry task where the emotion of the face (happy, angry, neutral) and orientation (upright, inverted) of the face and house stimuli were varied systematically. Age differences were found with younger adults showing a general emotionality effect (happy and angry faces were more dominant than neutral faces) and older adults showing inhibition of anger (neutral faces were more dominant than angry faces) and positivity effects (happy faces were more dominant than both angry and neutral faces). Age differences in dominance patterns were reflected by slower rivalry rates for both happy and angry compared to neutral face/house pairs in younger adults, and slower rivalry rates for happy compared to both angry and neutral face/house pairs in older adults. Importantly, these patterns of emotional dominance and slower rivalry rates for emotional-face/house pairs disappeared when the stimuli were inverted. This suggests that emotional valence, and not low-level image features, were responsible for the emotional bias in both age groups. Given that binocular rivalry has a limited role for voluntary control, the findings imply that anger suppression and positivity effects in older adults may extend to more automatic tasks. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

3.
Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented either with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Decision making is influenced by social cues, but there is little understanding of how social information interacts with other cues that determine decisions. To address this quantitatively, participants were asked to learn which of two faces was associated with a higher probability of reward. They were repeatedly presented with two faces, each with a different, unknown probability of reward, and participants attempted to maximize gains by selecting the face that was most often rewarded. Both faces had the same identity, but one face had a happy expression and the other had either an angry or a sad expression. Ideal observer models predict that the facial expressions should not affect the decision-making process. Our results, however, showed that participants had a prior disposition to select the happy face when it was paired with the angry but not the sad face and overweighted the positive outcomes associated with happy faces and underweighted positive outcomes associated with either angry or sad faces. Nevertheless, participants also integrated the feedback information. As such, their decisions were a composite of social and utilitarian factors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
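The ideal observer referred to in this abstract is one that tracks only reward feedback and ignores facial expression entirely. The sketch below is a minimal illustration of such an observer, assuming Bernoulli rewards and uniform Beta priors; the reward probabilities and trial count are illustrative assumptions, not the study's parameters.

```python
import random

def ideal_observer(p_reward, n_trials=200, seed=0):
    """Beta-Bernoulli ideal observer for a two-option reward-learning task.

    p_reward: (p_a, p_b), the true reward probabilities (unknown to the
    observer). Only outcomes are tracked, so the faces' emotional
    expressions play no role in its choices.
    """
    rng = random.Random(seed)
    alpha = [1.0, 1.0]  # Beta(1, 1) priors over each option's reward rate
    beta = [1.0, 1.0]
    total_reward = 0
    for _ in range(n_trials):
        # Choose the option with the higher posterior mean reward probability.
        means = [alpha[i] / (alpha[i] + beta[i]) for i in (0, 1)]
        choice = 0 if means[0] >= means[1] else 1
        rewarded = rng.random() < p_reward[choice]
        total_reward += rewarded
        # Update the posterior for the chosen option only.
        if rewarded:
            alpha[choice] += 1
        else:
            beta[choice] += 1
    return total_reward

# Example: one face rewarded 70% of the time, the other 30%.
print(ideal_observer((0.7, 0.3)))
```

A human participant whose choices are biased toward the happy face would depart from this benchmark even when the happy face is the less-rewarded option, which is the composite of social and utilitarian factors the abstract describes.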

5.
Converging data suggest that human facial behavior has an evolutionary basis. Combining these data with M. E. Seligman's (1970) preparedness theory, it was predicted that facial expressions of anger should be more readily associated with aversive events than should expressions of happiness. Two experiments involving differential electrodermal conditioning to pictures of faces, with electric shock as the unconditioned stimulus, were performed. In the 1st experiment, 32 undergraduates were exposed to 2 pictures of the same person, 1 with an angry and 1 with a happy expression. For half of the Ss, the shock followed the angry face, and for the other half, it followed the happy face. In the 2nd experiment, 3 groups of 48 undergraduates differentiated between pictures of male and female faces, both showing angry, neutral, and happy expressions. Responses to angry CSs showed significant resistance to extinction in both experiments, with a larger effect in Exp II. Responses to happy or neutral CSs, on the other hand, extinguished immediately when the shock was withheld. Results are related to conditioning to phobic stimuli and to the preparedness theory. (22 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together these results seem to suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

8.
In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
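The "computationally modeled visual saliency" mentioned here refers to a full saliency model; the sketch below only illustrates the general center-surround idea behind such models (a coarse luminance-contrast proxy), and the image, region coordinates, and parameters are illustrative assumptions rather than anything from the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def center_surround_saliency(image, center_sigma=2.0, surround_sigma=8.0):
    """Crude luminance-contrast saliency map: absolute difference between a
    fine-scale and a coarse-scale Gaussian blur (a center-surround proxy)."""
    image = image.astype(float)
    center = gaussian_filter(image, center_sigma)
    surround = gaussian_filter(image, surround_sigma)
    saliency = np.abs(center - surround)
    return saliency / (saliency.max() + 1e-12)  # normalize to [0, 1]

def mean_region_saliency(saliency, region_mask):
    """Average saliency inside a region of interest (e.g., a mouth mask)."""
    return float(saliency[region_mask].mean())

# Toy usage with a random grayscale "face" and a hypothetical mouth region.
face = np.random.rand(128, 128)
sal = center_surround_saliency(face)
mouth_mask = np.zeros_like(face, dtype=bool)
mouth_mask[90:110, 40:88] = True
print(mean_region_saliency(sal, mouth_mask))
```

Comparing mean saliency across regions (mouth vs. eyes) and across expressions is the kind of analysis that would link image-level salience to the fixation and detection advantages reported in the abstract.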

9.
The “face in the crowd effect” refers to the finding that threatening or angry faces are detected more efficiently among a crowd of distractor faces than happy or nonthreatening faces. Work establishing this effect has primarily utilized schematic stimuli, and efforts to extend the effect to real faces have yielded inconsistent results. The failure to consistently translate the effect from schematic to human faces raises questions about its ecological validity. The present study assessed the face in the crowd effect using a visual search paradigm that placed veridical faces, verified to exemplify prototypical emotional expressions, within heterogeneous crowds. Results confirmed that angry faces were found more quickly and accurately than happy expressions in crowds of both neutral and emotional distractors. These results are the first to extend the face in the crowd effect beyond homogeneous crowds to more ecologically valid conditions and thus provide compelling evidence for its legitimacy as a naturalistic phenomenon. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Is it easier to detect angry or happy facial expressions in crowds of faces? The present studies used several variations of the visual search task to assess whether people selectively attend to expressive faces. Contrary to widely cited studies (e.g., Öhman, Lundqvist, & Esteves, 2001) that suggest angry faces “pop out” of crowds, our review of the literature found inconsistent evidence for the effect and suggested that low-level visual confounds could not be ruled out as the driving force behind the anger superiority effect. We then conducted 7 experiments, carefully designed to eliminate many of the confounding variables present in past demonstrations. These experiments showed no evidence that angry faces popped out of crowds or even that they were efficiently detected. These experiments instead revealed a search asymmetry favoring happy faces. Moreover, in contrast to most previous studies, the happiness superiority effect was shown to be robust even when obvious perceptual confounds—like the contrast of white exposed teeth that are typically displayed in smiling faces—were eliminated in the happy targets. Rather than attribute this effect to the existence of innate happiness detectors, we speculate that the human expression of happiness has evolved to be more visually discriminable because its communicative intent is less ambiguous than other facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

11.
Event-related potentials were used to examine the recognition of happy and angry faces by 4- to 6-year-old children. In 2 experiments, Ss viewed 100-ms presentations of a happy face and an angry face posed by a single model. The frequency with which these expressions were presented varied across experiments, and which face served as the target or nontarget stimulus varied within experiments. In Experiment 1, an early negative component (N400) was observed that distinguished between the 2 expressions, and a 2nd, later positive component (P700) was observed that distinguished between target and nontarget events. In Experiment 2, these components were again observed, although both now distinguished only between low- and high-probability events. Both were absent at posterior scalp, were most prominent at parietal and central scalp, and were minimal at frontal scalp. These results are discussed in the context of children's allocation of attentional and memory resources for briefly presented affective stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Examined intermodal perception of vocal and facial expressions in 2 experiments with 16 5- and 16 7-mo-olds. Two filmed facial expressions were presented with a single vocal expression characteristic of 1 of the facial expressions (angry or happy). The lower third of each face was obscured, so Ss could not simply match lip movements to the voice. Overall findings indicate that only 7-mo-olds increased their fixation to a facial expression when it was sound-specified. Older infants evidently detected information that was invariant across the presentations of a single affective expression, despite degradation of temporal synchrony information. The 5-mo-olds' failure to look differentially is explained by the possibilities that (1) 5-mo-olds may need to see the whole face for any discrimination of expressions to occur; (2) they cannot discriminate films of happy and angry facial expressions even with the full face available; or (3) they rely heavily on temporal information for the discrimination of facial expressions and/or the intermodal perception of bimodally presented expressions, although not for articulatory patterns. Preferences for a particular expression were not found: Infants did not look longer at the happy or the angry facial expression, independent of the sound manipulation, suggesting that preferences for happy expressions found in prior studies may rest on attention to the "toothy" smile. (25 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
We establish attentional capture by emotional distractor faces presented as a “singleton” in a search task in which the emotion is entirely irrelevant. Participants searched for a male (or female) target face among female (or male) faces and indicated whether the target face was tilted to the left or right. The presence (vs. absence) of an irrelevant emotional singleton expression (fearful, angry, or happy) in one of the distractor faces slowed search reaction times compared to the singleton absent or singleton target conditions. Facilitation for emotional singleton targets was found for the happy expression but not for the fearful or angry expressions. These effects were found irrespective of face gender, and the failure of a singleton neutral face to capture attention among emotional faces rules out a visual odd-one-out account for the emotional capture. The present study thus establishes irrelevant, emotional, attentional capture. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
Seven experiments investigated the finding that threatening schematic faces are detected more quickly than nonthreatening faces. Threatening faces with v-shaped eyebrows (angry and scheming expressions) were detected more quickly than nonthreatening faces with A-shaped eyebrows (happy and sad expressions). In contrast to the hypothesis that these effects were due to perceptual features unrelated to the face, no advantage was found for v-shaped eyebrows presented in a nonfacelike object. Furthermore, the addition of internal facial features (the eyes, or the nose and mouth) was necessary to produce the detection advantage for faces with v-shaped eyebrows. Overall, the results are interpreted as showing that the v-shaped eyebrow configuration affords easy detection, but only when other internal facial features are present. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Traditional models of face processing posit independent pathways for the processing of facial identity and facial expression (e.g., Bruce & Young, 1986). However, such models have been questioned by recent reports that suggest positive expressions may facilitate recognition (e.g., Baudouin et al., 2000), although little attention has been paid to the role of negative expressions. The current study used eye movement indicators to examine the influence of emotional expression (angry, happy, neutral) on the recognition of famous and novel faces. In line with previous research, the authors found some evidence that only happy expressions facilitate the processing of famous faces. However, the processing of novel faces was enhanced by the presence of an angry expression. Contrary to previous findings, this paper suggests that angry expressions also have an important role in the recognition process, and that the influence of emotional expression is modulated by face familiarity. The implications of this finding are discussed in relation to (1) current models of face processing, and (2) theories of oculomotor control in the viewing of facial stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Although some views of face perception posit independent processing of face identity and expression, recent studies suggest interactive processing of these 2 domains. The authors examined expression–identity interactions in visual short-term memory (VSTM) by assessing recognition performance in a VSTM task in which face identity was relevant and expression was irrelevant. Using study arrays of between 1 and 4 faces and a 1,000-ms retention interval, the authors measured recognition accuracy for just-seen faces. Results indicated that significantly more angry face identities can be stored in VSTM than happy or neutral face identities. Furthermore, the study provides evidence to exclude accounts for this angry face benefit based on physiological arousal, opportunity to encode, face discriminability, low-level feature recognition, expression intensity, or specific face sets. Perhaps processes activated by the presence of specifically angry expressions enhance VSTM because memory for the identities of angry people has particular behavioral relevance. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
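A common way to express "how many face identities can be stored" from recognition accuracy in tasks with varying set sizes is Cowan's K (capacity ≈ set size × (hit rate − false-alarm rate)). Whether the authors used exactly this measure is an assumption here; the numbers below are made up purely to show how a larger capacity estimate for angry identities would look.

```python
def cowans_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K estimate of visual short-term memory capacity:
    K = N * (H - FA), where N is the number of items in the study array."""
    return set_size * (hit_rate - false_alarm_rate)

# Illustrative (made-up) rates for a 4-face study array:
print(cowans_k(4, hit_rate=0.85, false_alarm_rate=0.15))  # angry faces: K = 2.8
print(cowans_k(4, hit_rate=0.75, false_alarm_rate=0.20))  # happy faces: K = 2.2
```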

17.
The authors used connectionist modeling to extend previous research on emotion overgeneralization effects. Study 1 demonstrated that neutral expression male faces objectively resemble angry expressions more than female faces do, female faces objectively resemble surprise expressions more than male faces do, White faces objectively resemble angry expressions more than Black or Korean faces do, and Black faces objectively resemble happy and surprise expressions more than White faces do. Study 2 demonstrated that objective resemblance to emotion expressions influences trait impressions even when statistically controlling possible confounding influences of attractiveness and babyfaceness. It further demonstrated that emotion overgeneralization is moderated by face race and that racial differences in emotion resemblance contribute to White perceivers’ stereotypes of Blacks and Asians. These results suggest that intergroup relations may be strained not only by cultural stereotypes but also by adaptive responses to emotion expressions that are overgeneralized to groups whose faces subtly resemble particular emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
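The "objective resemblance" idea is that a model trained to recognize an emotion from facial measurements will respond partially to neutral faces that share that emotion's features. The sketch below is only a toy analogue of that logic, not the authors' network: it trains a single logistic classifier on synthetic feature vectors (all group names, feature counts, and numbers are illustrative assumptions).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical facial-metric feature vectors (e.g., brow distance, mouth
# curvature); the actual studies used measured facial metrics, not random data.
n_per_class, n_features = 50, 10
angry_faces = rng.normal(loc=1.0, size=(n_per_class, n_features))
other_faces = rng.normal(loc=0.0, size=(n_per_class, n_features))

X = np.vstack([angry_faces, other_faces])
y = np.array([1] * n_per_class + [0] * n_per_class)  # 1 = angry expression

# A minimal "connectionist" unit: one logistic output trained to detect anger.
clf = LogisticRegression().fit(X, y)

# Score neutral faces from two hypothetical groups for objective anger
# resemblance, i.e., the model's anger probability for each neutral face.
neutral_group_a = rng.normal(loc=0.4, size=(20, n_features))
neutral_group_b = rng.normal(loc=0.1, size=(20, n_features))
print("group A anger resemblance:", clf.predict_proba(neutral_group_a)[:, 1].mean())
print("group B anger resemblance:", clf.predict_proba(neutral_group_b)[:, 1].mean())
```

A systematically higher output for one group's neutral faces is what the abstract means by those faces "objectively resembling" an emotion expression.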

18.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
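The recognition-accuracy result is reported via signal detection analyses; the standard sensitivity index in that framework is d' = z(hit rate) − z(false-alarm rate). A minimal sketch follows; the hit and false-alarm rates are illustrative assumptions, not the study's data.

```python
from scipy.stats import norm

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: d' = z(H) - z(FA)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Illustrative (made-up) recognition rates for angry faces:
print(d_prime(0.80, 0.30))  # dysphoric group, roughly d' = 1.37
print(d_prime(0.65, 0.30))  # nondysphoric group, roughly d' = 0.91
```

A higher d' for the dysphoric group on angry faces, with no group difference for the other expressions, is the pattern the abstract describes.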

19.
Recent studies have suggested that older individuals selectively forget negative information. However, findings on a positivity effect in the attention of older adults have been more mixed. In the current study, eye tracking was used to record visual fixation in nearly real-time to investigate whether older individuals show a positivity effect in their visual attention to emotional information. Young and old individuals (N = 64) viewed pairs of synthetic faces that included the same face in a nonemotional expression and in 1 of 4 emotional expressions (happiness, sadness, anger, or fear). Gaze patterns were recorded as individuals viewed the face pairs. Older adults showed an attentional preference toward happy faces and away from angry ones; the only preference shown by young adults was toward afraid faces. The age groups were not different in overall cognitive functioning, suggesting that these attentional differences are specific and motivated rather than due to general cognitive change with age. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
