Similar Articles
A total of 20 similar articles were found (search time: 31 ms).
1.
Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented either with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

2.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

3.
The authors used connectionist modeling to extend previous research on emotion overgeneralization effects. Study 1 demonstrated that neutral expression male faces objectively resemble angry expressions more than female faces do, female faces objectively resemble surprise expressions more than male faces do, White faces objectively resemble angry expressions more than Black or Korean faces do, and Black faces objectively resemble happy and surprise expressions more than White faces do. Study 2 demonstrated that objective resemblance to emotion expressions influences trait impressions even when statistically controlling possible confounding influences of attractiveness and babyfaceness. It further demonstrated that emotion overgeneralization is moderated by face race and that racial differences in emotion resemblance contribute to White perceivers’ stereotypes of Blacks and Asians. These results suggest that intergroup relations may be strained not only by cultural stereotypes but also by adaptive responses to emotion expressions that are overgeneralized to groups whose faces subtly resemble particular emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   
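A minimal sketch of the general connectionist approach described above (not the authors' actual model or data): a small feedforward network is trained to classify posed emotion expressions from facial metric features, and the trained network's output activations for neutral faces are then read as an index of objective resemblance to each expression. All feature values and labels below are synthetic placeholders.
```python
# Illustrative sketch only: synthetic values stand in for facial metrics.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Training set: faces posing angry, happy, or surprise expressions,
# each described by 20 hypothetical facial metrics.
X_train = rng.normal(size=(300, 20))
y_train = rng.choice(["angry", "happy", "surprise"], size=300)

net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# Neutral-expression faces (e.g., grouped by sex or race) passed through
# the trained network; output activations serve as resemblance scores.
neutral_faces = rng.normal(size=(50, 20))
resemblance = net.predict_proba(neutral_faces)  # columns follow net.classes_

for label, score in zip(net.classes_, resemblance.mean(axis=0)):
    print(f"mean resemblance to {label}: {score:.3f}")
```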

4.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

5.
The present electromyographic study is a first step toward shedding light on the involvement of affective processes in congruent and incongruent facial reactions to facial expressions. Further, empathy was investigated as a potential mediator underlying the modulation of facial reactions to emotional faces in a competitive, a cooperative, and a neutral setting. Results revealed less congruent reactions to happy expressions and even incongruent reactions to sad and angry expressions in the competition condition, whereas virtually no differences between the neutral and the cooperation condition occurred. Effects on congruent reactions were found to be mediated by cognitive empathy, indicating that the state of empathy plays an important role in the situational modulation of congruent reactions. Further, incongruent reactions to sad and angry faces in a competition setting were mediated by the emotional reaction of joy, supporting the assumption that incongruent facial reactions are mainly based on affective processes. Additionally, strategic processes (specifically, the goal to create and maintain a smooth, harmonious interaction) were found to influence facial reactions while being in a cooperative mindset. Now, further studies are needed to test for the generalizability of these effects. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   
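A rough sketch of the mediation logic referred to above (condition → cognitive empathy → congruent facial reaction), using ordinary least squares and the product of the a- and b-paths as the indirect effect. The variable names and simulated data are illustrative assumptions; the authors' actual analysis (for example, any bootstrapping of the indirect effect) may differ.
```python
# Illustrative simple-mediation sketch with simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 120
condition = rng.integers(0, 2, n)  # hypothetical coding: 0 = cooperation, 1 = competition
empathy = 5.0 - 1.2 * condition + rng.normal(0, 1, n)               # mediator
congruence = 0.8 * empathy + 0.1 * condition + rng.normal(0, 1, n)  # outcome (e.g., EMG congruence index)

# a-path: condition -> mediator
a_model = sm.OLS(empathy, sm.add_constant(condition)).fit()
# b-path and direct effect: mediator and condition -> outcome
b_model = sm.OLS(congruence, sm.add_constant(np.column_stack([empathy, condition]))).fit()

a, b, direct = a_model.params[1], b_model.params[1], b_model.params[2]
print(f"indirect effect (a*b) = {a * b:.2f}, direct effect = {direct:.2f}")
```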

6.
We investigated age differences in biased recognition of happy, neutral, or angry faces in 4 experiments. Experiment 1 revealed increased true and false recognition for happy faces in older adults, which persisted even when changing each face’s emotional expression from study to test in Experiment 2. In Experiment 3, we examined the influence of reduced memory capacity on the positivity-induced recognition bias, which showed the absence of emotion-induced memory enhancement but a preserved recognition bias for positive faces in patients with amnestic mild cognitive impairment compared with older adults with normal memory performance. In Experiment 4, we used semantic differentials to measure the connotations of happy and angry faces. Younger and older participants regarded happy faces as more familiar than angry faces, but the older group showed a larger recognition bias for happy faces. This finding indicates that older adults use a gist-based memory strategy based on a semantic association between positive emotion and familiarity. Moreover, older adults’ judgments of valence were more positive for both angry and happy faces, supporting the hypothesis of socioemotional selectivity. We propose that the positivity-induced recognition bias might be based on fluency, which in turn is based on both positivity-oriented emotional goals and on preexisting semantic associations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

7.
Findings of 7 studies suggested that decisions about the sex of a face and the emotional expressions of anger or happiness are not independent: Participants were faster and more accurate at detecting angry expressions on male faces and at detecting happy expressions on female faces. These findings were robust across different stimulus sets and judgment tasks and indicated bottom-up perceptual processes rather than just top-down conceptually driven ones. Results from additional studies in which neutrally expressive faces were used suggested that the connections between masculine features and angry expressions and between feminine features and happy expressions might be a property of the sexual dimorphism of the face itself and not merely a result of gender stereotypes biasing the perception. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

8.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   
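As a rough illustration of the signal detection measures mentioned above (not the authors' code), recognition sensitivity (d′) and response bias (c) can be computed per expression category from hit and false-alarm counts; the log-linear correction used below is one common convention, not necessarily the one applied in the study.
```python
# Illustrative signal detection computation for one expression category.
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction keeps hit/false-alarm rates away from 0 and 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d_prime, criterion

# Hypothetical counts for angry-face recognition in one participant.
print(sdt_measures(hits=14, misses=6, false_alarms=4, correct_rejections=16))
```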

9.
We investigated how emotionality of visual background context influenced perceptual ratings of faces. In two experiments participants rated how positive or negative a face, with a neutral expression (Experiment 1), or unambiguous emotional expression (happy/angry; Experiment 2), appeared when viewed overlaid onto positive, negative, or neutral background context scenes. Faces viewed in a positive context were rated as appearing more positive than when in a neutral or negative context, and faces in negative contexts were rated more negative than when in a positive or neutral context, regardless of the emotional expression portrayed. Notably, congruency of valence in face expression and background context significantly influenced face ratings. These findings suggest that human judgements of faces are relative, and significantly influenced by contextual factors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

10.
Although some views of face perception posit independent processing of face identity and expression, recent studies suggest interactive processing of these 2 domains. The authors examined expression–identity interactions in visual short-term memory (VSTM) by assessing recognition performance in a VSTM task in which face identity was relevant and expression was irrelevant. Using study arrays of between 1 and 4 faces and a 1,000-ms retention interval, the authors measured recognition accuracy for just-seen faces. Results indicated that significantly more angry face identities can be stored in VSTM than happy or neutral face identities. Furthermore, the study provides evidence to exclude accounts for this angry face benefit based on physiological arousal, opportunity to encode, face discriminability, low-level feature recognition, expression intensity, or specific face sets. Perhaps processes activated by the presence of specifically angry expressions enhance VSTM because memory for the identities of angry people has particular behavioral relevance. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   
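One conventional way to express how many face identities are held in visual short-term memory in tasks of this kind is Cowan's K, computed from hit and false-alarm rates at each set size; the abstract does not state that the authors used this particular estimator, so the sketch below is only an illustration with invented rates.
```python
# Illustrative VSTM capacity estimate (Cowan's K) with hypothetical rates.
def cowans_k(set_size, hit_rate, false_alarm_rate):
    return set_size * (hit_rate - false_alarm_rate)

# Hypothetical recognition rates for 4-face study arrays.
print("angry identities:", cowans_k(4, 0.80, 0.15))   # = 2.6 items
print("happy identities:", cowans_k(4, 0.70, 0.20))   # = 2.0 items
```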

11.
Is it easier to detect angry or happy facial expressions in crowds of faces? The present studies used several variations of the visual search task to assess whether people selectively attend to expressive faces. Contrary to widely cited studies (e.g., Öhman, Lundqvist, & Esteves, 2001) that suggest angry faces “pop out” of crowds, our review of the literature found inconsistent evidence for the effect and suggested that low-level visual confounds could not be ruled out as the driving force behind the anger superiority effect. We then conducted 7 experiments, carefully designed to eliminate many of the confounding variables present in past demonstrations. These experiments showed no evidence that angry faces popped out of crowds or even that they were efficiently detected. These experiments instead revealed a search asymmetry favoring happy faces. Moreover, in contrast to most previous studies, the happiness superiority effect was shown to be robust even when obvious perceptual confounds—like the contrast of white exposed teeth that are typically displayed in smiling faces—were eliminated in the happy targets. Rather than attribute this effect to the existence of innate happiness detectors, we speculate that the human expression of happiness has evolved to be more visually discriminable because its communicative intent is less ambiguous than other facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
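For readers unfamiliar with how search efficiency is quantified in these paradigms, the conventional index is the slope of response time over display set size: slopes near 0 ms/item correspond to what is usually called "pop-out", and steeper slopes indicate less efficient search. The numbers below are invented for illustration and do not come from the study.
```python
# Illustrative search-slope computation with invented mean response times.
import numpy as np

set_sizes = np.array([4, 8, 12, 16])                 # items per display
rt_angry_target = np.array([620, 700, 785, 860])     # hypothetical mean RTs (ms)
rt_happy_target = np.array([600, 640, 690, 730])

for label, rts in [("angry target", rt_angry_target),
                   ("happy target", rt_happy_target)]:
    slope, intercept = np.polyfit(set_sizes, rts, 1)  # ms per additional item
    print(f"{label}: {slope:.1f} ms/item (intercept {intercept:.0f} ms)")
```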

12.
Effective filtering of distractor information has been shown to be dependent on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we wanted to explore the recognition memory for emotional distractors especially as a function of focused attention and distributed attention by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for distractor neutral and emotional faces. Participants performed a color discrimination task (low-load) or letter identification task (high-load) with a letter string display in Experiment 1 and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under conditions of less focused or distributed attention. When attention is more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information like sad or happy is associated with focused or distributed attention respectively. Distractor processing with emotional information also has implications for theories of attention. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

13.
The anger-superiority hypothesis states that angry faces are detected more efficiently than friendly faces. Previous research used schematized stimuli, which minimize perceptual confounds but lack ecological validity. The authors argue that a confounding of appearance and meaning is unavoidable and even unproblematic if real faces are presented. Four experiments tested carefully controlled photos in a search-asymmetry design. Experiments 1 and 2 revealed more efficient detection of an angry face among happy faces than vice versa. Experiment 3 indicated that the advantage was due to the mouth, but not to the eyes, and Experiment 4, using upright and inverted thatcherized faces, suggests a perceptual basis. The results are in line with a sensory-bias hypothesis that facial expressions evolved to exploit extant capabilities of the visual system. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
The other-race effect (ORE) in face recognition is typically observed in tasks which require long-term memory. Several studies, however, have found the effect early in face encoding (Lindsay, Jack, & Christian, 1991; Walker & Hewstone, 2006). In 6 experiments, with over 300 participants, we found no evidence that the recognition deficit associated with the ORE reflects deficits in immediate encoding. In Experiment 1, with a study-to-test retention interval of 4 min, participants were better able to recognise White faces, relative to Asian faces. Experiment 1 also validated the use of computer-generated faces in subsequent experiments. In Experiments 2 through 4, performance was virtually identical for Asian and White faces in immediate match-to-sample recognition. In Experiment 5, decreasing target-foil similarity and disrupting the retention interval with trivia questions elicited a re-emergence of the ORE. Experiments 6A and 6B replicated this effect, and showed that memory for Asian faces was particularly susceptible to distraction; White faces were recognised equally well, regardless of trivia questions during the retention interval. The recognition deficit in the ORE apparently emerges from retention or retrieval deficits, not differences in immediate perceptual processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Two experiments used a priming paradigm to investigate the influence of racial cues on the perceptual identification of weapons. In Experiment 1, participants identified guns faster when primed with Black faces compared with White faces. In Experiment 2, participants were required to respond quickly, causing the racial bias to shift from reaction time to accuracy. Participants misidentified tools as guns more often when primed with a Black face than with a White face. L. L. Jacoby's (1991) process dissociation procedure was applied to demonstrate that racial primes influenced automatic (A) processing, but not controlled (C) processing. The response deadline reduced the C estimate but not the A estimate. The motivation to control prejudice moderated the relationship between explicit prejudice and automatic bias. Implications are discussed on applied and theoretical levels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   
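The process dissociation estimates mentioned above are conventionally derived from two response probabilities: the probability of responding "gun" when a gun actually follows the prime, and the probability of responding "gun" when a tool follows it. A minimal sketch under that standard formulation, with hypothetical probabilities rather than the study's data, is shown below.
```python
# Standard process dissociation equations (Jacoby, 1991) as commonly applied
# to weapon identification: P("gun" | gun) = C + A(1 - C) and
# P("gun" | tool) = A(1 - C), solved for the controlled (C) and automatic (A)
# estimates. The probabilities below are hypothetical.
def process_dissociation(p_gun_given_gun, p_gun_given_tool):
    control = p_gun_given_gun - p_gun_given_tool
    automatic = p_gun_given_tool / (1.0 - control)
    return control, automatic

C, A = process_dissociation(p_gun_given_gun=0.85, p_gun_given_tool=0.25)
print(f"controlled estimate C = {C:.2f}, automatic estimate A = {A:.2f}")
```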

16.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

17.
This study investigated the role of neutral, happy, fearful, and angry facial expressions in enhancing orienting to the direction of eye gaze. Photographs of faces with either direct or averted gaze were presented. A target letter (T or L) appeared unpredictably to the left or the right of the face, either 300 ms or 700 ms after gaze direction changed. Response times were faster in congruent conditions (i.e., when the eyes gazed toward the target) relative to incongruent conditions (when the eyes gazed away from the target letter). Facial expression did influence reaction times, but these effects were qualified by individual differences in self-reported anxiety. High trait-anxious participants showed an enhanced orienting to the eye gaze of faces with fearful expressions relative to all other expressions. In contrast, when the eyes stared straight ahead, trait anxiety was associated with slower responding when the facial expressions depicted anger. Thus, in anxiety-prone people attention is more likely to be held by an expression of anger, whereas attention is guided more potently by fearful facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

18.
19.
Previous studies examining event-related potentials and evaluative priming have been mixed; some find evidence that evaluative priming influences the N400, whereas others find evidence that it affects the late positive potential (LPP). Three experiments were conducted using either affective pictures (Experiments 1 and 2) or words (Experiment 3) in a sequential evaluative priming paradigm. In line with previous behavioral findings, participants responded slower to targets that were evaluatively incongruent with the preceding prime (e.g., negative preceded by positive) compared to evaluatively congruent targets (e.g., negative preceded by negative). In all three studies, the LPP was larger to evaluatively incongruent targets compared to evaluatively congruent ones, and there was no evidence that evaluative incongruity influenced the N400 component. Thus, the present results provide additional support for the notion that evaluative priming influences the LPP and not the N400. We discuss possible reasons for the inconsistent findings in prior research and the theoretical implications of the findings for both evaluative and semantic priming. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

20.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   
