Similar Literature
20 similar records found (search time: 15 ms)
1.
Reports an error in "Structural resemblance to emotional expressions predicts evaluation of emotionally neutral faces" by Christopher P. Said, Nicu Sebe and Alexander Todorov (Emotion, 2009[Apr], Vol 9[2], 260-264). In this article a symbol was incorrectly omitted from Figure 1, part C. To see the complete article with the corrected figure, please go to http://dx.doi.org/10.1037/a0014681. (The following abstract of the original article appeared in record 2009-04472-011.) People make trait inferences based on facial appearance despite little evidence that these inferences accurately reflect personality. The authors tested the hypothesis that these inferences are driven in part by structural resemblance to emotional expressions. The authors first had participants judge emotionally neutral faces on a set of trait dimensions. The authors then submitted the face images to a Bayesian network classifier trained to detect emotional expressions. By using a classifier, the authors can show that neutral faces perceived to possess various personality traits contain objective resemblance to emotional expression. In general, neutral faces that are perceived to have positive valence resemble happiness, faces that are perceived to have negative valence resemble disgust and fear, and faces that are perceived to be threatening resemble anger. These results support the idea that trait inferences are in part the result of an overgeneralization of emotion recognition systems. Under this hypothesis, emotion recognition systems, which typically extract accurate information about a person's emotional state, are engaged during the perception of neutral faces that bear subtle resemblance to emotional expressions. These emotions could then be misattributed as traits. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
[Correction Notice: An erratum for this article was reported in Vol 9(4) of Emotion (see record 2009-11528-009). In this article a symbol was incorrectly omitted from Figure 1, part C. To see the complete article with the corrected figure, please go to http://dx.doi.org/10.1037/a0014681.] People make trait inferences based on facial appearance despite little evidence that these inferences accurately reflect personality. The authors tested the hypothesis that these inferences are driven in part by structural resemblance to emotional expressions. The authors first had participants judge emotionally neutral faces on a set of trait dimensions. The authors then submitted the face images to a Bayesian network classifier trained to detect emotional expressions. By using a classifier, the authors can show that neutral faces perceived to possess various personality traits contain objective resemblance to emotional expression. In general, neutral faces that are perceived to have positive valence resemble happiness, faces that are perceived to have negative valence resemble disgust and fear, and faces that are perceived to be threatening resemble anger. These results support the idea that trait inferences are in part the result of an overgeneralization of emotion recognition systems. Under this hypothesis, emotion recognition systems, which typically extract accurate information about a person's emotional state, are engaged during the perception of neutral faces that bear subtle resemblance to emotional expressions. These emotions could then be misattributed as traits. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Findings of 7 studies suggested that decisions about the sex of a face and the emotional expressions of anger or happiness are not independent: Participants were faster and more accurate at detecting angry expressions on male faces and at detecting happy expressions on female faces. These findings were robust across different stimulus sets and judgment tasks and indicated bottom-up perceptual processes rather than just top-down conceptually driven ones. Results from additional studies in which neutrally expressive faces were used suggested that the connections between masculine features and angry expressions and between feminine features and happy expressions might be a property of the sexual dimorphism of the face itself and not merely a result of gender stereotypes biasing the perception. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

4.
A new model of mental representation is applied to social cognition: the attractor field model. Using the model, the authors predicted and found a perceptual advantage but a memory disadvantage for faces displaying evaluatively congruent expressions. In Experiment 1, participants completed a same/different perceptual discrimination task involving morphed pairs of angry-to-happy Black and White faces. Pairs of faces displaying evaluatively incongruent expressions (i.e., happy Black, angry White) were more likely to be labeled as similar and were less likely to be accurately discriminated from one another than faces displaying evaluatively congruent expressions (i.e., angry Black, happy White). Experiment 2 replicated this finding and showed that objective discriminability of stimuli moderated the impact of attractor field effects on perceptual discrimination accuracy. In Experiment 3, participants completed a recognition task for angry and happy Black and White faces. Consistent with the attractor field model, memory accuracy was better for faces displaying evaluatively incongruent expressions. Theoretical and practical implications of these findings are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Corrugator supercilii muscle activity is considered an objective measure of valence because it increases in response to negatively valenced facial expressions (angry) and decreases to positive expressions (happy). The authors sought to determine if corrugator activity could be used as an objective measure of positivity-negativity bias. The authors recorded corrugator responses as participants rated angry, happy, and surprised faces as “positive” or “negative.” The critical measure of bias was the percentage of positive versus negative ratings assigned to surprised faces by each participant. Reaction times for surprise expressions were longer than for happy and angry expressions, consistent with their ambiguous valence. Participants who tended to rate surprised faces as negative showed increased corrugator activity to surprised faces, whereas those who tended to rate surprise as positive showed decreased activity. Critically, corrugator responses reflected the participants’ bias (i.e., their tendency to rate surprise as positive or negative). These data show that surprised faces constitute a useful tool for assessing individual differences in positivity-negativity bias, and that corrugator activity can objectively reflect this bias. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Gaze direction influences younger adults' perception of emotional expressions, with direct gaze enhancing the perception of anger and joy, while averted gaze enhances the perception of fear. Age-related declines in emotion recognition and eye-gaze processing have been reported, indicating that there may be age-related changes in the ability to integrate these facial cues. As there is evidence of a positivity bias with age, age-related difficulties integrating these cues may be greatest for negative emotions. The present research investigated age differences in the extent to which gaze direction influenced explicit perception (e.g., anger, fear and joy; Study 1) and social judgments (e.g., of approachability; Study 2) of emotion faces. Gaze direction did not influence the perception of fear in either age group. In both studies, age differences were found in the extent to which gaze direction influenced judgments of angry and joyful faces, with older adults showing less integration of gaze and emotion cues than younger adults. Age differences were greatest when interpreting angry expressions. Implications of these findings for older adults' social functioning are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
We report two studies validating a new standardized set of filmed emotion expressions, the Amsterdam Dynamic Facial Expression Set (ADFES). The ADFES is distinct from existing datasets in that it includes a face-forward version and two different head-turning versions (faces turning toward and away from viewers), North-European as well as Mediterranean models (male and female), and nine discrete emotions (joy, anger, fear, sadness, surprise, disgust, contempt, pride, and embarrassment). Study 1 showed that the ADFES received excellent recognition scores. Recognition was affected by social categorization of the model: displays of North-European models were better recognized by Dutch participants, suggesting an ingroup advantage. Head-turning did not affect recognition accuracy. Study 2 showed that participants more strongly perceived themselves to be the cause of the other's emotion when the model's face turned toward the respondents. The ADFES provides new avenues for research on emotion expression and is available for researchers upon request. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

8.
Reliability, content, and homogeneity of own- and other-race impressions were assessed: US White, US Black, and Korean students rated faces of White, Black, or Korean men. High intraracial reliabilities revealed that people of one race showed equally high agreement regarding the traits of own- and other-race faces. Racially universal appearance stereotypes (the attractiveness halo effect and the babyface overgeneralization effect) contributed substantially to interracial agreement, which was only marginally lower than intraracial agreement. Moreover, similar attention to variations in appearance yielded similar degrees of own- and other-race trait differentiation. When own- and other-race differences in the differentiation of faces on babyfaceness were statistically controlled, differences in trait differentiation were eliminated. Despite these individuated impressions of other-race faces, certain racial stereotypes persisted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
Across 6 studies, factors signaling potential vulnerability to harm produced a bias toward outgroup categorization—a tendency to categorize unfamiliar others as members of an outgroup rather than as members of one's ingroup. Studies 1 through 4 demonstrated that White participants were more likely to categorize targets as Black (as opposed to White) when those targets displayed cues heuristically associated with threat (masculinity, movement toward the perceiver, and facial expressions of anger). In Study 5, White participants who felt chronically vulnerable to interpersonal threats responded to a fear manipulation by categorizing threatening (angry) faces as Black rather than White. Study 6 extended these findings to a minimal group paradigm, in which participants who felt chronically vulnerable to interpersonal threats categorized threatening (masculine) targets as outgroup members. Together, findings indicate that ecologically relevant threat cues within both the target and the perceiver interact to bias the way people initially parse the social world into ingroup vs. outgroup. Findings support a threat-based framework for intergroup psychology. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Examined the initial, unstructured interactions of 40 interracial (Black–White) dyads in which 3 factors were systematically varied. These factors were the disposition of the White dyad members to either seek out or shun interaction with Blacks, the race (Black vs White) of the experimenter, and the gender composition (male–male vs female–female) of the dyads. Results show that within dyads, White dyad members displayed more interactional involvement than their Black partners but experienced the interactions as more stressful and uncomfortable. Whites predisposed to avoid interaction with Blacks looked and smiled at their partners less than those predisposed to initiate interaction. Both Black and White members of these avoidance dyads reported heightened feelings of anxiety and concern about their interactions, but the moderating influences of the Whites' approach–avoidance dispositions on interaction behavior were essentially limited to conditions in which the experimenter was Black and the White S was a "solo minority." It is suggested that Black–White partner effects are attributable to differing amounts of cross-race contact typically experienced by Blacks and Whites. Black–White experimenter effects are interpreted in terms of S. E. Taylor's (1981) hypothesis that stereotypes and related dispositions are activated in social contexts in which group membership is made salient. (38 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

13.
The present study investigated age-related variations in judgments of the duration of angry facial expressions compared with neutral facial expressions. Children aged 3, 5, and 8 years were tested on a temporal bisection task using angry and neutral female faces. Results revealed that, in all age groups, children judged the duration of angry faces to be longer than that of neutral faces. Findings are discussed in the framework of internal clock models and the adaptive function of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Results from 2 experimental studies suggest that self-protection and mate-search goals lead to the perception of functionally relevant emotional expressions in goal-relevant social targets. Activating a self-protection goal led participants to perceive greater anger in Black male faces (Study 1) and Arab faces (Study 2), both out-groups heuristically associated with physical threat. In Study 2, participants' level of implicit Arab-threat associations moderated this bias. Activating a mate-search goal led male, but not female, participants to perceive more sexual arousal in attractive opposite-sex targets (Study 1). Activating these goals did not influence perceptions of goal-irrelevant targets. Additionally, participants with chronic self-protective and mate-search goals exhibited similar biases. Findings are consistent with a functionalist, motivation-based account of interpersonal perception. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
This study investigates the discrimination accuracy of emotional stimuli in subjects with major depression compared with healthy controls using photographs of facial expressions of varying emotional intensities. The sample included 88 unmedicated male and female subjects, aged 18–56 years, with major depressive disorder (n = 44) or no psychiatric illness (n = 44), who judged the emotion of 200 facial pictures displaying an expression between 10% (90% neutral) and 80% emotion. Stimuli were presented in 10% increments to generate a range of intensities, each presented for a 500-ms duration. Compared with healthy volunteers, depressed subjects showed very good recognition accuracy for sad faces but impaired recognition accuracy for other emotions (e.g., harsh and surprise expressions) of subtle emotional intensity. Recognition accuracy improved for both groups as a function of increased intensity on all emotions. Finally, as depressive symptoms increased, recognition accuracy increased for sad faces but decreased for surprised faces. Moreover, depressed subjects showed an impaired ability to accurately identify subtle facial expressions, indicating that depressive symptoms influence accuracy of emotional recognition. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
What does the “facial expression of disgust” communicate to children? When asked to label the emotion conveyed by different facial expressions widely used in research, children (N = 84, 4 to 9 years) were much more likely to label the “disgust face” as anger than as disgust, indeed just as likely as they were to label the “angry face” as anger. Shown someone with a disgust face and asked to generate a possible cause and consequence of that emotion, children provided answers indistinguishable from what they provided for an angry face—even for the minority who had labeled the disgust face as disgust. A majority of adults (N = 22) labeled the same disgust faces shown to the children as disgust and generated causes and consequences that implied disgust. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
The current series of studies provide converging evidence that facial expressions of fear and anger may have co-evolved to mimic mature and babyish faces in order to enhance their communicative signal. In Studies 1 and 2, fearful and angry facial expressions were manipulated to have enhanced babyish features (larger eyes) or enhanced mature features (smaller eyes) and in the context of a speeded categorization task in Study 1 and a visual noise paradigm in Study 2, results indicated that larger eyes facilitated the recognition of fearful facial expressions, while smaller eyes facilitated the recognition of angry facial expressions. Study 3 manipulated facial roundness, a stable structure that does not vary systematically with expressions, and found that congruency between maturity and expression (narrow face-anger; round face-fear) facilitated expression recognition accuracy. Results are discussed as representing a broad co-evolutionary relationship between facial maturity and fearful and angry facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
The authors examined whether the negative behavior of 1 Black male would influence White participants' perceptions of Black Americans and behavior toward another Black person. In Study 1, it was found that participants in the Black-negative condition tended to stereotype Blacks more than participants in the Black-control condition did. It was also found that participants who had observed a negative behavior, whether it was performed by a Black or a White confederate, avoided a subsequently encountered Black person more often than did participants in either the positive condition or the control condition. In a 2nd study, interpersonal interactions with a Black person were minimized only after participants observed the negative behavior of a Black confederate. Study 3 extended the findings of Study 1 by showing that group level stereotypes and the expression of ingroup favoritism resulted from simply overhearing a conversation in which a Black person was alleged to have committed a crime. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together these results seem to suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号