Similar Articles
20 similar articles found; search time 937 ms
1.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

2.
Cognitive models of psychopathology posit that the content or focus of information-processing biases (e.g., attentional biases) is disorder specific: Depression is hypothesized to be characterized by attentional biases specifically for depression-relevant stimuli (e.g., sad facial expressions), whereas anxiety should relate particularly to attentional biases to threat-relevant stimuli (e.g., angry faces). However, little research has investigated this specificity hypothesis, and none with a sample of youths. The present study examined attentional biases to emotional faces (sad, angry, and happy compared with neutral) in groups of pure depressed, pure anxious, comorbid depressed and anxious, and control youths (ages 9–17 years; N = 161). Consistent with cognitive models, pure depressed and pure anxious youths exhibited attentional biases specifically to sad and angry faces, respectively, whereas comorbid youths exhibited attentional biases to both facial expressions. In addition, control youths exhibited attentional avoidance of sad faces, and comorbid boys avoided happy faces. Overall, findings suggest that cognitive biases and processing of particular emotional information are specific to pure clinical depression and anxiety, and results inform etiological models of potentially specific processes that are associated with internalizing disorders among youths. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

4.
Theoretical models of attention for affective information have assigned a special status to the cognitive processing of emotional facial expressions. One specific claim in this regard is that emotional faces automatically attract visual attention. In three experiments, the authors investigated attentional cueing by angry, happy, and neutral facial expressions that were presented under conditions of limited awareness. In these experiments, facial expressions were presented in a masked (14 ms or 34 ms, masked by a neutral face) and unmasked fashion (34 ms or 100 ms). Compared with trials containing neutral cues, delayed responding was found on trials with emotional cues in the unmasked, 100-ms condition, suggesting stronger allocation of cognitive resources to emotional faces. However, in both masked and unmasked conditions, the hypothesized cueing of visual attention to the location of emotional facial expression was not found. On the contrary, attentional cueing by emotional faces was weaker than cueing by neutral faces in the unmasked, 100-ms condition. These data suggest that briefly presented emotional faces influence cognitive processing but do not automatically capture visual attention. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented either with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Neuroimaging data suggest that emotional information, especially threatening faces, automatically captures attention and receives rapid processing. While this is consistent with the majority of behavioral data, behavioral studies of the attentional blink (AB) additionally reveal that aversive emotional first target (T1) stimuli are associated with prolonged attentional engagement or “dwell” time. One explanation for this difference is that few AB studies have utilized manipulations of facial emotion as the T1. To address this, schematic faces varying in expression (neutral, angry, happy) served as the T1 in the current research. Results revealed, first, that the blink associated with an angry T1 face was of greater magnitude than that associated with either a neutral or happy T1 face and, second, that initial recovery from this processing bias was faster following angry, compared with happy, T1 faces. The current data therefore provide important information regarding the time-course of attentional capture by angry faces: Angry faces are associated with both the rapid capture and rapid release of attention. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

7.
In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features, especially the smiling mouth, is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together, these results suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
The authors examined face perception models with regard to the functional and temporal organization of facial identity and expression analysis. Participants performed a manual 2-choice go/no-go task to classify faces, where response hand depended on facial familiarity (famous vs. unfamiliar) and response execution depended on facial expression (happy vs. angry). Behavioral and electrophysiological markers of information processing—in particular, the lateralized readiness potential (LRP)—were recorded to assess the time course of facial identity and expression processing. The duration of facial identity and expression processes was manipulated in separate experiments, which allowed testing the differential predictions of alternative face perception models. Together, the reaction time and LRP findings indicate a parallel architecture of facial identity and expression analysis in which the analysis of facial expression relies on information about identity. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
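The signal detection analysis mentioned in this abstract separates recognition sensitivity from response bias by converting hit and false-alarm rates into the index d′. A minimal sketch of that computation (not the authors' actual analysis; the counts in the example and the log-linear correction are illustrative assumptions):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell, 1.0 to each
    denominator) keeps rates away from exactly 0 or 1, where the
    inverse normal CDF would be undefined.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: 18 hits / 2 misses on studied angry faces,
# 4 false alarms / 16 correct rejections on new angry faces.
print(d_prime(18, 2, 4, 16))  # positive d' = above-chance recognition
```

Equal hit and false-alarm rates yield d′ = 0 (chance performance), so group differences in d′ reflect memory accuracy rather than a mere tendency to respond "old".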

11.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

12.
Previous choice reaction time studies have provided consistent evidence for faster recognition of positive (e.g., happy) than negative (e.g., disgusted) facial expressions. A predominance of positive emotions in normal contexts may partly explain this effect. The present study used pleasant and unpleasant odors to test whether emotional context affects the happy face advantage. Results from 2 experiments indicated that happiness was recognized faster than disgust in a pleasant context, but this advantage disappeared in an unpleasant context because of the slow recognition of happy faces. Odors may modulate the functioning of those emotion-related brain structures that participate in the formation of the perceptual representations of the facial expressions and in the generation of the conceptual knowledge associated with the signaled emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Decision making is influenced by social cues, but there is little understanding of how social information interacts with other cues that determine decisions. To address this quantitatively, participants were asked to learn which of two faces was associated with a higher probability of reward. They were repeatedly presented with two faces, each with a different, unknown probability of reward, and participants attempted to maximize gains by selecting the face that was most often rewarded. Both faces had the same identity, but one face had a happy expression and the other had either an angry or a sad expression. Ideal observer models predict that the facial expressions should not affect the decision-making process. Our results, however, showed that participants had a prior disposition to select the happy face when it was paired with the angry but not the sad face, and that they overweighted the positive outcomes associated with happy faces and underweighted positive outcomes associated with either angry or sad faces. Nevertheless, participants also integrated the feedback information. As such, their decisions were a composite of social and utilitarian factors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
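The ideal observer benchmark invoked in this abstract can be sketched as a beta-Bernoulli learner: it tracks each face's reward probability purely from trial feedback, so the facial expression never enters the decision. A minimal illustration (not the authors' actual model; the Beta(1, 1) prior, greedy choice rule, and parameter values are assumptions for the sketch):

```python
import random

def ideal_observer(p_reward, n_trials=200, seed=0):
    """Pick between two options by posterior mean reward probability.

    Beliefs start at Beta(1, 1) for each option and are updated only
    by observed feedback; ties go to option 0. Returns the sequence of
    choices (0 or 1) across trials.
    """
    rng = random.Random(seed)
    wins = [1, 1]    # Beta alpha counts (prior + observed rewards)
    losses = [1, 1]  # Beta beta counts (prior + observed non-rewards)
    picks = []
    for _ in range(n_trials):
        means = [wins[i] / (wins[i] + losses[i]) for i in (0, 1)]
        choice = 0 if means[0] >= means[1] else 1
        rewarded = rng.random() < p_reward[choice]  # sample feedback
        wins[choice] += rewarded
        losses[choice] += not rewarded
        picks.append(choice)
    return picks

# With a 70% vs. 30% reward schedule the learner shifts toward option 0.
picks = ideal_observer([0.7, 0.3])
print(sum(p == 0 for p in picks[-50:]))  # option-0 picks late in learning
```

Because the learner's only input is the reward history, any systematic preference for the happy face beyond what feedback supports, as the participants showed, is a departure from this benchmark.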

14.
This study investigated the identification of facial expressions of emotion in currently nondepressed participants who had a history of recurrent depressive episodes (recurrent major depression; RMD) and never-depressed control participants (CTL). Following a negative mood induction, participants were presented with faces whose expressions slowly changed from neutral to full intensity. Identification of facial expressions was measured by the intensity of the expression at which participants could accurately identify whether faces expressed happiness, sadness, or anger. There were no group differences in the identification of sad or angry expressions. Compared with CTL participants, however, RMD participants required significantly greater emotional intensity in the faces to correctly identify happy expressions. These results indicate that biases in the processing of emotional facial expressions are evident even after individuals have recovered from a depressive episode. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Although some views of face perception posit independent processing of face identity and expression, recent studies suggest interactive processing of these 2 domains. The authors examined expression–identity interactions in visual short-term memory (VSTM) by assessing recognition performance in a VSTM task in which face identity was relevant and expression was irrelevant. Using study arrays of between 1 and 4 faces and a 1,000-ms retention interval, the authors measured recognition accuracy for just-seen faces. Results indicated that significantly more angry face identities can be stored in VSTM than happy or neutral face identities. Furthermore, the study provides evidence to exclude accounts for this angry face benefit based on physiological arousal, opportunity to encode, face discriminability, low-level feature recognition, expression intensity, or specific face sets. Perhaps processes activated by the presence of specifically angry expressions enhance VSTM because memory for the identities of angry people has particular behavioral relevance. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
We establish attentional capture by emotional distractor faces presented as a “singleton” in a search task in which the emotion is entirely irrelevant. Participants searched for a male (or female) target face among female (or male) faces and indicated whether the target face was tilted to the left or right. The presence (vs. absence) of an irrelevant emotional singleton expression (fearful, angry, or happy) in one of the distractor faces slowed search reaction times compared to the singleton absent or singleton target conditions. Facilitation for emotional singleton targets was found for the happy expression but not for the fearful or angry expressions. These effects were found irrespective of face gender, and the failure of a singleton neutral face to capture attention among emotional faces rules out a visual odd-one-out account for the emotional capture. The present study thus establishes irrelevant, emotional, attentional capture. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

17.
We investigated age differences in biased recognition of happy, neutral, or angry faces in 4 experiments. Experiment 1 revealed increased true and false recognition for happy faces in older adults, which persisted even when changing each face’s emotional expression from study to test in Experiment 2. In Experiment 3, we examined the influence of reduced memory capacity on the positivity-induced recognition bias, which showed the absence of emotion-induced memory enhancement but a preserved recognition bias for positive faces in patients with amnestic mild cognitive impairment compared with older adults with normal memory performance. In Experiment 4, we used semantic differentials to measure the connotations of happy and angry faces. Younger and older participants regarded happy faces as more familiar than angry faces, but the older group showed a larger recognition bias for happy faces. This finding indicates that older adults use a gist-based memory strategy based on a semantic association between positive emotion and familiarity. Moreover, older adults’ judgments of valence were more positive for both angry and happy faces, supporting the hypothesis of socioemotional selectivity. We propose that the positivity-induced recognition bias might be based on fluency, which in turn is based on both positivity-oriented emotional goals and on preexisting semantic associations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Is it easier to detect angry or happy facial expressions in crowds of faces? The present studies used several variations of the visual search task to assess whether people selectively attend to expressive faces. Contrary to widely cited studies (e.g., Öhman, Lundqvist, & Esteves, 2001) that suggest angry faces “pop out” of crowds, our review of the literature found inconsistent evidence for the effect and suggested that low-level visual confounds could not be ruled out as the driving force behind the anger superiority effect. We then conducted 7 experiments, carefully designed to eliminate many of the confounding variables present in past demonstrations. These experiments showed no evidence that angry faces popped out of crowds or even that they were efficiently detected. These experiments instead revealed a search asymmetry favoring happy faces. Moreover, in contrast to most previous studies, the happiness superiority effect was shown to be robust even when obvious perceptual confounds—like the contrast of white exposed teeth that are typically displayed in smiling faces—were eliminated in the happy targets. Rather than attribute this effect to the existence of innate happiness detectors, we speculate that the human expression of happiness has evolved to be more visually discriminable because its communicative intent is less ambiguous than other facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

19.
Converging data suggest that human facial behavior has an evolutionary basis. Combining these data with M. E. Seligman's (1970) preparedness theory, it was predicted that facial expressions of anger should be more readily associated with aversive events than should expressions of happiness. Two experiments involving differential electrodermal conditioning to pictures of faces, with electric shock as the unconditioned stimulus, were performed. In the 1st experiment, 32 undergraduates were exposed to 2 pictures of the same person, 1 with an angry and 1 with a happy expression. For half of the Ss, the shock followed the angry face, and for the other half, it followed the happy face. In the 2nd experiment, 3 groups of 48 undergraduates differentiated between pictures of male and female faces, both showing angry, neutral, and happy expressions. Responses to angry CSs showed significant resistance to extinction in both experiments, with a larger effect in Exp II. Responses to happy or neutral CSs, on the other hand, extinguished immediately when the shock was withheld. Results are related to conditioning to phobic stimuli and to the preparedness theory. (22 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression and social phobia and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号