Similar Articles
20 similar articles found (search time: 15 ms)
1.
Cognitive models of psychopathology posit that the content or focus of information-processing biases (e.g., attentional biases) is disorder specific: Depression is hypothesized to be characterized by attentional biases specifically for depression-relevant stimuli (e.g., sad facial expressions), whereas anxiety should relate particularly to attentional biases to threat-relevant stimuli (e.g., angry faces). However, little research has investigated this specificity hypothesis and none with a sample of youths. The present study examined attentional biases to emotional faces (sad, angry, and happy compared with neutral) in groups of pure depressed, pure anxious, comorbid depressed and anxious, and control youths (ages 9–17 years; N = 161). Consistent with cognitive models, pure depressed and pure anxious youths exhibited attentional biases specifically to sad and angry faces, respectively, whereas comorbid youths exhibited attentional biases to both facial expressions. In addition, control youths exhibited attentional avoidance of sad faces, and comorbid boys avoided happy faces. Overall, findings suggest that cognitive biases and processing of particular emotional information are specific to pure clinical depression and anxiety, and results inform etiological models of potentially specific processes that are associated with internalizing disorders among youths. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
An information-processing paradigm was used to examine attentional biases in clinically depressed participants, participants with generalized anxiety disorder (GAD), and nonpsychiatric control participants for faces expressing sadness, anger, and happiness. Faces were presented for 1,000 ms, at which point depressed participants had directed their attention selectively to depression-relevant (i.e., sad) faces. This attentional bias was specific to the emotion of sadness; the depressed participants did not exhibit attentional biases to the angry or happy faces. This bias was also specific to depression; at 1,000 ms, participants with GAD were not attending selectively to sad, happy, or anxiety-relevant (i.e., angry) faces. Implications of these findings for both the cognitive and the interpersonal functioning of depressed individuals are discussed and directions for future research are advanced. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression and social phobia and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
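The dependent measure in this morph paradigm, the lowest expression intensity at which a participant first labels the emotion correctly, can be sketched as a simple scan over a graded morph sequence. This is an illustrative sketch only; the function name, step sizes, and responses below are invented, not taken from the study.

```python
def identification_threshold(intensities, responses, target_emotion):
    """Return the lowest morph intensity (e.g., % of full expression)
    at which the participant first correctly labels the target emotion,
    or None if the emotion is never identified."""
    for intensity, response in zip(intensities, responses):
        if response == target_emotion:
            return intensity
    return None

# Hypothetical trial: morph steps from near-neutral (10%) to full
# intensity (100%) in 10% increments; the participant first says
# "sad" at the 40% step.
steps = list(range(10, 101, 10))
answers = ["neutral", "neutral", "neutral"] + ["sad"] * 7
print(identification_threshold(steps, answers, "sad"))  # 40
```

A higher threshold for happy expressions in the depressed group, as reported above, would show up here as larger returned intensities for "happy" trials.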

4.
In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features, especially the smiling mouth, is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
There is evidence that specific regions of the face such as the eyes are particularly relevant for the decoding of emotional expressions, but it has not been examined whether scan paths of observers vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor scanning behavior of healthy participants while looking at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. Especially in sad facial expressions, participants more frequently issued the initial fixation to the eyes compared with all other expressions. In happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that both the eyes and mouth are equally important. However, in sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that not all facial expressions with different emotional content are decoded equally. Our data suggest that people look at regions that are most characteristic for each emotion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
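The dominance ratio defined above (dwell time on the eyes and mouth relative to the rest of the face) can be sketched as a computation over fixation records. The data structure and function names below are illustrative assumptions, not the study's actual analysis code.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    region: str        # "eyes", "mouth", or "other" (rest of the face)
    duration_ms: float

def dominance_ratio(fixations):
    """Total dwell time on eyes + mouth divided by dwell time on the
    rest of the face; values > 1 indicate feature dominance."""
    feature_time = sum(f.duration_ms for f in fixations
                       if f.region in ("eyes", "mouth"))
    other_time = sum(f.duration_ms for f in fixations
                     if f.region == "other")
    return feature_time / other_time if other_time > 0 else float("inf")

# Hypothetical trial: 600 ms on the eyes, 300 ms on the mouth,
# 300 ms elsewhere on the face.
trial = [Fixation("eyes", 600), Fixation("mouth", 300), Fixation("other", 300)]
print(dominance_ratio(trial))  # 3.0
```

Under this index, the sad and angry conditions reported above would additionally show the eyes term outweighing the mouth term within the numerator.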

6.
This study investigated the role of neutral, happy, fearful, and angry facial expressions in enhancing orienting to the direction of eye gaze. Photographs of faces with either direct or averted gaze were presented. A target letter (T or L) appeared unpredictably to the left or the right of the face, either 300 ms or 700 ms after gaze direction changed. Response times were faster in congruent conditions (i.e., when the eyes gazed toward the target) relative to incongruent conditions (when the eyes gazed away from the target letter). Facial expression did influence reaction times, but these effects were qualified by individual differences in self-reported anxiety. High trait-anxious participants showed an enhanced orienting to the eye gaze of faces with fearful expressions relative to all other expressions. In contrast, when the eyes stared straight ahead, trait anxiety was associated with slower responding when the facial expressions depicted anger. Thus, in anxiety-prone people attention is more likely to be held by an expression of anger, whereas attention is guided more potently by fearful facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

7.
Traditional models of face processing posit independent pathways for the processing of facial identity and facial expression (e.g., Bruce & Young, 1986). However, such models have been questioned by recent reports that suggest positive expressions may facilitate recognition (e.g., Baudouin et al., 2000), although little attention has been paid to the role of negative expressions. The current study used eye movement indicators to examine the influence of emotional expression (angry, happy, neutral) on the recognition of famous and novel faces. In line with previous research, the authors found some evidence that only happy expressions facilitate the processing of famous faces. However, the processing of novel faces was enhanced by the presence of an angry expression. Contrary to previous findings, this paper suggests that angry expressions also have an important role in the recognition process, and that the influence of emotional expression is modulated by face familiarity. The implications of this finding are discussed in relation to (1) current models of face processing, and (2) theories of oculomotor control in the viewing of facial stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

8.
This study investigated the time course of attentional responses to emotional facial expressions in a clinical sample with social phobia. With a visual probe task, photographs of angry, happy, and neutral faces were presented at 2 exposure durations: 500 and 1,250 ms. At 500 ms, the social phobia group showed enhanced vigilance for angry faces, relative to happy and neutral faces, in comparison with normal controls. In the 1,250-ms condition, there were no significant attentional biases in the social phobia group. Results are consistent with a bias in initial orienting to threat cues in social anxiety. Findings are discussed in relation to recent cognitive models of anxiety disorders. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
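Vigilance in a visual probe (dot-probe) task is conventionally indexed by a bias score: mean response time when the probe replaces the neutral face minus mean response time when it replaces the emotional face, so positive values indicate vigilance for that expression. A minimal sketch, with invented response times; this is the standard index for this paradigm, not code from the study itself.

```python
from statistics import mean

def bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
    """Attentional bias score in ms. Positive = faster responses when
    the probe appears at the emotional face's location (vigilance);
    negative = avoidance."""
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)

# Hypothetical RTs (ms) at the 500-ms exposure: faster responding when
# the probe replaces the angry face suggests vigilance for threat.
probe_at_angry = [520, 540, 510]
probe_at_neutral = [560, 580, 555]
print(round(bias_score(probe_at_neutral, probe_at_angry), 1))  # 41.7
```

The pattern reported above corresponds to a positive angry-face bias score at 500 ms in the social phobia group that disappears at 1,250 ms.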

9.
Recent studies have suggested that mood-incongruency effects are due to mood-regulatory processes, in which people retrieve positive memories to repair negative moods. In Study 1, the authors investigated whether dysphoria influences the accessibility of autobiographical memories following a positive or a negative mood induction combined with subsequent rumination or distraction. The results showed a mood-repair effect for nondysphoric but not for dysphoric participants following rumination. In Study 2, participants were asked to either distract themselves or to recall positive autobiographical memories after a negative mood induction. Whereas nondysphoric participants' mood improved under both conditions, dysphoric participants' mood improved only after distraction. These results suggest that dysphoria is associated with a reduced ability to use mood-incongruent recall to repair sad moods. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

10.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together these results seem to suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

11.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

12.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)  相似文献   

13.
The present electromyographic study is a first step toward shedding light on the involvement of affective processes in congruent and incongruent facial reactions to facial expressions. Further, empathy was investigated as a potential mediator underlying the modulation of facial reactions to emotional faces in a competitive, a cooperative, and a neutral setting. Results revealed less congruent reactions to happy expressions and even incongruent reactions to sad and angry expressions in the competition condition, whereas virtually no differences between the neutral and the cooperation condition occurred. Effects on congruent reactions were found to be mediated by cognitive empathy, indicating that the state of empathy plays an important role in the situational modulation of congruent reactions. Further, incongruent reactions to sad and angry faces in a competition setting were mediated by the emotional reaction of joy, supporting the assumption that incongruent facial reactions are mainly based on affective processes. Additionally, strategic processes (specifically, the goal to create and maintain a smooth, harmonious interaction) were found to influence facial reactions while being in a cooperative mindset. Further studies are needed to test the generalizability of these effects. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
Socioemotional selectivity theory postulates that with age, people are motivated to derive emotional meaning from life, leading them to pay more attention to positive relative to negative/neutral stimuli. The authors argue that cultures that differ in what they consider to be emotionally meaningful may show this preference to different extents. Using eye-tracking techniques, the authors compared visual attention toward emotional (happy, fearful, sad, and angry) and neutral facial expressions among 46 younger and 57 older Hong Kong Chinese. In contrast to prior Western findings, older but not younger Chinese looked away from happy facial expressions, suggesting that they do not show attentional preferences toward positive stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

15.
This study investigated whether the disengagement of attention from facial expression is modulated by gaze direction in infants. To this end, we measured the saccadic reaction time required for 10-month-old infants to disengage their attention from angry and happy expressions combined with either straight or averted gaze. The 10-month-olds' disengagement of their attention from happy faces was modulated by gaze direction. This finding indicates that gaze direction strongly influences infants' allocation of attention to facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Theoretical models of attention for affective information have assigned a special status to the cognitive processing of emotional facial expressions. One specific claim in this regard is that emotional faces automatically attract visual attention. In three experiments, the authors investigated attentional cueing by angry, happy, and neutral facial expressions that were presented under conditions of limited awareness. In these experiments, facial expressions were presented in a masked (14 ms or 34 ms, masked by a neutral face) and unmasked fashion (34 ms or 100 ms). Compared with trials containing neutral cues, delayed responding was found on trials with emotional cues in the unmasked, 100-ms condition, suggesting stronger allocation of cognitive resources to emotional faces. However, in both masked and unmasked conditions, the hypothesized cueing of visual attention to the location of emotional facial expression was not found. On the contrary, attentional cueing by emotional faces was weaker than by neutral faces in the unmasked, 100-ms condition. These data suggest that briefly presented emotional faces influence cognitive processing but do not automatically capture visual attention. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
This study investigated the identification of facial expressions of emotion in currently nondepressed participants who had a history of recurrent depressive episodes (recurrent major depression; RMD) and never-depressed control participants (CTL). Following a negative mood induction, participants were presented with faces whose expressions slowly changed from neutral to full intensity. Identification of facial expressions was measured by the intensity of the expression at which participants could accurately identify whether faces expressed happiness, sadness, or anger. There were no group differences in the identification of sad or angry expressions. Compared with CTL participants, however, RMD participants required significantly greater emotional intensity in the faces to correctly identify happy expressions. These results indicate that biases in the processing of emotional facial expressions are evident even after individuals have recovered from a depressive episode. (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

18.
We investigated facial recognition memory (for previously unfamiliar faces) and facial expression perception with functional magnetic resonance imaging (fMRI). Eight healthy, right-handed volunteers participated. For the facial recognition task, subjects made a decision as to the familiarity of each of 50 faces (25 previously viewed; 25 novel). We detected signal increase in the right middle temporal gyrus and left prefrontal cortex during presentation of familiar faces, and in several brain regions, including bilateral posterior cingulate gyri, bilateral insulae and right middle occipital cortex during presentation of unfamiliar faces. Standard facial expressions of emotion were used as stimuli in two further tasks of facial expression perception. In the first task, subjects were presented with alternating happy and neutral faces; in the second task, subjects were presented with alternating sad and neutral faces. During presentation of happy facial expressions, we detected a signal increase predominantly in the left anterior cingulate gyrus, bilateral posterior cingulate gyri, medial frontal cortex and right supramarginal gyrus, brain regions previously implicated in visuospatial and emotion processing tasks. No brain regions showed increased signal intensity during presentation of sad facial expressions. These results provide evidence for a distinction between the neural correlates of facial recognition memory and perception of facial expression but, whilst highlighting the role of limbic structures in perception of happy facial expressions, do not allow the mapping of a distinct neural substrate for perception of sad facial expressions.  相似文献   

19.
Empirical evidence shows an effect of gaze direction on cueing spatial attention, regardless of the emotional expression shown by a face, whereas a combined effect of gaze direction and facial expression has been observed on individuals' evaluative judgments. In 2 experiments, the authors investigated whether gaze direction and facial expression affect spatial attention depending upon the presence of an evaluative goal. Disgusted, fearful, happy, or neutral faces gazing left or right were followed by positive or negative target words presented either at the spatial location looked at by the face or at the opposite spatial location. Participants responded to target words based on affective valence (i.e., positive/negative) in Experiment 1 and on letter case (lowercase/uppercase) in Experiment 2. Results showed that participants responded much faster to targets presented at the spatial location looked at by disgusted or fearful faces, but only in Experiment 1, when an evaluative task was used. The present findings clearly show that negative facial expressions enhance the attentional shifts due to eye-gaze direction, provided that an explicit evaluative goal was present. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Neuroimaging data suggest that emotional information, especially threatening faces, automatically captures attention and receives rapid processing. While this is consistent with the majority of behavioral data, behavioral studies of the attentional blink (AB) additionally reveal that aversive emotional first target (T1) stimuli are associated with prolonged attentional engagement or "dwell" time. One explanation for this difference is that few AB studies have utilized manipulations of facial emotion as the T1. To address this, schematic faces varying in expression (neutral, angry, happy) served as the T1 in the current research. Results revealed that the blink associated with an angry T1 face was, primarily, of greater magnitude than that associated with either a neutral or happy T1 face, and also that initial recovery from this processing bias was faster following angry, compared with happy, T1 faces. The current data therefore provide important information regarding the time-course of attentional capture by angry faces: Angry faces are associated with both the rapid capture and rapid release of attention. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
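Attentional-blink magnitude, as discussed above, is commonly indexed by the drop in second-target (T2) report accuracy at short T1–T2 lags relative to long lags. The sketch below illustrates that conventional index with invented accuracy values and lag choices; it is not the analysis from this study.

```python
def ab_magnitude(t2_accuracy_by_lag, short_lags=(2, 3), long_lags=(7, 8)):
    """Attentional-blink magnitude: mean long-lag T2 accuracy minus
    mean short-lag T2 accuracy. Larger values = a deeper blink,
    i.e., longer attentional dwell on T1."""
    short = sum(t2_accuracy_by_lag[l] for l in short_lags) / len(short_lags)
    long = sum(t2_accuracy_by_lag[l] for l in long_lags) / len(long_lags)
    return long - short

# Invented T2 accuracies by lag: an angry T1 face yields a deeper
# blink (bigger short-lag deficit) than a happy T1 face.
angry_t1 = {2: 0.55, 3: 0.60, 7: 0.90, 8: 0.92}
happy_t1 = {2: 0.70, 3: 0.75, 7: 0.90, 8: 0.92}
print(ab_magnitude(angry_t1) > ab_magnitude(happy_t1))  # True
```

The "rapid release" finding reported above would additionally show up as a faster lag-by-lag recovery slope after an angry T1 than after a happy T1.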


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)