Similar Articles
20 similar articles found (search time: 15 ms)
1.
This study was designed to examine attentional biases in the processing of emotional faces in currently and formerly depressed participants and healthy controls. Using a dot-probe task, the authors presented faces expressing happy or sad emotions paired with emotionally neutral faces. Whereas both currently and formerly depressed participants selectively attended to the sad faces, the control participants selectively avoided the sad faces and oriented toward the happy faces, a positive bias that was not observed for either of the depressed groups. These results indicate that attentional biases in the processing of emotional faces are evident even after individuals have recovered from a depressive episode. Implications of these findings for understanding the roles of cognitive and interpersonal functioning in depression are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
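In a dot-probe task like the one above, selective attention is typically quantified as a reaction-time bias score: faster responses to probes that replace the emotional face indicate attention toward it. A minimal sketch of that computation, with hypothetical reaction times (the original study's exact scoring procedure is not given here):

```python
# Attentional bias score for a dot-probe task: a minimal sketch.
# Assumed trial layout: an emotional and a neutral face appear side by
# side, then a probe replaces one of them. Responding faster when the
# probe appears at the emotional face's location indicates attention
# toward that face.

def bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
    """Mean RT difference in ms; positive = attention toward the emotional face."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)

# Hypothetical reaction times (ms), for illustration only
sad_bias = bias_score([512, 498, 530], [470, 465, 480])
print(round(sad_bias, 1))  # 41.7 → vigilance toward sad faces
```

A positive score for sad-neutral pairs would correspond to the vigilance toward sad faces reported for the depressed groups; a negative score would correspond to the avoidance shown by controls.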

2.
The claim that specific discrete emotions can be universally recognized from human facial expressions is based mainly on the study of expressions that were posed. The current study (N=50) examined recognition of emotion from 20 spontaneous expressions from Papua New Guinea photographed, coded, and labeled by P. Ekman (1980). For the 16 faces with a single predicted label, endorsement of that label ranged from 4.2% to 45.8% (mean 24.2%). For 4 faces with 2 predicted labels (blends), endorsement of one or the other ranged from 6.3% to 66.6% (mean 38.8%). Of the 24 labels Ekman predicted, 11 were endorsed at an above-chance level, and 13 were not. Spontaneous expressions do not achieve the level of recognition achieved by posed expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
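Whether a label is "endorsed at an above-chance level," as in the study above, is commonly assessed with an exact one-sided binomial test. A minimal stdlib-only sketch; the chance level of 1/6 (six response options) and the endorsement count are assumptions for illustration, not the study's actual parameters:

```python
# Exact one-sided binomial test for above-chance label endorsement.
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n = 50           # observers
endorsed = 21    # hypothetical count endorsing the predicted label
p_chance = 1 / 6 # assumed chance level with six response options
p_value = binom_sf(endorsed, n, p_chance)
print(p_value < 0.05)  # True → endorsement exceeds chance
```

With 50 observers and six options, chance predicts about 8 endorsements, so 21 endorsements would be well above chance; counts near 8 would not be.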

3.
The authors previously reported that normal subjects are better at discriminating happy from neutral faces when the happy face is located to the viewer's right of the neutral face; conversely, discrimination of sad from neutral faces is better when the sad face is shown to the left, supporting a role for the left hemisphere in processing positive valence and for the right hemisphere in processing negative valence. Here, the authors extend this same task to subjects with unilateral cerebral damage (31 right, 28 left). Subjects with right damage performed worse when discriminating sad faces shown on the left, consistent with the prior findings. However, subjects with either left or right damage actually performed superior to normal controls when discriminating happy faces shown on the left. The authors suggest that perception of negative valence relies preferentially on the right hemisphere, whereas perception of positive valence relies on both left and right hemispheres. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
The present electromyographic study is a first step toward shedding light on the involvement of affective processes in congruent and incongruent facial reactions to facial expressions. Further, empathy was investigated as a potential mediator underlying the modulation of facial reactions to emotional faces in a competitive, a cooperative, and a neutral setting. Results revealed less congruent reactions to happy expressions and even incongruent reactions to sad and angry expressions in the competition condition, whereas virtually no differences between the neutral and the cooperation condition occurred. Effects on congruent reactions were found to be mediated by cognitive empathy, indicating that the state of empathy plays an important role in the situational modulation of congruent reactions. Further, incongruent reactions to sad and angry faces in a competition setting were mediated by the emotional reaction of joy, supporting the assumption that incongruent facial reactions are mainly based on affective processes. Additionally, strategic processes (specifically, the goal to create and maintain a smooth, harmonious interaction) were found to influence facial reactions while being in a cooperative mindset. Further studies are needed to test the generalizability of these effects. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

5.
[Correction Notice: An erratum for this article was reported in Vol 7(4) of Emotion (see record 2007-17748-022). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum.] Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
There is evidence that specific regions of the face such as the eyes are particularly relevant for the decoding of emotional expressions, but it has not been examined whether scan paths of observers vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor scanning behavior of healthy participants while looking at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. For sad facial expressions in particular, participants directed the initial fixation to the eyes more frequently than for all other expressions. In happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that both the eyes and mouth are equally important. However, in sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that not all facial expressions with different emotional content are decoded equally. Our data suggest that people look at regions that are most characteristic for each emotion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
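A dominance ratio of the kind described above can be computed from region-labeled fixation durations. A minimal sketch, assuming the ratio is total fixation time on the eyes and mouth divided by time on the rest of the face (the study's exact formula is not given here, and the region labels and durations are hypothetical):

```python
# Dominance ratio from eye-tracking fixations: a minimal sketch.

def dominance_ratio(fixations):
    """fixations: list of (region, duration_ms) pairs.
    Returns (eyes + mouth time) / (time on the rest of the face)."""
    core = sum(d for r, d in fixations if r in ("eyes", "mouth"))
    rest = sum(d for r, d in fixations if r not in ("eyes", "mouth"))
    return core / rest if rest else float("inf")

# Hypothetical fixation record for one trial
trial = [("eyes", 320), ("mouth", 180), ("forehead", 90), ("cheek", 110)]
print(dominance_ratio(trial))  # 500 / 200 = 2.5
```

Under this definition, a higher ratio means the eyes and mouth dominated viewing; comparing the eyes term against the mouth term separately would capture the eye-versus-mouth asymmetries reported for sad and angry faces.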

7.
Binocular rivalry is the perceptual alternation between two incompatible stimuli presented simultaneously but to each eye separately. The observer's perception switches back and forth between the two stimuli that are competing for perceptual dominance. In two studies, pictures of emotional faces (disgust and happy) were pitted against each other or against pictures of faces with neutral expressions. Study 1 demonstrated that (a) emotional facial expressions predominate over neutral expressions, and (b) positive facial expressions predominate over negative facial expressions (i.e., positivity bias). Study 2 examined individual differences in emotional predominance and positivity bias during binocular rivalry. Although the positivity bias was not affected by the levels of depressive symptoms, results demonstrated that emotional predominance diminished as the level of depressive symptoms increased. These results indicate that individuals who report more depressive symptoms compared to their less depressed counterparts tend to assign more meaning to neutral faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
A forced-choice intensity judgment task was used to investigate biases in the processing of subtle expressions of emotion in participants with major depressive disorder (MDD). Participants were presented with 2 pictures of the same actor side by side, either depicting a neutral and a subtle emotional expression or depicting a subtle positive and a subtle negative expression. Participants were asked to indicate which of the 2 pictures showed the stronger emotion. Compared with participants with social anxiety disorder (SAD) and with never-disordered controls (CTLs), participants with MDD were less likely to judge subtle happy expressions as more intense than neutral expressions. In addition, compared with the CTL participants, participants who had MDD and participants who had SAD were less likely to judge subtle happy expressions to be more intense than negative expressions. Biases in the judgment of the intensity of subtle expressions of positive affect could play an important role in the interpersonal difficulties that are associated with depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression, participants diagnosed with social phobia, and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
The study tests the hypothesis of an embodied associative triangle among relative tone pitch (i.e., high or low tones), vertical movement, and facial emotion. In particular, it is tested whether relative pitch automatically activates facial expressions of happiness and anger as well as vertical head movements. Results show robust congruency effects: happiness expressions and upward head tilts are imitated faster when paired with high rather than low tones, while anger expressions and downward head tilts are imitated faster when paired with low rather than high tones. The results add to the growing evidence favoring an embodiment account that emphasizes multimodal representations as the basis of cognition, emotion, and action. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

12.
The study of the spontaneous expressions of blind individuals offers a unique opportunity to understand basic processes concerning the emergence and source of facial expressions of emotion. In this study, the authors compared the expressions of congenitally and noncongenitally blind athletes in the 2004 Paralympic Games with each other and with those produced by sighted athletes in the 2004 Olympic Games. The authors also examined how expressions change from 1 context to another. There were no differences between congenitally blind, noncongenitally blind, and sighted athletes, either on the level of individual facial actions or in facial emotion configurations. Blind athletes did produce more overall facial activity, but these differences were confined to head and eye movements. The blind athletes' expressions differentiated whether they had won or lost a medal match at 3 different points in time, and there were no cultural differences in expression. These findings provide compelling evidence that the production of spontaneous facial expressions of emotion is not dependent on observational learning but simultaneously demonstrates a learned component to the social management of expressions, even among blind individuals. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Affective conflict and control may have important parallels to cognitive conflict and control, but these processes have been difficult to quantitatively study with emotionally naturalistic laboratory paradigms. The current study examines a modification of the AX-Continuous Performance Task (AX-CPT), a well-validated probe of cognitive conflict and control, for the study of emotional conflict. In the Emotional AX-CPT, speeded emotional facial expressions measured with electromyography (EMG) were used as the primary response modality and as an index of emotional conflict. Bottom-up emotional conflict occurred on trials in which precued facial expressions were incongruent with the valence of an emotionally evocative picture probe (e.g., smiling to a negative picture). A second form of top-down conflict occurred in which the facial expression and picture probe were congruent, but the opposite expression was expected based on the precue. A matched version of the task was also performed (in a separate group of participants) with affectively neutral probe stimuli. Behavioral interference was observed, in terms of response latencies and errors, on all conflict trials. However, bottom-up conflict was stronger in the emotional version of the task compared to the neutral version; top-down conflict was similar across the two versions. The results suggest that voluntary facial expressions may be more sensitive to indexing emotional than nonemotional conflict, and importantly, may provide an ecologically valid method of examining how emotional conflict may manifest in behavior and brain activity. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Individuals suffering from depression show diminished facial responses to positive stimuli. Recent cognitive research suggests that depressed individuals may appraise emotional stimuli differently than do nondepressed persons. Prior studies do not indicate whether depressed individuals respond differently when they encounter positive stimuli that are difficult to avoid. The authors investigated dynamic responses of individuals varying in both history of major depressive disorder (MDD) and current depressive symptomatology (N = 116) to robust positive stimuli. The Facial Action Coding System (Ekman & Friesen, 1978) was used to measure affect-related responses to a comedy clip. Participants reporting current depressive symptomatology were more likely to evince affect-related shifts in expression following the clip than were those without current symptomatology. This effect of current symptomatology emerged even when the contrast focused only on individuals with a history of MDD. Specifically, persons with current depressive symptomatology were more likely than those without current symptomatology to control their initial smiles with negative affect-related expressions. These findings suggest that integration of emotion science and social cognition may yield important advances for understanding depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together these results seem to suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
This article examines the importance of semantic processes in the recognition of emotional expressions, through a series of three studies on false recognition. The first study found a high frequency of false recognition of prototypical expressions of emotion when participants viewed slides and video clips of nonprototypical fearful and happy expressions. The second study tested whether semantic processes caused false recognition. The authors found that participants made significantly higher error rates when asked to detect expressions that corresponded to semantic labels than when asked to detect visual stimuli. Finally, given that previous research reported that false memories are less prevalent in younger children, the third study tested whether false recognition of prototypical expressions increased with age. The authors found that 67% of 8- to 9-year-old children reported nonpresent prototypical expressions of fear in a fearful context, but only 40% of 6- to 7-year-old children did so. Taken together, these three studies demonstrate the importance of semantic processes in the detection and categorization of prototypical emotional expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
We investigated adults' voluntary control of 20 facial action units theoretically associated with 6 basic emotions (happiness, fear, anger, surprise, sadness, and disgust). Twenty young adults were shown video excerpts of facial action units and asked to reproduce them as accurately as possible. Facial Action Coding System (FACS; Ekman & Friesen, 1978a) coding of the facial productions showed that young adults succeeded in activating 18 of the 20 target action units, although they often coactivated other action units. Voluntary control was clearly better for some action units than for others, with a pattern of differences between action units consistent with previous work in children and adolescents. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Researchers have documented that children of depressed mothers are at elevated risk for developing a depressive disorder themselves. There is currently little understanding, however, of what factors place these children at elevated risk. In the present study, the authors investigated whether never-disordered daughters whose mothers have experienced recurrent episodes of depression during their daughters' lifetime are characterized by biased processing of emotional information. Following a negative mood induction, participants completed an emotional-faces dot-probe task. Daughters at elevated risk for depression, but not control daughters of never-disordered mothers, selectively attended to negative facial expressions. In contrast, only control daughters selectively attended to positive facial expressions. These results provide support for cognitive vulnerability models of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
An extensive literature credits the right hemisphere with dominance for processing emotion. Conflicting literature finds left hemisphere dominance for positive emotions. This conflict may be resolved by attending to processing stage. A divided output (bimanual) reaction time paradigm in which response hand was varied for emotion (angry; happy) in Experiments 1 and 2 and for gender (male; female) in Experiment 3 focused on response to emotion rather than perception. In Experiments 1 and 2, reaction time was shorter when right-hand responses indicated a happy face and left-hand responses an angry face, as compared to reversed assignment. This dissociation did not obtain with incidental emotion (Experiment 3). Results support the view that response preparation to positive emotional stimuli is left lateralized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
This exploratory study investigates the effects of terrorism on children’s ability to recognize emotions. A sample of 101 exposed and 102 nonexposed children (mean age = 11 years), balanced for age and gender, were assessed 20 months after a terrorist attack in Beslan, Russia. Two trials controlled for children’s ability to match a facial emotional stimulus with an emotional label and their ability to match an emotional label with an emotional context. The experimental trial evaluated the relation between exposure to terrorism and children’s free labeling of mixed emotion facial stimuli created by morphing between 2 prototypical emotions. Repeated measures analyses of covariance revealed that exposed children correctly recognized pure emotions. Four log-linear models were performed to explore the association between exposure group and category of answer given in response to different mixed emotion facial stimuli. Model parameters indicated that, compared with nonexposed children, exposed children (a) labeled facial expressions containing anger and sadness significantly more often than expected as anger, and (b) produced fewer correct answers in response to stimuli containing sadness as a target emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
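Mixed-emotion stimuli of the kind used above are built by morphing between two prototype expressions. Real facial morphing also warps landmark geometry; the sketch below shows only the pixel-blending half of the idea, on hypothetical grayscale values, and is not the stimulus-generation pipeline of the study:

```python
# Linear cross-fade between two prototype images: a minimal sketch.

def morph(a, b, w):
    """Blend two equal-length grayscale pixel lists; w in [0, 1],
    where w = 0 returns image a and w = 1 returns image b."""
    return [(1 - w) * x + w * y for x, y in zip(a, b)]

# Hypothetical 3-pixel "prototypes", for illustration only
anger = [0.0, 0.2, 0.8]
sadness = [1.0, 0.6, 0.0]
mixed = morph(anger, sadness, 0.5)  # 50/50 anger-sadness blend
print([round(v, 2) for v in mixed])  # [0.5, 0.4, 0.4]
```

Varying `w` in steps would produce the continuum of anger-sadness blends to which children's free labels could then be compared.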
