Similar Documents
20 similar documents found (search time: 15 ms)
1.
Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders which affect both working memory and emotion perception are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension. However, these studies have not considered the role of head orientation, which is known to play a key role in the processing of gaze direction. In a series of experiments, the relationship between the processing of expression and gaze was tested both with head orientation held constant and with head orientation varied between trials, making it a relevant source of information for computing gaze direction. Results show that when head orientation varied between trials, the processing of facial expression was unaffected by irrelevant variations in gaze direction, and conversely, gaze direction could be processed without interference from irrelevant variations in expression. These findings suggest that the processing of gaze and the processing of expression are not functionally interconnected as was previously assumed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Gaze direction influences younger adults' perception of emotional expressions, with direct gaze enhancing the perception of anger and joy, while averted gaze enhances the perception of fear. Age-related declines in emotion recognition and eye-gaze processing have been reported, indicating that there may be age-related changes in the ability to integrate these facial cues. As there is evidence of a positivity bias with age, age-related difficulties integrating these cues may be greatest for negative emotions. The present research investigated age differences in the extent to which gaze direction influenced explicit perception (e.g., anger, fear and joy; Study 1) and social judgments (e.g., of approachability; Study 2) of emotion faces. Gaze direction did not influence the perception of fear in either age group. In both studies, age differences were found in the extent to which gaze direction influenced judgments of angry and joyful faces, with older adults showing less integration of gaze and emotion cues than younger adults. Age differences were greatest when interpreting angry expressions. Implications of these findings for older adults' social functioning are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Empirical evidence shows an effect of gaze direction on cueing spatial attention, regardless of the emotional expression shown by a face, whereas a combined effect of gaze direction and facial expression has been observed on individuals' evaluative judgments. In 2 experiments, the authors investigated whether gaze direction and facial expression affect spatial attention depending upon the presence of an evaluative goal. Disgusted, fearful, happy, or neutral faces gazing left or right were followed by positive or negative target words presented either at the spatial location looked at by the face or at the opposite spatial location. Participants responded to target words based on affective valence (i.e., positive/negative) in Experiment 1 and on letter case (lowercase/uppercase) in Experiment 2. Results showed that participants responded much faster to targets presented at the spatial location looked at by disgusted or fearful faces but only in Experiment 1, when an evaluative task was used. The present findings clearly show that negative facial expressions enhance the attentional shifts due to eye-gaze direction, provided an explicit evaluative goal is present. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
While humans are adept at recognizing emotional states conveyed by facial expressions, the current literature suggests that they lack accurate metacognitions about their performance in this domain. This finding comes from global trait-based questionnaires that assess the extent to which an individual perceives him or herself as empathic, as compared to other people. Those who rate themselves as empathically accurate are no better than others at recognizing emotions. Metacognition of emotion recognition can also be assessed using relative measures that evaluate how well a person thinks s/he has understood the emotion in a particular facial display as compared to other displays. While this is the most common method of metacognitive assessment of people's judgments of learning or their feelings of knowing, this kind of metacognition—“relative meta-accuracy”—has not been studied within the domain of emotion. As well as asking for global metacognitive judgments, we asked people to provide relative, trial-by-trial prospective and retrospective judgments concerning whether they would be right or wrong in recognizing the expressions conveyed in particular facial displays. Our question was: Do people know when they will be correct in knowing what expression is conveyed, and do they know when they do not know? Although we, like others, found that global meta-accuracy was unpredictive of performance, relative meta-accuracy, given by the correlation between participants' trial-by-trial metacognitive judgments and performance on each item, was highly accurate both on the Mind in the Eyes task (Experiment 1) and on the Ekman Emotional Expression Multimorph task (Experiment 2). (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
Theories of embodied cognition hold that higher cognitive processes operate on perceptual symbols and that concept use involves partial reactivations of the sensory-motor states that occur during experience with the world. On this view, the processing of emotion knowledge involves a (partial) reexperience of an emotion, but only when access to the sensory basis of emotion knowledge is required by the task. In 2 experiments, participants judged emotional and neutral concepts corresponding to concrete objects (Experiment 1) and abstract states (Experiment 2) while facial electromyographic activity was recorded from the cheek, brow, eye, and nose regions. Results of both studies show embodiment of specific emotions in an emotion-focused but not a perceptual-focused processing task on the same words. A follow-up in Experiment 3, which blocked selective facial expressions, suggests a causal, rather than simply a correlational, role for embodiment in emotion word processing. Experiment 4, using a property generation task, provided support for the conclusion that emotions embodied in conceptual tasks are context-dependent situated simulations rather than associated emotional reactions. Implications for theories of embodied simulation and for emotion theories are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Anticipation of others' actions is of paramount importance in social interactions. Cues such as gaze direction and facial expressions can be informative, but can also produce ambiguity with respect to others' intentions. We investigated the combined effect of an actor's gaze and expression on judgments made by observers about the end-point of the actor's head rotation toward the observer. Expressions of approach gave rise to an unambiguous intention to move toward the observer, while expressions of avoidance gave rise to an ambiguous behavioral intention (as the expression and motion cues were in conflict). In the ambiguous condition, observers overestimated how far the actor's head had rotated when the actor's gaze was directed ahead of head rotation (compared to congruent or lagging behind). In the unambiguous condition the estimations were not influenced by the gaze manipulation. These results show that social cue integration does not follow simple additive rules, and suggest that the involuntary allocation of attention to another's gaze depends on the perceived ambiguity of the agent's behavioral intentions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

8.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

9.
The direction of another person's gaze is difficult to ignore when presented at the center of attention. In 6 experiments, perception of unattended gaze was investigated. Participants made directional (left-right) judgments to gazing-face or pointing-hand targets, which were accompanied by a distractor face or hand. Processing of the distractor was assessed via congruency effects on target response times. Congruency effects were found from the direction of distractor hands but not from the direction of distractor gazes (Experiment 1). This pattern persisted even when distractor sizes were increased to compensate for their peripheral presentation (Experiments 2 and 5). In contrast, congruency effects were exerted by profile heads (Experiments 3 and 4). In Experiment 6, isolated eye region distractors produced no congruency effects, even when they were presented near the target. These results suggest that, unlike other facial information, gaze direction cannot be perceived outside the focus of attention. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Three experiments evaluated whether facial expression can modulate the allocation of focused attention. Identification of emotionally expressive target faces was typically faster when they were flanked by identical (compatible) faces compared with when they were flanked by different (incompatible) faces. This flanker compatibility effect was significantly smaller when target faces expressed negative compared with positive emotion (see Experiment 1A); however, when the faces were altered to disrupt emotional expression, yet retain feature differences, equal flanker compatibility effects were observed (see Experiment 1B). The flanker compatibility effect was also found to be smaller for negative target faces compared with neutral target faces, and for both negative and neutral target faces compared with positive target faces (see Experiment 2). These results suggest that the constriction of attention is influenced by facial expressions of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
In 6 experiments, the authors investigated whether attention orienting by gaze direction is modulated by the emotional expression (neutral, happy, angry, or fearful) on the face. The results showed a clear spatial cuing effect by gaze direction but no effect by facial expression. In addition, it was shown that the cuing effect was stronger with schematic faces than with real faces, that gaze cuing could be achieved at very short stimulus onset asynchronies (14 ms), and that there was no evidence for a difference in the strength of cuing triggered by static gaze cues and by cues involving apparent motion of the pupils. In sum, the results suggest that in normal, healthy adults, eye direction processing for attention shifts is independent of facial expression analysis. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
This study compared young and older adults' ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Impaired facial expression recognition has been associated with features of major depression, which could underlie some of the difficulties in social interactions in these patients. Patients with major depressive disorder and age- and gender-matched healthy volunteers judged the emotion of 100 facial stimuli displaying different intensities of sadness and happiness and neutral expressions presented for short (100 ms) and long (2,000 ms) durations. Compared with healthy volunteers, depressed patients demonstrated subtle impairments in discrimination accuracy and a predominant bias against identifying mildly happy expressions as happy. The authors suggest that, in depressed patients, the inability to accurately identify subtle changes in facial expression displayed by others in social situations may underlie the impaired interpersonal functioning. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
How the processing of emotional expression is influenced by perceived gaze remains a debated issue. Discrepancies between previous results may stem from differences in the nature of stimuli and task characteristics. Here we used a highly controlled set of computer-generated animated faces combining dynamic emotional expressions with varying intensity, and gaze shifts either directed at or averted from the observer. We predicted that perceived self-relevance of fearful faces would be higher with averted gaze—signaling a nearby danger; whereas conversely, direct gaze would be more relevant for angry faces—signaling aggressiveness. This interaction pattern was observed behaviorally for emotion intensity ratings, and neurally for functional magnetic resonance imaging activation in amygdala, as well as fusiform and medial prefrontal cortices, but only for mild- and not high-intensity expressions. These results support an involvement of human amygdala in the appraisal of self-relevance and reveal a crucial role of expression intensity in emotion and gaze interactions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Two experiments examined the effects of encoding operations on forced-choice recognition memory for upright and inverted photographs of faces. In Experiment 1, with distractors closely matched to targets, performance was better on upright than on inverted faces, but was unaffected by whether subjects judged faces for distinctive features, distinctive traits or distinctive expressions. In Experiment 2, where distractors were either absent or weakly matched to targets, accuracy was again higher on upright than on inverted faces, and was similar for the three encoding operations on upright faces. In contrast, it was poorer for distinctive expression judgments than for distinctive feature or for distinctive trait judgments on inverted faces. These results support Winograd's (1981) claim that distinctive feature and distinctive trait judgments both lead to the isolation of distinctive features. However, it was argued that distinctive expression judgments led to configural processing that was disrupted by inversion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Studied the extent to which communication channel affects judgments of the type and authenticity of emotions. 80 university students (mean age 21.5 yrs) were presented with short audio, video, and audiovideo excerpts of actors expressing specific emotions. In some cases, the emotion was actually experienced by the actor; in other cases, the emotion was simulated. Ss were distributed over 8 communication channel conditions (i.e., facial, audio, filtered audio, gestural + facial, facial + filtered audio, facial + audio, gestural + facial + filtered audio, and gestural + facial + audio) and asked to judge the emotional category (i.e., happiness, fear, anger, surprise, and sadness) and the authenticity of the emotion. The accuracy of the judgments was analyzed in relation to the type of channel and the type of emotion. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Three experiments were conducted to assess the automaticity of event frequency processing. Using a modified concept-learning task, Experiment 1 showed that intentional frequency processing led to more accurate frequency judgments than incidental processing. Experiment 2 demonstrated that nonspecific (general memory) instructions in incidental processing conditions can actually lead to subjects' intentional processing of frequency information, which undermines the effectiveness of an intentionality manipulation. And, in Experiment 3, frequency processing accuracy was found to be interfered with by concurrent cover task capacity requirements, even though frequency processing occurred incidentally. The findings that frequency judgment is influenced by intentionality and by concurrent task factors clearly violate two of Hasher and Zacks' (1979, 1984) empirical criteria used to define automatic processing; they also challenge the assumption that automatic processing is always optimal. In light of our and others' data, either event frequency, the prototypical automatic process, is not automatic, or the assumption that a process must be optimal if it is to be considered automatic must be dropped. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Objective: Individuals with schizophrenia have difficulty interpreting social and emotional cues such as facial expression, gaze direction, body position, and voice intonation. Nonverbal cues are powerful social signals but are often processed implicitly, outside the focus of attention. The aim of this research was to assess implicit processing of social cues in individuals with schizophrenia. Method: Patients with schizophrenia or schizoaffective disorder and matched controls performed a primary task of word classification with social cues in the background. Participants were asked to classify target words (LEFT/RIGHT) by pressing a key that corresponded to the word, in the context of facial expressions with eye gaze averted to the left or right. Results: Although facial expression and gaze direction were irrelevant to the task, these facial cues influenced word classification performance. Participants were slower to classify target words (e.g., LEFT) that were incongruent to gaze direction (e.g., eyes averted to the right) compared to target words (e.g., LEFT) that were congruent to gaze direction (e.g., eyes averted to the left), but this only occurred for expressions of fear. This pattern did not differ for patients and controls. Conclusion: The results showed that threat-related signals capture the attention of individuals with schizophrenia. These data suggest that implicit processing of eye gaze and fearful expressions is intact in schizophrenia. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号