Similar Documents
20 similar documents found (search time: 31 ms)
1.
We investigated facial recognition memory (for previously unfamiliar faces) and facial expression perception with functional magnetic resonance imaging (fMRI). Eight healthy, right-handed volunteers participated. For the facial recognition task, subjects made a decision as to the familiarity of each of 50 faces (25 previously viewed; 25 novel). We detected signal increase in the right middle temporal gyrus and left prefrontal cortex during presentation of familiar faces, and in several brain regions, including bilateral posterior cingulate gyri, bilateral insulae and right middle occipital cortex during presentation of unfamiliar faces. Standard facial expressions of emotion were used as stimuli in two further tasks of facial expression perception. In the first task, subjects were presented with alternating happy and neutral faces; in the second task, subjects were presented with alternating sad and neutral faces. During presentation of happy facial expressions, we detected a signal increase predominantly in the left anterior cingulate gyrus, bilateral posterior cingulate gyri, medial frontal cortex and right supramarginal gyrus, brain regions previously implicated in visuospatial and emotion processing tasks. No brain regions showed increased signal intensity during presentation of sad facial expressions. These results provide evidence for a distinction between the neural correlates of facial recognition memory and perception of facial expression but, whilst highlighting the role of limbic structures in perception of happy facial expressions, do not allow the mapping of a distinct neural substrate for perception of sad facial expressions.

2.
Theories of embodied cognition hold that higher cognitive processes operate on perceptual symbols and that concept use involves partial reactivations of the sensory-motor states that occur during experience with the world. On this view, the processing of emotion knowledge involves a (partial) reexperience of an emotion, but only when access to the sensory basis of emotion knowledge is required by the task. In 2 experiments, participants judged emotional and neutral concepts corresponding to concrete objects (Experiment 1) and abstract states (Experiment 2) while facial electromyographic activity was recorded from the cheek, brow, eye, and nose regions. Results of both studies show embodiment of specific emotions in an emotion-focused but not a perceptual-focused processing task on the same words. A follow-up in Experiment 3, which blocked selective facial expressions, suggests a causal, rather than simply a correlational, role for embodiment in emotion word processing. Experiment 4, using a property generation task, provided support for the conclusion that emotions embodied in conceptual tasks are context-dependent situated simulations rather than associated emotional reactions. Implications for theories of embodied simulation and for emotion theories are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
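A minimal sketch of the kind of facial EMG summary this abstract implies: mean baseline-corrected activity per recording site and word condition, where site-specific increases (for example, cheek activity for joy-related words and brow activity for anger-related words) would index embodied simulation. The site names, conditions, and values are illustrative assumptions, not the study's recordings or analysis pipeline.

```python
from collections import defaultdict
from statistics import mean

def emg_change_by_condition(samples):
    """samples: list of (site, condition, baseline_uV, response_uV) tuples."""
    grouped = defaultdict(list)
    for site, condition, baseline, response in samples:
        # baseline-corrected change in EMG amplitude for this trial
        grouped[(site, condition)].append(response - baseline)
    return {key: mean(changes) for key, changes in grouped.items()}

# Hypothetical trials: (site, word condition, baseline microvolts, response microvolts)
samples = [
    ("cheek", "joy word",   2.1, 3.4), ("cheek", "anger word", 2.0, 2.1),
    ("brow",  "joy word",   1.8, 1.7), ("brow",  "anger word", 1.9, 3.0),
]
print(emg_change_by_condition(samples))  # site-specific increases by emotion condition
```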

3.
Extant research suggests that targets' emotion expressions automatically evoke similar affect in perceivers. The authors hypothesized that the automatic impact of emotion expressions depends on group membership. In Experiments 1 and 2, an affective priming paradigm was used to measure immediate and preconscious affective responses to same-race or other-race emotion expressions. In Experiment 3, spontaneous vocal affect was measured as participants described the emotions of an ingroup or outgroup sports team fan. In these experiments, immediate and spontaneous affective responses depended on whether the emotional target was ingroup or outgroup. Positive responses to fear expressions and negative responses to joy expressions were observed in outgroup perceivers, relative to ingroup perceivers. In Experiments 4 and 5, discrete emotional responses were examined. In a lexical decision task (Experiment 4), facial expressions of joy elicited fear in outgroup perceivers, relative to ingroup perceivers. In contrast, facial expressions of fear elicited less fear in outgroup than in ingroup perceivers. In Experiment 5, felt dominance mediated emotional responses to ingroup and outgroup vocal emotion. These data support a signal-value model in which emotion expressions signal environmental conditions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Two incompatible pictures compete for perceptual dominance when they are presented to one eye each. This so-called binocular rivalry results in an alternation of dominant and suppressed percepts. In accordance with current theories of emotion processing, the authors' previous research has suggested that emotionally arousing pictures predominate in this perceptual process. Three experiments were run with pictures of emotional facial expressions that are known to induce emotions while being well controlled in terms of physical characteristics. In Experiment 1, photographs of emotional and neutral facial expressions of the same actor were presented to minimize physical differences. In Experiment 2, schematic emotional expressions were presented to further eliminate low-level differences. In Experiment 3, a probe-detection task was conducted to control for possible response-biases. Together, these data clearly demonstrate that emotional facial expressions predominate over neutral expressions; they are more often the first percept and they are perceived for longer durations. This is not caused by physical stimulus properties or by response-biases. This novel approach supports the view that emotionally significant visual stimuli are preferentially perceived. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
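A minimal sketch of the two predominance measures described above for binocular rivalry: how often each stimulus category is the first reported percept, and its share of total dominance time. The epoch format and numbers are illustrative assumptions, not the study's data.

```python
from collections import defaultdict

def rivalry_predominance(epochs):
    """epochs: per-trial lists of (percept_label, dominance_duration_s) in report order."""
    first_counts = defaultdict(int)      # how often each category is the first percept
    total_duration = defaultdict(float)  # cumulative dominance time per category
    for trial in epochs:
        first_counts[trial[0][0]] += 1
        for label, duration in trial:
            total_duration[label] += duration
    grand_total = sum(total_duration.values())
    duration_share = {label: t / grand_total for label, t in total_duration.items()}
    return dict(first_counts), duration_share

# Hypothetical dominance reports from two rivalry trials
epochs = [
    [("emotional", 2.4), ("neutral", 1.1), ("emotional", 1.9)],
    [("neutral", 1.3), ("emotional", 2.7)],
]
print(rivalry_predominance(epochs))  # emotional faces: more first percepts, larger time share
```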

5.
Research has largely neglected the effects of gaze direction cues on the perception of facial expressions of emotion. It was hypothesized that when gaze direction matches the underlying behavioral intent (approach-avoidance) communicated by an emotional expression, the perception of that emotion would be enhanced (i.e., shared signal hypothesis). Specifically, the authors expected that (a) direct gaze would enhance the perception of approach-oriented emotions (anger and joy) and (b) averted eye gaze would enhance the perception of avoidance-oriented emotions (fear and sadness). Three studies supported this hypothesis. Study 1 examined emotional trait attributions made to neutral faces. Study 2 examined ratings of ambiguous facial blends of anger and fear. Study 3 examined the influence of gaze on the perception of highly prototypical expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
While humans are adept at recognizing emotional states conveyed by facial expressions, the current literature suggests that they lack accurate metacognitions about their performance in this domain. This finding comes from global trait-based questionnaires that assess the extent to which an individual perceives him or herself as empathic, as compared to other people. Those who rate themselves as empathically accurate are no better than others at recognizing emotions. Metacognition of emotion recognition can also be assessed using relative measures that evaluate how well a person thinks s/he has understood the emotion in a particular facial display as compared to other displays. While this is the most common method of metacognitive assessment of people's judgments of learning or their feelings of knowing, this kind of metacognition—“relative meta-accuracy”—has not been studied within the domain of emotion. As well as asking for global metacognitive judgments, we asked people to provide relative, trial-by-trial prospective and retrospective judgments concerning whether they would be right or wrong in recognizing the expressions conveyed in particular facial displays. Our question was: Do people know when they will be correct in knowing what expression is conveyed, and do they know when they do not know? Although we, like others, found that global meta-accuracy was unpredictive of performance, relative meta-accuracy, given by the correlation between participants' trial-by-trial metacognitive judgments and performance on each item, was highly accurate both on the Mind in the Eyes task (Experiment 1) and on the Ekman Emotional Expression Multimorph task (Experiment 2). (PsycINFO Database Record (c) 2011 APA, all rights reserved)
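A minimal sketch of how relative meta-accuracy could be computed as a within-participant association between trial-by-trial judgments and accuracy. A Goodman-Kruskal gamma is a common choice for this kind of judgment-accuracy data, but the abstract does not specify the statistic, so the function and data below are assumptions for illustration.

```python
from itertools import combinations

def goodman_kruskal_gamma(judgments, accuracy):
    """Gamma = (concordant - discordant) / (concordant + discordant) over item pairs."""
    concordant = discordant = 0
    for (j1, a1), (j2, a2) in combinations(zip(judgments, accuracy), 2):
        product = (j1 - j2) * (a1 - a2)
        if product > 0:
            concordant += 1
        elif product < 0:
            discordant += 1
        # pairs tied on either variable are ignored
    if concordant + discordant == 0:
        return float("nan")
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical data: prospective confidence ratings (1-4) and per-item accuracy (0/1)
confidence = [4, 2, 3, 1, 4, 2]
correct = [1, 0, 1, 0, 1, 1]
print(goodman_kruskal_gamma(confidence, correct))  # positive values indicate accurate metacognition
```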

7.
Difficulties in understanding emotional signals might have important implications for social interactions in old age. In this study we investigated emotion perception skills involved in decoding facial expressions of emotion in healthy older adults, compared with those with Alzheimer’s disease (AD) or late-life mood disorder (MD). Although those with MD were mildly impaired in identifying emotional expressions, this was not caused by negative biases in choosing labels. Emotion decoding performance in AD was much more impaired, particularly when relatively subtle expressions were presented. Difficulties in choosing between labels to describe an emotional face were predicted by executive dysfunction, whereas impaired ability to match 2 emotional faces was related to general difficulties with face perception. Across all 3 groups, problems with emotion perception predicted quality of life independently of variance predicted by cognitive function and mood, indicating the potential importance of emotion decoding skills in the well-being of older adults. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Three experiments tested the hypothesis that explaining emotional expressions using specific emotion concepts at encoding biases perceptual memory for those expressions. In Experiment 1, participants viewed faces expressing blends of happiness and anger and created explanations of why the target people were expressing one of the two emotions, according to concepts provided by the experimenter. Later, participants attempted to identify the facial expressions in computer movies, in which the previously seen faces changed continuously from anger to happiness. Faces conceptualized in terms of anger were remembered as angrier than the same faces conceptualized in terms of happiness, regardless of whether the explanations were told aloud or imagined. Experiments 2 and 3 showed that explanation is necessary for the conceptual biases to emerge fully and extended the finding to anger-sad expressions, an emotion blend more common in real life. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
Emotion processing deficits may have an important effect on the quality of life of Alzheimer's disease (AD) patients and their families, yet there are few studies in this area and little is known about the cause of such deficits in AD. This study sought to determine if some AD patients have a disruption in a specific right hemisphere emotion processing system, and to determine if the processing of emotional facial expression is more vulnerable to the pathology of AD than is the perception of emotional prosody. It was specifically hypothesized that patients with greater right hemisphere dysfunction (low spatial AD patients) would be impaired on emotion processing tasks relative to those with predominantly left hemisphere dysfunction (low verbal AD patients). Both groups showed impairment on emotion processing tasks but for different reasons. The low verbal patients performed poorly on the affect processing measures because they had difficulty comprehending and/or remembering the task instructions. In contrast, low spatial AD patients have emotion processing deficits that are independent of language and/or memory and may be due to a more general visuoperceptual deficit that affects the perception of static but not dynamic affective stimuli.

10.
The contributions to the recognition of emotional signals of (a) experience and learning versus (b) internal predispositions are difficult to investigate because children are virtually always exposed to complex emotional experiences from birth. The recognition of emotion among physically abused and physically neglected preschoolers was assessed in order to examine the effects of atypical experience on emotional development. In Experiment 1, children matched a facial expression to an emotional situation. Neglected children had more difficulty discriminating emotional expressions than did control or physically abused children. Physically abused children displayed a response bias for angry facial expressions. In Experiment 2, children rated the similarity of facial expressions. Control children viewed discrete emotions as dissimilar, neglected children saw fewer distinctions between emotions, and physically abused children showed the most variance across emotions. These results suggest that to the extent that children's experience with the world varies, so too will their interpretation and understanding of emotional signals. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
We recently reported that patients who had received unilateral temporal lobectomy, including the amygdala and hippocampus, show impaired acquisition in a fear conditioning task (LaBar, LeDoux, Spencer, & Phelps, 1995), indicating a deficit in emotional memory. In the present paper, we examined performance of these patients on two verbal, emotional memory tasks in an effort to determine the extent of this deficit. In Experiment 1, subjects were asked to recall emotional and non-emotional words. In Experiment 2, subjects were asked to recall neutral words which were embedded in emotional and non-emotional sentence contexts. Both temporal lobectomy subjects and normal controls showed enhanced recall for emotional words (Experiment 1) and enhanced recall for neutral words embedded in emotional sentence contexts (Experiment 2). These results suggest that the deficit seen in emotional memory following unilateral temporal lobectomy is not a global deficit and may be limited to specific circumstances where emotion influences memory performance. Several hypotheses concerning the discrepancy between the present studies and the fear conditioning results (LaBar et al., 1995) are discussed.

12.
This research examined the relationship between individual differences in working memory capacity and the self-regulation of emotional expression and emotional experience. Four studies revealed that people higher in working memory capacity suppressed expressions of negative emotion (Study 1) and positive emotion (Study 2) better than did people lower in working memory capacity. Furthermore, compared to people lower in working memory capacity, people higher in capacity more capably appraised emotional stimuli in an unemotional manner and thereby experienced (Studies 3 and 4) and expressed (Study 4) less emotion in response to those stimuli. These findings indicate that cognitive ability contributes to the control of emotional responding. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
This research tested the hypothesis that initial efforts at executive control temporarily undermine subsequent efforts at executive control. Four experiments revealed that controlling the focus of visual attention (Experiment 1), inhibiting predominant writing tendencies (Experiment 2), taking a working memory test (Experiment 3), or exaggerating emotional expressions (Experiment 4) undermined performance on subsequent tests of working memory span, reverse digit span, and response inhibition, respectively. The results supported a limited resource model of executive control and cast doubt on competing accounts based on mood, motivation, or task difficulty. Prior efforts at executive control are a significant contextual determinant of the operation of executive processes. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
The amygdala is thought to play a crucial role in emotional and social behaviour. Animal studies implicate the amygdala in both fear conditioning and face perception. In humans, lesions of the amygdala can lead to selective deficits in the recognition of fearful facial expressions and impaired fear conditioning, and direct electrical stimulation evokes fearful emotional responses. Here we report direct in vivo evidence of a differential neural response in the human amygdala to facial expressions of fear and happiness. Positron-emission tomography (PET) measures of neural activity were acquired while subjects viewed photographs of fearful or happy faces, varying systematically in emotional intensity. The neuronal response in the left amygdala was significantly greater to fearful as opposed to happy expressions. Furthermore, this response showed a significant interaction with the intensity of emotion (increasing with increasing fearfulness, decreasing with increasing happiness). The findings provide direct evidence that the human amygdala is engaged in processing the emotional salience of faces, with a specificity of response to fearful facial expressions.
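A minimal sketch of the intensity effect described above: the slope of regional activity against emotional intensity, fitted separately for fearful and happy faces (positive for fear, negative for happiness in the reported result). A plain least-squares slope stands in for the study's parametric analysis; the numbers are illustrative assumptions, not PET measurements.

```python
def least_squares_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    numerator = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    denominator = sum((xi - mean_x) ** 2 for xi in x)
    return numerator / denominator

# Hypothetical morph levels and regional responses (arbitrary units)
intensity = [0.25, 0.5, 0.75, 1.0]
fear_signal = [1.0, 1.4, 1.9, 2.3]
happy_signal = [1.2, 1.0, 0.8, 0.7]

print(least_squares_slope(intensity, fear_signal))   # positive slope: response grows with fearfulness
print(least_squares_slope(intensity, happy_signal))  # negative slope: response falls with happiness
```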

15.
Previous studies showing that schizophrenic patients have a deficit in the ability to perceive facial expressions of emotion in others often have not used a differential deficit design and standardized measures of emotion perception. Using standardized and cross-validated measures in a differential deficit design, S. L. Kerr and J. M. Neale (see record 1993-29687-001) found no evidence for a deficit specific to emotion perception among unmedicated schizophrenic patients. The present study replicated and extended the findings of Kerr and Neale in a sample of medicated schizophrenic patients. Results showed that medicated patients performed more poorly than controls overall; however, they performed no worse on facial emotion perception tasks than on a matched control task. These findings support Kerr and Neale's conclusion that schizophrenic patients do not have a differential deficit in facial emotion perception ability. Future research should examine the nature of schizophrenic patients' generalized poor performance on tests of facial emotion perception. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Although the basal ganglia have been shown to be critical for the expression of emotion in prosody and facial expressions, it is unclear whether they are also critical for recognition of emotions. Selective pathology of parts of the basal ganglia is a hallmark of individuals with Parkinson's disease, and such patients have been examined in several studies of emotion. We examined 18 patients with Parkinson's disease (11 men, 7 women) and 13 age-, education-, gender ratio-, and IQ-matched normal controls on their ability to recognize emotions signaled by facial expressions. Parkinson's patients performed entirely normally on a quantitative task of recognizing emotional facial expressions. The findings do not support the notion that the sectors of basal ganglia that are dysfunctional in Parkinson's disease are essential for recognizing emotion in facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
The authors administered social cognition tasks to younger and older adults to investigate age-related differences in social and emotional processing. Although slower, older adults were as accurate as younger adults in identifying the emotional valence (i.e., positive, negative, or neutral) of facial expressions. However, the age difference in reaction time was largest for negative faces. Older adults were significantly less accurate at identifying specific facial expressions of fear and sadness. No age differences specific to social function were found on tasks of self-reference, identifying emotional words, or theory of mind. Performance on the social tasks in older adults was independent of performance on general cognitive tasks (e.g., working memory) but was related to personality traits and emotional awareness. Older adults also showed more intercorrelations among the social tasks than did the younger adults. These findings suggest that age differences in social cognition are limited to the processing of facial emotion. Nevertheless, with age there appears to be increasing reliance on a common resource to perform social tasks, but one that is not shared with other cognitive domains. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
In this study I used a temporal bisection task to test whether greater overestimation of time due to negative emotion is moderated by individual differences in negative emotionality. The effects of fearful facial expressions on time perception were also examined. After a training phase, participants estimated the duration of facial expressions (anger, happiness, fearfulness) and a neutral-baseline facial expression. In accordance with the operation of an arousal-based process, the duration of angry expressions was consistently overestimated relative to other expressions and the baseline condition. In support of a role for individual differences in negative emotionality on time perception, temporal bias due to angry and fearful expressions was positively correlated with individual differences in self-reported negative emotionality. The results are discussed in relation both to the literature on attentional bias to facial expressions in anxiety and fearfulness and to the hypothesis that angry expressions evoke a fear-specific response. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
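A minimal sketch of a temporal bisection summary consistent with this design: probe durations between a short and a long anchor are classified by the participant, and overestimation of duration appears as a higher proportion of "long" responses for a given expression. Condition labels and trial values are illustrative assumptions, not the study's materials.

```python
from collections import defaultdict

def proportion_long(trials):
    """trials: list of (condition, probe_duration_ms, responded_long: bool)."""
    counts = defaultdict(lambda: [0, 0])  # condition -> [long responses, total trials]
    for condition, _duration, responded_long in trials:
        counts[condition][0] += int(responded_long)
        counts[condition][1] += 1
    return {condition: long / total for condition, (long, total) in counts.items()}

# Hypothetical bisection trials with angry and neutral facial expressions
trials = [
    ("angry",   600, True), ("angry",   600, True),  ("angry",   400, False),
    ("neutral", 600, True), ("neutral", 600, False), ("neutral", 400, False),
]
# A higher p(long) for angry than neutral faces would indicate temporal overestimation.
print(proportion_long(trials))
```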

19.
The interaction between emotion and working memory maintenance, load, and performance has been investigated with mixed results. The effect of emotion on specific executive processes such as interference resolution, however, remains relatively unexplored. In this series of studies, we examine how emotion affects interference resolution processes within working memory by modifying the recency-probes paradigm (Monsell, 1978) to include emotional and neutral stimuli. Reaction time differences were compared between interference and non-interference trials for neutral and emotional words (Studies 1 & 3) and pictures (Study 2). Our results indicate that trials using emotional stimuli show a relative decrease in interference compared with trials using neutral stimuli, suggesting facilitation of interference resolution in the former. Furthermore, both valence and arousal seem to interact to produce this facilitation effect. These findings suggest that emotion facilitates response selection amid interference in working memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
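A minimal sketch of the interference measure implied here: the reaction-time cost of interference trials relative to non-interference trials, computed separately for emotional and neutral stimuli; a smaller cost for emotional stimuli would indicate facilitated interference resolution. Field names and values are illustrative assumptions, not the studies' data.

```python
from statistics import mean

def interference_cost(trials, stimulus_type):
    """Mean RT on interference trials minus mean RT on non-interference trials."""
    interfering = [t["rt_ms"] for t in trials
                   if t["stimulus"] == stimulus_type and t["interference"]]
    clean = [t["rt_ms"] for t in trials
             if t["stimulus"] == stimulus_type and not t["interference"]]
    return mean(interfering) - mean(clean)

# Hypothetical recency-probe trials
trials = [
    {"stimulus": "neutral",   "interference": True,  "rt_ms": 720},
    {"stimulus": "neutral",   "interference": False, "rt_ms": 640},
    {"stimulus": "emotional", "interference": True,  "rt_ms": 680},
    {"stimulus": "emotional", "interference": False, "rt_ms": 645},
]
print(interference_cost(trials, "neutral"))    # larger interference cost
print(interference_cost(trials, "emotional"))  # smaller cost suggests facilitation
```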

20.
It has been proposed that self-face representations are involved in interpreting facial emotions of others. We experimentally primed participants' self-face representations. In Study 1, we assessed eye tracking patterns and performance on a facial emotion discrimination task, and in Study 2, we assessed emotion ratings between self and nonself groups. Results show that experimental priming of self-face representations increases visual exploration of faces, facilitates the speed of facial expression processing, and increases the emotional distance between expressions. These findings suggest that the ability to interpret facial expressions of others is intimately associated with the representations we have of our own faces. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
