Similar articles
20 similar articles found (search time: 109 ms)
1.
Gaze direction influences younger adults' perception of emotional expressions, with direct gaze enhancing the perception of anger and joy, while averted gaze enhances the perception of fear. Age-related declines in emotion recognition and eye-gaze processing have been reported, indicating that there may be age-related changes in the ability to integrate these facial cues. As there is evidence of a positivity bias with age, age-related difficulties integrating these cues may be greatest for negative emotions. The present research investigated age differences in the extent to which gaze direction influenced explicit perception (e.g., anger, fear and joy; Study 1) and social judgments (e.g., of approachability; Study 2) of emotion faces. Gaze direction did not influence the perception of fear in either age group. In both studies, age differences were found in the extent to which gaze direction influenced judgments of angry and joyful faces, with older adults showing less integration of gaze and emotion cues than younger adults. Age differences were greatest when interpreting angry expressions. Implications of these findings for older adults' social functioning are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
It has been suggested that despite explicit recognition difficulties, implicit processing of facial expressions may be preserved in older adulthood. To directly test this possibility, the authors used facial electromyography to assess older (N = 40) and young (N = 46) adults’ mimicry responses to angry and happy facial expressions, which were presented subliminally via a backward masking technique. The results indicated that despite not consciously perceiving the facial emotion stimuli, both groups mimicked the angry and happy facial expressions. Implications for emotion recognition difficulties in late adulthood are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
This study compared young and older adults’ ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and bodily expressions (3 of 6 emotions). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Investigated the degree to which 4–5 yr olds (n = 48) can enact expressions of emotion recognizable by peers and adults; the study also examined whether accuracy of recognition was a function of age and whether the expression was posed or spontaneous. Adults (n = 103) were much more accurate than children in recognizing neutral states, slightly more accurate in recognizing happiness and anger, and equally accurate in recognizing sadness. Children's spontaneous displays of happiness were more recognizable than posed displays, but for other emotions there was no difference between the recognizability of posed and spontaneous expressions. Children were highly accurate in identifying the facial expressions of happiness, sadness, and anger displayed by their peers. Sex and ethnicity of the child whose emotion was displayed interacted to influence only adults' recognizability of anger. Results are discussed in terms of the social learning and cognitive developmental factors influencing (a) adults' and children's decoding (recognition) of emotional expressions in young children and (b) encoding (posing) of emotional expressions by young children. (20 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Although positive and negative images enhance the visual processing of young adults, recent work suggests that a life-span shift in emotion processing goals may lead older adults to avoid negative images. To examine this tendency for older adults to regulate their intake of negative emotional information, the current study investigated age-related differences in the perceptual boost received by probes appearing over facial expressions of emotion. Visually-evoked event-related potentials were recorded from the scalp over cortical regions associated with visual processing as a probe appeared over facial expressions depicting anger, sadness, happiness, or no emotion. The activity of the visual system in response to each probe was operationalized in terms of the P1 component of the event-related potentials evoked by the probe. For young adults, the visual system was more active (i.e., greater P1 amplitude) when the probes appeared over any of the emotional facial expressions. However, for older adults, the visual system displayed reduced activity when the probe appeared over angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

8.
Experimental studies indicate that recognition of emotions, particularly negative emotions, decreases with age. However, there is no consensus at which age the decrease in emotion recognition begins, how selective this is to negative emotions, and whether this applies to both facial and vocal expression. In the current cross-sectional study, 607 participants ranging in age from 18 to 84 years (mean age = 32.6 ± 14.9 years) were asked to recognize emotions expressed either facially or vocally. In general, older participants were found to be less accurate at recognizing emotions, with the most distinctive age difference pertaining to a certain group of negative emotions. Both modalities revealed an age-related decline in the recognition of sadness and—to a lesser degree—anger, starting at about 30 years of age. Although age-related differences in the recognition of expression of emotion were not mediated by personality traits, 2 of the Big 5 traits, openness and conscientiousness, made an independent contribution to emotion-recognition performance. Implications of age-related differences in facial and vocal emotion expression and early onset of the selective decrease in emotion recognition are discussed in terms of previous findings and relevant theoretical models. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
Studied the development of the recognition of emotional facial expressions in children and of the factors influencing recognition accuracy. 80 elementary school students (aged 5–8 yrs) were asked to identify the emotions expressed in a series of facial photographs. Recognition performances were analyzed in relation to the type of emotion expressed (i.e., happiness, fear, anger, surprise, sadness, or disgust) and the intensity of the emotional expression. Age differences were determined. (English abstract) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
16 children were videotaped at 13 and 18 mo of age during the strange-situation procedure (M. D. Ainsworth et al., 1978). Facial expressions (interest, anger, sadness, and emotion blends) during the 2nd separation episode were coded using a system for identifying affect expressions by holistic judgments (Affex) developed by the 2nd author and colleagues (1980). Results show significant continuities in proportion of interest expressions, anger, emotion blends and frequency of expression changes. The major developmental change was seen in an age × emotion interaction, showing an increase in the use of facial expression blends or combinations from 13 to 18 mo. Results support the belief that patterns of emotion reflect early, persistent individual differences; they also reflect a developmental trend toward increasing complexity of emotional responses. (16 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
This investigation represents a multimodal study of age-related differences in experienced and expressed affect and in emotion regulatory skills in a sample of young, middle-aged, and older adults (N = 96), testing formulations derived from differential emotions theory. The experimental session consisted of a 10-min anger induction and a 10-min sadness induction using a relived emotion task; participants were also randomly assigned to an inhibition or noninhibition condition. In addition to subjective ratings of emotional experience provided by participants, their facial behavior was coded using an objective facial affect coding system; a content analysis also was applied to the emotion narratives. Separate repeated measures analyses of variance applied to each emotion domain indicated age differences in the co-occurrence of negative emotions and co-occurrence of positive and negative emotions across domains, thus extending the finding of emotion heterogeneity or complexity in emotion experience to facial behavior and verbal narratives. The authors also found that the inhibition condition resulted in a different pattern of results in the older versus middle-aged and younger adults. The intensity and frequency of discrete emotions were similar across age groups, with a few exceptions. Overall, the findings were generally consistent with differential emotions theory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
The ability to perceive and interpret facial expressions of emotion improves throughout childhood. Although newborns have rudimentary perceptive abilities allowing them to distinguish several facial expressions, it is only at the end of the first year that infants seem to be able to assign meaning to emotional signals. The meaning infants assign to facial expressions is very broad, as it is limited to the judgment of emotional valence. Meaning becomes more specific between the second and the third year of life, as children begin to categorize facial signals in terms of discrete emotions. While the facial expressions of happiness, anger and sadness are accurately categorized by the third year, the categorization of expressions of fear, surprise and disgust shows a much slower developmental pattern. Moreover, the ability to judge the sincerity of facial expressions shows a slower developmental pattern, probably because of the subtle differences between genuine and non-genuine expressions. The available evidence indicates that school age children can distinguish genuine smiles from masked smiles and false smiles. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
The authors administered social cognition tasks to younger and older adults to investigate age-related differences in social and emotional processing. Although slower, older adults were as accurate as younger adults in identifying the emotional valence (i.e., positive, negative, or neutral) of facial expressions. However, the age difference in reaction time was largest for negative faces. Older adults were significantly less accurate at identifying specific facial expressions of fear and sadness. No age differences specific to social function were found on tasks of self-reference, identifying emotional words, or theory of mind. Performance on the social tasks in older adults was independent of performance on general cognitive tasks (e.g., working memory) but was related to personality traits and emotional awareness. Older adults also showed more intercorrelations among the social tasks than did the younger adults. These findings suggest that age differences in social cognition are limited to the processing of facial emotion. Nevertheless, with age there appears to be increasing reliance on a common resource to perform social tasks, but one that is not shared with other cognitive domains. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Very few large-scale studies have focused on emotional facial expression recognition (FER) in 3-year-olds, an age of rapid social and language development. We studied FER in 808 healthy 3-year-olds using verbal and nonverbal computerized tasks for four basic emotions (happiness, sadness, anger, and fear). Three-year-olds showed differential performance on the verbal and nonverbal FER tasks, especially with respect to fear. That is to say, fear was one of the most accurately recognized facial expressions as matched nonverbally and the least accurately recognized facial expression as labeled verbally. Sex did not influence emotion-matching or emotion-labeling performance after adjusting for basic matching or labeling ability. Three-year-olds made systematic errors in emotion-labeling. Namely, happy expressions were often confused with fearful expressions, whereas negative expressions were often confused with other negative expressions. Together, these findings suggest that 3-year-olds' FER skills strongly depend on task specifications. Importantly, fear was the most sensitive facial expression in this regard. Finally, in line with previous studies, we found that recognized emotion categories are initially broad, including emotions of the same valence, as reflected in the nonrandom errors of 3-year-olds. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

15.
This study investigated the identification of facial expressions of emotion in currently nondepressed participants who had a history of recurrent depressive episodes (recurrent major depression; RMD) and never-depressed control participants (CTL). Following a negative mood induction, participants were presented with faces whose expressions slowly changed from neutral to full intensity. Identification of facial expressions was measured by the intensity of the expression at which participants could accurately identify whether faces expressed happiness, sadness, or anger. There were no group differences in the identification of sad or angry expressions. Compared with CTL participants, however, RMD participants required significantly greater emotional intensity in the faces to correctly identify happy expressions. These results indicate that biases in the processing of emotional facial expressions are evident even after individuals have recovered from a depressive episode. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Facial expressions of emotion are key cues to deceit (M. G. Frank & P. Ekman, 1997). Given that the literature on aging has shown an age-related decline in decoding emotions, we investigated (a) whether there are age differences in deceit detection and (b) if so, whether they are related to impairments in emotion recognition. Young and older adults (N = 364) were presented with 20 interviews (crime and opinion topics) and asked to decide whether each interview subject was lying or telling the truth. There were 3 presentation conditions: visual, audio, or audiovisual. In older adults, reduced emotion recognition was related to poor deceit detection in the visual condition for crime interviews only. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

18.
Identification of other people's emotion from quickly presented stimuli, including facial expressions, is fundamental to many social processes, including rapid mimicry and empathy. This study examined extraction of valence from brief emotional expressions in adults with autism spectrum disorder (ASD), a condition characterized by impairments in understanding and sharing of emotions. Control participants were individuals with reading disability and typical individuals. Participants were shown images for durations in the range of microexpressions (15 ms and 30 ms), thus reducing the reliance on higher level cognitive skills. Participants detected whether (a) emotional faces were happy or angry, (b) neutral faces were male or female, and (c) neutral images were animals or objects. Individuals with ASD performed selectively worse on emotion extraction, with no group differences for gender or animal–object tasks. The emotion extraction deficit remains even when controlling for gender, verbal ability, and age and is not accounted for by speed-accuracy tradeoffs. The deficit in rapid emotional processing may contribute to ASD difficulties in mimicry, empathy, and related processes. The results highlight the role of rapid early emotion processing in adaptive social–emotional functioning. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
In this study I used a temporal bisection task to test whether greater overestimation of time due to negative emotion is moderated by individual differences in negative emotionality. The effects of fearful facial expressions on time perception were also examined. After a training phase, participants estimated the duration of facial expressions (anger, happiness, fearfulness) and a neutral-baseline facial expression. Consistent with the operation of an arousal-based process, the duration of angry expressions was consistently overestimated relative to other expressions and the baseline condition. In support of a role for individual differences in negative emotionality on time perception, temporal bias due to angry and fearful expressions was positively correlated with individual differences in self-reported negative emotionality. The results are discussed in relation both to the literature on attentional bias to facial expressions in anxiety and fearfulness and to the hypothesis that angry expressions evoke a fear-specific response. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Reports an error in "Affect bursts: Dynamic patterns of facial expression" by Eva G. Krumhuber and Klaus R. Scherer (Emotion, 2011, np). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum. (The following abstract of the original article appeared in record 2011-12872-001.) Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
