Similar Articles
A total of 20 similar articles were retrieved.
1.
Efficient navigation of our social world depends on the generation, interpretation, and combination of social signals within different sensory systems. However, the influence of healthy adult aging on multisensory integration of emotional stimuli remains poorly explored. This article comprises 2 studies that directly address age differences in cross-modal emotional matching and explicit identification. The first study compared 25 younger adults (19–40 years) and 25 older adults (60–80 years) on their ability to match cross-modal congruent and incongruent emotional stimuli. The second study examined the performance of 20 younger adults (19–40 years) and 20 older adults (60–80 years) on explicit emotion identification when information was presented congruently in faces and voices, or in faces or voices alone. In Study 1, older adults performed as well as younger adults on tasks in which congruent auditory and visual emotional information was presented concurrently, but there were age-related differences in matching incongruent cross-modal information. Results from Study 2 indicated that although older adults were impaired at identifying emotions from a single modality (faces or voices alone), they benefited from congruent multisensory information: age differences were eliminated. The findings are discussed in relation to social, emotional, and cognitive changes with age. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
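The abstract does not specify how the response biases were controlled. One standard correction in this literature, offered here purely as an illustration rather than as this study's method, is the unbiased hit rate (Wagner, 1993), which discounts accuracy for a category by how often that category is used as a response:

$$H_u = \frac{A^2}{B \cdot C}$$

where, for a given emotion, $A$ is the number of correct identifications, $B$ is the number of stimuli presenting that emotion, and $C$ is the total number of times the participant chose that emotion label. A participant who over-uses "happiness" as a response thus gains no spurious credit on the happiness category.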

3.
Individuals with borderline personality disorder (BPD) have been hypothesized to exhibit significant problems associated with emotional sensitivity. The current study examined emotional sensitivity (i.e., low threshold for recognition of emotional stimuli) in BPD by comparing 20 individuals with BPD and 20 normal controls on their accuracy in identifying emotional expressions. Results demonstrated that, as facial expressions morphed from neutral to maximum intensity, participants with BPD correctly identified facial affect at an earlier stage than did healthy controls. Participants with BPD were more sensitive than healthy controls in identifying emotional expressions in general, regardless of valence. These findings could not be explained by participants with BPD responding faster with more errors. Overall, results appear to support the contention that heightened emotional sensitivity may be a core feature of BPD. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
This study investigates the discrimination accuracy of emotional stimuli in subjects with major depression compared with healthy controls, using photographs of facial expressions of varying emotional intensities. The sample included 88 unmedicated male and female subjects, aged 18–56 years, with major depressive disorder (n = 44) or no psychiatric illness (n = 44), who judged the emotion of 200 facial pictures displaying an expression between 10% (90% neutral) and 80% (20% neutral) emotion. Stimuli were presented in 10% increments to generate a range of intensities, each presented for a 500-ms duration. Compared with healthy volunteers, depressed subjects showed very good recognition accuracy for sad faces but impaired recognition accuracy for other emotions (e.g., harsh and surprised expressions) of subtle emotional intensity. Recognition accuracy improved for both groups as a function of increased intensity for all emotions. Finally, as depressive symptoms increased, recognition accuracy increased for sad faces but decreased for surprised faces. Overall, depressed subjects showed an impaired ability to accurately identify subtle facial expressions, indicating that depressive symptoms influence the accuracy of emotion recognition. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
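As a reading aid for the intensity percentages above: the abstract does not state the morphing algorithm, but on the standard linear-blend convention an expression at intensity $I$ is a weighted mixture of an emotional prototype and a neutral face,

$$F(I) = I \cdot F_{\text{emotion}} + (1 - I) \cdot F_{\text{neutral}}, \qquad I \in \{0.10, 0.20, \ldots, 0.80\},$$

so the 10% increments yield 8 intensity levels per emotion, and 10% emotion corresponds to a 90% neutral blend.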

5.
Three experiments tested the hypothesis that explaining emotional expressions using specific emotion concepts at encoding biases perceptual memory for those expressions. In Experiment 1, participants viewed faces expressing blends of happiness and anger and created explanations of why the target people were expressing one of the two emotions, according to concepts provided by the experimenter. Later, participants attempted to identify the facial expressions in computer movies, in which the previously seen faces changed continuously from anger to happiness. Faces conceptualized in terms of anger were remembered as angrier than the same faces conceptualized in terms of happiness, regardless of whether the explanations were told aloud or imagined. Experiments 2 and 3 showed that explanation is necessary for the conceptual biases to emerge fully and extended the finding to anger-sad expressions, an emotion blend more common in real life. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Two studies tested the hypothesis that, in judging people's emotions from their facial expressions, Japanese incorporate information from the social context more than Westerners do. In Study 1, participants viewed cartoons depicting a happy, sad, angry, or neutral person surrounded by other people expressing the same emotion as the central person or a different one. The surrounding people's emotions influenced Japanese, but not Western, participants' perceptions of the central person. These differences reflect differences in attention, as indicated by eye-tracking data (Study 2): Japanese participants looked at the surrounding people more than Westerners did. Previous findings on East-West differences in contextual sensitivity generalize to social contexts, suggesting that Westerners see emotions as individual feelings, whereas Japanese see them as inseparable from the feelings of the group. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Facial expression is heralded as a communication system common to all human populations, and thus is generally accepted as a biologically based, universal behavior. Happiness, sadness, fear, anger, surprise, and disgust are universally recognized and produced emotions, and communication of these states is deemed essential in order to navigate the social environment. It is puzzling, however, how individuals are capable of producing similar facial expressions when facial musculature is known to vary greatly among individuals. Here, the authors show that although some facial muscles are not present in all individuals, and often exhibit great asymmetry (larger or absent on one side), the facial muscles that are essential in order to produce the universal facial expressions exhibited 100% occurrence and showed minimal gross asymmetry in 18 cadavers. This explains how universal facial expression production is achieved, implies that facial muscles have been selected for essential nonverbal communicative function, and yet also accommodate individual variation. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Two studies provide evidence for the role of cultural familiarity in recognizing facial expressions of emotion. For Chinese located in China and the United States, Chinese Americans, and non-Asian Americans, accuracy and speed in judging Chinese and American emotions were greater with greater participant exposure to the group posing the expressions. Likewise, Tibetans residing in China and Africans residing in the United States were faster and more accurate when judging emotions expressed by host versus nonhost society members. These effects extended across generations of Chinese Americans, seemingly independent of ethnic or biological ties. Results suggest that the universal affect system governing emotional expression may be characterized by subtle differences in style across cultures, which become more familiar with greater cultural contact. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
How the processing of emotional expression is influenced by perceived gaze remains a debated issue. Discrepancies between previous results may stem from differences in the nature of the stimuli and task characteristics. Here we used a highly controlled set of computer-generated animated faces combining dynamic emotional expressions of varying intensity with gaze shifts either directed at or averted from the observer. We predicted that the perceived self-relevance of fearful faces would be higher with averted gaze (signaling a nearby danger), whereas direct gaze would be more relevant for angry faces (signaling aggressiveness). This interaction pattern was observed behaviorally for emotion intensity ratings, and neurally for functional magnetic resonance imaging activation in the amygdala as well as the fusiform and medial prefrontal cortices, but only for mild-intensity, not high-intensity, expressions. These results support an involvement of the human amygdala in the appraisal of self-relevance and reveal a crucial role of expression intensity in emotion and gaze interactions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
The role of embodiment in the perception of the duration of emotional stimuli was investigated with a temporal bisection task. Previous research has shown that individuals overestimate the duration of emotional, compared with neutral, faces (S. Droit-Volet, S. Brunot, & P. M. Niedenthal, 2004). The authors tested a role for embodiment in this effect. Participants estimated the duration of angry, happy, and neutral faces by comparing them to 2 durations learned during a training phase. Experimental participants held a pen in their mouths so as to inhibit imitation of the faces, whereas control participants could imitate freely. Results revealed that participants overestimated the duration of emotional faces relative to the neutral faces only when imitation was possible. Implications for the role of embodiment in emotional perception are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
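For readers unfamiliar with the temporal bisection method, a sketch of the standard procedure (the notation is introduced here, not taken from the abstract): participants first learn a short anchor duration $S$ and a long anchor duration $L$, then classify probe durations $t$, with $S \le t \le L$, as closer to "short" or "long". The bisection point $BP$ is the probe duration judged "long" on half of the trials,

$$P(\text{"long"} \mid t = BP) = 0.5,$$

and overestimating the duration of emotional faces appears as a shift of $BP$ toward shorter physical durations for emotional relative to neutral stimuli.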

11.
This study examined how a major life stressor—the transition to parenthood—affects marital satisfaction and functioning among persons with different attachment orientations. As hypothesized, the interaction between women's degree of attachment ambivalence and their perceptions of spousal support (assessed 6 weeks prior to childbirth) predicted systematic changes in men's and women's marital satisfaction and related factors over time (6 months postpartum). Specifically, if highly ambivalent (preoccupied) women entered parenthood perceiving lower levels of support from their husbands, they experienced declines in marital satisfaction. Women's ambivalence also predicted their own as well as their husbands' marital satisfaction and functioning concurrently. The degree of attachment avoidance did not significantly predict marital changes, although women's avoidance did correlate with some of the concurrent marital measures. These findings are discussed in terms of attachment theory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Infants' responsiveness to others' affective expressions was investigated in the context of a peekaboo game. Forty 4-month-olds participated in a peekaboo game in which the typical happy/surprised expression was systematically replaced with a different emotion, depending on group assignment. Infants viewed three typical peekaboo trials followed by a change (anger, fear, or sadness) or no-change (happiness/surprise) trial, repeated over two blocks. Infants' looking time and affective responsiveness were measured. Results revealed differential patterns of visual attention and affective responsiveness to each emotion. These results underscore the importance of contextual information for facilitating recognition of emotion expressions as well as the efficacy of using converging measures to assess such understanding. Infants as young as 4 months appear to discriminate and respond in meaningful ways to others' emotion expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Despite the fact that facial expressions of emotion have signal value, there is surprisingly little research examining how that signal can be detected under various conditions, because most judgment studies use full-face, frontal views. We remedy this by obtaining judgments of frontal and profile views of the same expressions displayed by the same expressors. We predicted that recognition accuracy when viewing faces in profile would be lower than when judging the same faces from the front. Contrary to this prediction, there were no differences in recognition accuracy as a function of view, suggesting that emotions are judged equally well regardless of the angle from which they are viewed. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
There is evidence that specific regions of the face, such as the eyes, are particularly relevant for decoding emotional expressions, but it has not been examined whether observers' scan paths vary across facial expressions with different emotional content. In this study, eye tracking was used to monitor the scanning behavior of healthy participants while they looked at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. For sad facial expressions in particular, participants directed the initial fixation to the eyes more frequently than for all other expressions. For happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that the eyes and mouth are equally important, whereas for sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that facial expressions with different emotional content are not all decoded equally. Our data suggest that people look at the regions that are most characteristic of each emotion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
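The dominance ratio above is defined only informally (eyes and mouth relative to the rest of the face). One plausible formalization, offered as an assumption rather than as the authors' exact measure, is a per-region ratio of fixation durations,

$$DR_r = \frac{t_r}{t_{\text{rest of face}}}, \qquad r \in \{\text{eyes}, \text{mouth}\},$$

where $t_r$ is the total fixation time on region $r$. Comparing $DR_{\text{eyes}}$ with $DR_{\text{mouth}}$ then captures the reported pattern: roughly equal ratios for fearful and neutral faces, and $DR_{\text{eyes}} > DR_{\text{mouth}}$ for sad and angry faces.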

15.
Attachment researchers claim that individual differences in how adults talk about their early memories reflect qualitatively distinct organizations of emotion regarding childhood experiences with caregivers. Testing this assumption, the present study examined the relationship between attachment dimensions and physiological, facial-expressive, and self-reported emotional responses during the Adult Attachment Interview (AAI). Consistent with theoretical predictions, more prototypically secure adults behaviorally expressed and reported experiencing emotion consistent with the valence of the childhood events they described. Insecure adults also showed distinctive and theoretically anticipated forms of emotional response: Dismissing participants evidenced increased electrodermal activity during the interview, a sign of emotional suppression, whereas preoccupied adults showed reliable discrepancies between the valence of their inferred childhood experiences and their facial-expressive as well as reported emotion during the AAI. Results substantiate the case that the AAI reflects individual differences in emotion regulation that conceptually parallel observations of attachment relationships in infancy. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
The willingness to trust and self-disclose to others, key aspects in the decision to seek psychotherapy, is expected to vary across attachment classifications. The current study examined the association between internal working models of attachment and history of psychotherapy in a middle-class sample of 120 women, who were administered the Adult Attachment Interview (C. George, N. Kaplan, & M. Main, 1985/1996) and the Mental Health Survey (S. A. Riggs & D. Jacobvitz, 2002). Findings supported predictions that security of attachment is linked to history of psychotherapy. Specifically, adults classified as Dismissing were less likely than other adults to report a history of psychotherapy, whereas Secure adults reported the highest rates of couples therapy. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
In 2 studies, the authors developed and validated a new set of standardized emotion expressions, which they refer to as the University of California, Davis, Set of Emotion Expressions (UCDSEE). The precise components of each expression were verified using the Facial Action Coding System (FACS). The UCDSEE is the first FACS-verified set to include the 3 “self-conscious” emotions known to have recognizable expressions (embarrassment, pride, and shame), as well as the 6 previously established “basic” emotions (anger, disgust, fear, happiness, sadness, and surprise), all posed by the same 4 expressers (African and White males and females). This new set has numerous potential applications in future research on emotion and related topics. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
The authors previously reported that normal subjects are better at discriminating happy from neutral faces when the happy face is located to the viewer's right of the neutral face; conversely, discrimination of sad from neutral faces is better when the sad face is shown to the left, supporting a role for the left hemisphere in processing positive valence and for the right hemisphere in processing negative valence. Here, the authors extend this same task to subjects with unilateral cerebral damage (31 right, 28 left). Subjects with right damage performed worse when discriminating sad faces shown on the left, consistent with the prior findings. However, subjects with either left or right damage actually performed superior to normal controls when discriminating happy faces shown on the left. The authors suggest that perception of negative valence relies preferentially on the right hemisphere, whereas perception of positive valence relies on both left and right hemispheres. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Prior studies provide consistent evidence of deficits for psychopaths in processing verbal emotional material but are inconsistent regarding nonverbal emotional material. To examine whether psychopaths exhibit general versus specific deficits in nonverbal emotional processing, 34 psychopaths and 33 nonpsychopaths, identified with the Psychopathy Checklist-Revised (R. D. Hare, 1991), were asked to complete a facial affect recognition test. Slides of prototypic facial expressions were presented. Three hypotheses regarding hemispheric lateralization anomalies in psychopaths were also tested (right-hemisphere dysfunction, reduced lateralization, and reversed lateralization). Psychopaths were less accurate than nonpsychopaths at classifying facial affect under conditions promoting reliance on right-hemisphere resources and displayed a specific deficit in classifying disgust. These findings demonstrate that psychopaths exhibit specific deficits in nonverbal emotional processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
In this study, I used a temporal bisection task to test whether greater overestimation of time due to negative emotion is moderated by individual differences in negative emotionality. The effects of fearful facial expressions on time perception were also examined. After a training phase, participants estimated the duration of facial expressions (anger, happiness, fearfulness) and a neutral-baseline facial expression. Consistent with the operation of an arousal-based process, the duration of angry expressions was consistently overestimated relative to the other expressions and the baseline condition. In support of a role for individual differences in negative emotionality in time perception, temporal bias due to angry and fearful expressions was positively correlated with self-reported negative emotionality. The results are discussed in relation to the literature on attentional bias to facial expressions in anxiety and fearfulness, and to the hypothesis that angry expressions evoke a fear-specific response. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
