Similar Articles
A total of 20 similar articles were found.
1.
Efficient navigation of our social world depends on the generation, interpretation, and combination of social signals within different sensory systems. However, the influence of healthy adult aging on multisensory integration of emotional stimuli remains poorly explored. This article comprises 2 studies that directly address issues of age differences in cross-modal emotional matching and explicit identification. The first study compared 25 younger adults (19–40 years) and 25 older adults (60–80 years) on their ability to match cross-modal congruent and incongruent emotional stimuli. The second study looked at performance of 20 younger (19–40) and 20 older adults (60–80) on explicit emotion identification when information was presented congruently in faces and voices or only in faces or in voices. In Study 1, older adults performed as well as younger adults on tasks in which congruent auditory and visual emotional information was presented concurrently, but there were age-related differences in matching incongruent cross-modal information. Results from Study 2 indicated that though older adults were impaired at identifying emotions from 1 modality (faces or voices alone), they benefited from congruent multisensory information as age differences were eliminated. The findings are discussed in relation to social, emotional, and cognitive changes with age. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Facial expressions of emotion are key cues to deceit (M. G. Frank & P. Ekman, 1997). Given that the literature on aging has shown an age-related decline in decoding emotions, we investigated (a) whether there are age differences in deceit detection and (b) if so, whether they are related to impairments in emotion recognition. Young and older adults (N = 364) were presented with 20 interviews (crime and opinion topics) and asked to decide whether each interview subject was lying or telling the truth. There were 3 presentation conditions: visual, audio, or audiovisual. In older adults, reduced emotion recognition was related to poor deceit detection in the visual condition for crime interviews only. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Three experiments evaluated whether facial expression can modulate the allocation of focused attention. Identification of emotionally expressive target faces was typically faster when they were flanked by identical (compatible) faces compared with when they were flanked by different (incompatible) faces. This flanker compatibility effect was significantly smaller when target faces expressed negative compared with positive emotion (see Experiment 1A); however, when the faces were altered to disrupt emotional expression, yet retain feature differences, equal flanker compatibility effects were observed (see Experiment 1B). The flanker compatibility effect was also found to be smaller for negative target faces compared with neutral target faces, and for both negative and neutral target faces compared with positive target faces (see Experiment 2). These results suggest that the constriction of attention is influenced by facial expressions of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Although positive and negative images enhance the visual processing of young adults, recent work suggests that a life-span shift in emotion processing goals may lead older adults to avoid negative images. To examine this tendency for older adults to regulate their intake of negative emotional information, the current study investigated age-related differences in the perceptual boost received by probes appearing over facial expressions of emotion. Visually evoked event-related potentials were recorded from the scalp over cortical regions associated with visual processing as a probe appeared over facial expressions depicting anger, sadness, happiness, or no emotion. The activity of the visual system in response to each probe was operationalized in terms of the P1 component of the event-related potentials evoked by the probe. For young adults, the visual system was more active (i.e., greater P1 amplitude) when the probes appeared over any of the emotional facial expressions. However, for older adults, the visual system displayed reduced activity when the probe appeared over angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
Threatening, friendly, and neutral faces were presented to test the hypothesis of the facilitated perceptual processing of threatening faces. Dense sensor event-related brain potentials were measured while subjects viewed facial stimuli. Subjects had no explicit task for emotional categorization of the faces. Assessing early perceptual stimulus processing, threatening faces elicited an early posterior negativity compared with nonthreatening neutral or friendly expressions. Moreover, at later stages of stimulus processing, facial threat also elicited augmented late positive potentials relative to the other facial expressions, indicating the more elaborate perceptual analysis of these stimuli. Taken together, these data demonstrate the facilitated perceptual processing of threatening faces. Results are discussed within the context of an evolved module of fear (A. Ohman & S. Mineka, 2001). (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Social functioning deficits have long been a defining feature in schizophrenia, but relatively little research has examined how emotion responsivity influences functional outcome in this disorder. The goal of the current study was to begin to elucidate the relationships between emotion responsivity, social cognition, and functional outcome in schizophrenia. Participants were 40 outpatients diagnosed with schizophrenia or schizoaffective disorder according to the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; American Psychiatric Association, 1994) and 40 controls. Each participant completed measures of emotion responsivity, social cognition (both emotion and social perception), and functional outcome. Individuals with schizophrenia demonstrated somewhat reduced emotion responsivity for positive and negative stimuli, as well as deficits in both social cognition and functional outcome, in comparison with controls. Additionally, results indicated that both social perception and emotion responsivity were positively correlated with functional outcome. Importantly, the relationship of emotion responsivity to functional outcome was not mediated by social perception and remained significant independent of social cognition. This finding suggests that emotion responsivity is an important factor in understanding functional outcome in schizophrenia. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Traditionally, the development of responsible behavior has been a primary aim of American education. Responsible behavior entails self-motivation and self-guidance, and not obedience and compliance to rules merely in response to external supervision, rewards, and punishment. External factors certainly play a major role in responsible behavior, but so too do social cognition and emotion. The purpose of this article is to present a brief review of research linking social cognition and emotion to responsible behavior. Implications for school psychologists are discussed, with a particular emphasis on the importance of developing and implementing prevention and intervention programs that address the multiple components of responsible behavior. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

9.
In 2 experiments, the authors tested predictions from cognitive models of social anxiety regarding attentional biases for social and nonsocial cues by monitoring eye movements to pictures of faces and objects in high social anxiety (HSA) and low social anxiety (LSA) individuals. Under no-stress conditions (Experiment 1), HSA individuals initially directed their gaze toward neutral faces, relative to objects, more often than did LSA participants. However, under social-evaluative stress (Experiment 2), HSA individuals showed reduced biases in initial orienting and maintenance of gaze on faces (cf. objects) compared with the LSA group. HSA individuals were also relatively quicker to look at emotional faces than neutral faces but looked at emotional faces for less time, compared with LSA individuals, consistent with a vigilant-avoidant pattern of bias. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Seven experiments investigated the finding that threatening schematic faces are detected more quickly than nonthreatening faces. Threatening faces with v-shaped eyebrows (angry and scheming expressions) were detected more quickly than nonthreatening faces with A-shaped eyebrows (happy and sad expressions). In contrast to the hypothesis that these effects were due to perceptual features unrelated to the face, no advantage was found for v-shaped eyebrows presented in a nonfacelike object. Furthermore, the addition of internal facial features (the eyes, or the nose and mouth) was necessary to produce the detection advantage for faces with v-shaped eyebrows. Overall, the results are interpreted as showing that the v-shaped eyebrow configuration affords easy detection, but only when other internal facial features are present. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
The current study examined age differences in the intensity of emotions experienced during social interactions. Because emotions are felt most intensely in situations central to motivational goals, age differences in emotional intensity may exist in social situations that meet the goals for one age group more than the other. Guided by theories of emotional intensity and socioemotional selectivity, it was hypothesized that social partner type would elicit different affective responses by age. Younger (n = 71) and older (n = 71) adults recalled experiences of positive and negative emotions with new friends, established friends, and family members from the prior week. Compared with younger adults, older adults reported lower intensity positive emotions with new friends, similarly intense positive emotions with established friends, and higher intensity positive emotions with family members. Older adults reported lower intensity negative emotions for all social partners than did younger adults, but this difference was most pronounced for interactions with new friends. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
13.
Evidence for A. J. Fridlund's (e.g., 1994) "behavioral ecology view" of human facial expression comes primarily from studies of smiling in response to positive emotional stimuli. Smiling may be a special case because it clearly can, and often does, serve merely communicative functions. The present study was designed (a) to assess the generalizability of social context effects to facial expressions in response to negative emotional stimuli and (b) to examine whether these effects are mediated by social motives, as suggested by the behavioral ecology view. Pairs of friends or strangers viewed film clips that elicited different degrees of sad affect, in either the same or a different room; a control group participated alone. Dependent variables included facial activity, subjective emotion, and social motives. Displays of sadness were influenced by stimulus intensity and were lower in all social conditions than in the alone condition. Unexpectedly, social context effects were also found for smiling. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Previous choice reaction time studies have provided consistent evidence for faster recognition of positive (e.g., happy) than negative (e.g., disgusted) facial expressions. A predominance of positive emotions in normal contexts may partly explain this effect. The present study used pleasant and unpleasant odors to test whether emotional context affects the happy face advantage. Results from 2 experiments indicated that happiness was recognized faster than disgust in a pleasant context, but this advantage disappeared in an unpleasant context because of the slow recognition of happy faces. Odors may modulate the functioning of those emotion-related brain structures that participate in the formation of the perceptual representations of the facial expressions and in the generation of the conceptual knowledge associated with the signaled emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Four experiments tested the hypothesis that concerns about infidelity would lead people, particularly those displaying high chronic levels of romantic jealousy, to display a functionally coordinated set of implicit cognitive biases aimed at vigilantly processing attractive romantic rivals. Priming concerns about infidelity led people with high levels of chronic jealousy (but not those low in chronic jealousy) to attend vigilantly to physically attractive same-sex targets at an early stage of visual processing (Study 1), to strongly encode and remember attractive same-sex targets (Study 2), and to form implicit negative evaluations of attractive same-sex targets (Studies 3 and 4). In each case, effects were observed only for same-sex targets who were physically attractive, individuals who can pose especially potent threats to a person’s own romantic interests. These studies reveal a cascade of implicit, lower order cognitive processes underlying romantic rivalry and identify the individuals most likely to display those processes. At a broader conceptual level, this research illustrates the utility of integrating social cognitive and evolutionary approaches to psychological science. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Young (n = 59) and older adults viewed videos in which the same individual committed a faux pas, or acted appropriately, toward his coworkers. Older participants did not discriminate appropriate from inappropriate behaviors as well as young participants did. Older participants also scored lower than young participants on an extensive battery of emotion recognition tests, and emotion performance fully mediated age differences in faux pas discrimination. The results provide further evidence for the role of emotion perception in a range of important social deficits. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

18.
In this study I used a temporal bisection task to test whether greater overestimation of time due to negative emotion is moderated by individual differences in negative emotionality. The effects of fearful facial expressions on time perception were also examined. After a training phase, participants estimated the duration of facial expressions (anger, happiness, fearfulness) and a neutral-baseline facial expression. In accordance with the operation of an arousal-based process, the duration of angry expressions was consistently overestimated relative to other expressions and the baseline condition. In support of a role for individual differences in negative emotionality on time perception, temporal bias due to angry and fearful expressions was positively correlated with individual differences in self-reported negative emotionality. The results are discussed in relation both to the literature on attentional bias to facial expressions in anxiety and fearfulness and to the hypothesis that angry expressions evoke a fear-specific response. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
The authors used connectionist modeling to extend previous research on emotion overgeneralization effects. Study 1 demonstrated that neutral expression male faces objectively resemble angry expressions more than female faces do, female faces objectively resemble surprise expressions more than male faces do, White faces objectively resemble angry expressions more than Black or Korean faces do, and Black faces objectively resemble happy and surprise expressions more than White faces do. Study 2 demonstrated that objective resemblance to emotion expressions influences trait impressions even when statistically controlling possible confounding influences of attractiveness and babyfaceness. It further demonstrated that emotion overgeneralization is moderated by face race and that racial differences in emotion resemblance contribute to White perceivers’ stereotypes of Blacks and Asians. These results suggest that intergroup relations may be strained not only by cultural stereotypes but also by adaptive responses to emotion expressions that are overgeneralized to groups whose faces subtly resemble particular emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
Two studies examined whether appraisals can be differentially affected by subliminal anger and sadness primes. Participants from Singapore (Experiment 1) and China (Experiment 2) were exposed to either subliminal angry faces or subliminal sad faces. Supporting appraisal theories of emotions, participants exposed to subliminal angry faces were more likely to appraise negative events as caused by other people and those exposed to subliminal sad faces were more likely to appraise the same events as caused by situational factors. The results provide the first evidence for subliminal emotion-specific cognitive effects. They show that cognitive functions such as appraisals can be affected by subliminal emotional stimuli of the same valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

