Found 20 similar records (search time: 15 ms)
1.
Hunter Edyta Monika; Phillips Louise H.; MacPherson Sarah E. 《Canadian Metallurgical Quarterly》2010,25(4):779
Efficient navigation of our social world depends on the generation, interpretation, and combination of social signals within different sensory systems. However, the influence of healthy adult aging on multisensory integration of emotional stimuli remains poorly explored. This article comprises 2 studies that directly address issues of age differences on cross-modal emotional matching and explicit identification. The first study compared 25 younger adults (19–40 years) and 25 older adults (60–80 years) on their ability to match cross-modal congruent and incongruent emotional stimuli. The second study looked at performance of 20 younger (19–40) and 20 older adults (60–80) on explicit emotion identification when information was presented congruently in faces and voices or only in faces or in voices. In Study 1, older adults performed as well as younger adults on tasks in which congruent auditory and visual emotional information were presented concurrently, but there were age-related differences in matching incongruent cross-modal information. Results from Study 2 indicated that though older adults were impaired at identifying emotions from 1 modality (faces or voices alone), they benefited from congruent multisensory information as age differences were eliminated. The findings are discussed in relation to social, emotional, and cognitive changes with age. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
2.
Appraisal theories of emotion propose that the emotions people experience correspond to their appraisals of their situation. In other words, individual differences in emotional experiences reflect differing interpretations of the situation. We hypothesized that in similar situations, people in individualist and collectivist cultures experience different emotions because of culturally divergent causal attributions for success and failure (i.e., agency appraisals). In a test of this hypothesis, American and Japanese participants recalled a personal experience (Study 1) or imagined themselves to be in a situation (Study 2) in which they succeeded or failed, and then reported their agency appraisals and emotions. Supporting our hypothesis, cultural differences in emotions corresponded to differences in attributions. For example, in success situations, Americans reported stronger self-agency emotions (e.g., proud) than did Japanese, whereas Japanese reported a stronger situation-agency emotion (lucky). Also, cultural differences in attribution and emotion were largely explained by differences in self-enhancing motivation. When Japanese and Americans were induced to make the same attribution (Study 2), cultural differences in emotions became either nonsignificant or were markedly reduced. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
3.
Facial expressions of emotion are key cues to deceit (M. G. Frank & P. Ekman, 1997). Given that the literature on aging has shown an age-related decline in decoding emotions, we investigated (a) whether there are age differences in deceit detection and (b) if so, whether they are related to impairments in emotion recognition. Young and older adults (N = 364) were presented with 20 interviews (crime and opinion topics) and asked to decide whether each interview subject was lying or telling the truth. There were 3 presentation conditions: visual, audio, or audiovisual. In older adults, reduced emotion recognition was related to poor deceit detection in the visual condition for crime interviews only. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
4.
Isaacowitz Derek M.; Löckenhoff Corinna E.; Lane Richard D.; Wright Ron; Sechrest Lee; Riedel Robert; Costa Paul T. 《Canadian Metallurgical Quarterly》2007,22(1):147
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
5.
Two studies provide evidence for the role of cultural familiarity in recognizing facial expressions of emotion. For Chinese located in China and the United States, Chinese Americans, and non-Asian Americans, accuracy and speed in judging Chinese and American emotions was greater with greater participant exposure to the group posing the expressions. Likewise, Tibetans residing in China and Africans residing in the United States were faster and more accurate when judging emotions expressed by host versus nonhost society members. These effects extended across generations of Chinese Americans, seemingly independent of ethnic or biological ties. Results suggest that the universal affect system governing emotional expression may be characterized by subtle differences in style across cultures, which become more familiar with greater cultural contact. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
6.
The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion–identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
7.
Niedenthal Paula M.; Brauer Markus; Robin Lucy; Innes-Ker Åse H. 《Canadian Metallurgical Quarterly》2002,82(3):419
Adult attachment orientation has been associated with specific patterns of emotion regulation. The present research examined the effects of attachment orientation on the perceptual processing of emotional stimuli. Experimental participants played computerized movies of faces that expressed happiness, sadness, and anger. Over the course of the movies, the facial expressions became neutral. Participants reported the frame at which the initial expression no longer appeared on the face. Under conditions of no distress (Study 1), fearfully attached individuals saw the offset of both happiness and anger earlier, and preoccupied and dismissive individuals later, than the securely attached individuals. Under conditions of distress (Study 2), insecurely attached individuals perceived the offset of negative facial expressions as occurring later than did the secure individuals, and fearfully attached individuals saw the offset later than either of the other insecure groups. The mechanisms underlying the effects are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
8.
Davis F. Caroline; Somerville Leah H.; Ruberry Erika J.; Berry Andrew B. L.; Shin Lisa M.; Whalen Paul J. 《Canadian Metallurgical Quarterly》2011,11(3):647
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
9.
Despite the fact that facial expressions of emotion have signal value, there is surprisingly little research examining how that signal can be detected under various conditions, because most judgment studies utilize full-face, frontal views. We remedy this by obtaining judgments of frontal and profile views of the same expressions displayed by the same expressors. We predicted that recognition accuracy when viewing faces in profile would be lower than when judging the same faces from the front. Contrary to this prediction, there were no differences in recognition accuracy as a function of view, suggesting that emotions are judged equally well regardless of the angle from which they are viewed. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
10.
Gaze direction influences younger adults' perception of emotional expressions, with direct gaze enhancing the perception of anger and joy, while averted gaze enhances the perception of fear. Age-related declines in emotion recognition and eye-gaze processing have been reported, indicating that there may be age-related changes in the ability to integrate these facial cues. As there is evidence of a positivity bias with age, age-related difficulties integrating these cues may be greatest for negative emotions. The present research investigated age differences in the extent to which gaze direction influenced explicit perception (e.g., anger, fear and joy; Study 1) and social judgments (e.g., of approachability; Study 2) of emotion faces. Gaze direction did not influence the perception of fear in either age group. In both studies, age differences were found in the extent to which gaze direction influenced judgments of angry and joyful faces, with older adults showing less integration of gaze and emotion cues than younger adults. Age differences were greatest when interpreting angry expressions. Implications of these findings for older adults' social functioning are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
11.
Fugate Jennifer M. B.; Gouzoules Harold; Barrett Lisa Feldman 《Canadian Metallurgical Quarterly》2010,10(4):544
Categorical perception (CP) occurs when continuously varying stimuli are perceived as belonging to discrete categories. As a result, perceivers are more accurate at discriminating between stimuli of different categories than between stimuli within the same category (Harnad, 1987; Goldstone, 1994). The current experiments investigated whether the structural information in the face is sufficient for CP to occur. Alternatively, a perceiver's conceptual knowledge, by virtue of expertise or verbal labeling, might contribute. In two experiments, people who differed in their conceptual knowledge (in the form of expertise, Experiment 1; or verbal label learning, Experiment 2) categorized chimpanzee facial expressions. Expertise alone did not facilitate CP. Only when perceivers first explicitly learned facial expression categories with a label were they more likely to show CP. Overall, the results suggest that the structural information in the face alone is often insufficient for CP; CP is facilitated by verbal labeling. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
12.
The ability to allocate attention to emotional cues in the environment is an important feature of adaptive self-regulation. Existing data suggest that physically abused children overattend to angry expressions, but the attentional mechanisms underlying such behavior are unknown. The authors tested 8- to 11-year-old physically abused children to determine whether they displayed specific information-processing problems in a selective attention paradigm using emotional faces as cues. Physically abused children demonstrated delayed disengagement when angry faces served as invalid cues. Abused children also demonstrated increased attentional benefits on valid angry trials. Results are discussed in terms of the influence of early adverse experience on children's selective attention to threat-related signals as a mechanism in the development of psychopathology. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
13.
Dailey Matthew N.; Joyce Carrie; Lyons Michael J.; Kamachi Miyuki; Ishi Hanae; Gyoba Jiro; Cottrell Garrison W. 《Canadian Metallurgical Quarterly》2010,10(6):874
Facial expressions are crucial to human social communication, but the extent to which they are innate and universal versus learned and culture dependent is a subject of debate. Two studies explored the effect of culture and learning on facial expression understanding. In Experiment 1, Japanese and U.S. participants interpreted facial expressions of emotion. Each group was better than the other at classifying facial expressions posed by members of the same culture. In Experiment 2, this reciprocal in-group advantage was reproduced by a neurocomputational model trained in either a Japanese cultural context or an American cultural context. The model demonstrates how each of us, interacting with others in a particular cultural context, learns to recognize a culture-specific facial expression dialect. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
14.
Emotion regulation and culture: Are the social consequences of emotion suppression culture-specific?
Emotional suppression has been associated with generally negative social consequences (Butler et al., 2003; Gross & John, 2003). A cultural perspective suggests, however, that these consequences may be moderated by cultural values. We tested this hypothesis in a two-part study, and found that, for Americans holding Western-European values, habitual suppression was associated with self-protective goals and negative emotion. In addition, experimentally elicited suppression resulted in reduced interpersonal responsiveness during face-to-face interaction, along with negative partner-perceptions and hostile behavior. These deleterious effects were reduced when individuals with more Asian values suppressed, and these reductions were mediated by cultural differences in the responsiveness of the suppressors. These findings suggest that many of suppression's negative social impacts may be moderated by cultural values. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
15.
A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
16.
Jakobs Esther; Manstead Antony S. R.; Fischer Agneta H. 《Canadian Metallurgical Quarterly》2001,1(1):51
Evidence for A. J. Fridlund's (e.g., 1994) "behavioral ecology view" of human facial expression comes primarily from studies of smiling in response to positive emotional stimuli. Smiling may be a special case because it clearly can, and often does, serve merely communicative functions. The present study was designed (a) to assess the generalizability of social context effects to facial expressions in response to negative emotional stimuli and (b) to examine whether these effects are mediated by social motives, as suggested by the behavioral ecology view. Pairs of friends or strangers viewed film clips that elicited different degrees of sad affect, in either the same or a different room; a control group participated alone. Dependent variables included facial activity, subjective emotion, and social motives. Displays of sadness were influenced by stimulus intensity and were lower in all social conditions than in the alone condition. Unexpectedly, social context effects were also found for smiling. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
17.
Reports an error in "Facial expressions of emotion influence memory for facial identity in an automatic way" by Arnaud D'Argembeau and Martial Van der Linden (Emotion, 2007[Aug], Vol 7[3], 507-515). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum. (The following abstract of the original article appeared in record 2007-11660-005.) Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
18.
Mienaltowski Andrew; Corballis Paul M.; Blanchard-Fields Fredda; Parks Nathan A.; Hilimire Matthew R. 《Canadian Metallurgical Quarterly》2011,26(1):224
Although positive and negative images enhance the visual processing of young adults, recent work suggests that a life-span shift in emotion processing goals may lead older adults to avoid negative images. To examine this tendency for older adults to regulate their intake of negative emotional information, the current study investigated age-related differences in the perceptual boost received by probes appearing over facial expressions of emotion. Visually-evoked event-related potentials were recorded from the scalp over cortical regions associated with visual processing as a probe appeared over facial expressions depicting anger, sadness, happiness, or no emotion. The activity of the visual system in response to each probe was operationalized in terms of the P1 component of the event-related potentials evoked by the probe. For young adults, the visual system was more active (i.e., greater P1 amplitude) when the probes appeared over any of the emotional facial expressions. However, for older adults, the visual system displayed reduced activity when the probe appeared over angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
19.
Infants' responsiveness to others' affective expressions was investigated in the context of a peekaboo game. Forty 4-month-olds participated in a peekaboo game in which the typical happy/surprised expression was systematically replaced with a different emotion, depending on group assignment. Infants viewed three typical peekaboo trials followed by a change (anger, fear, or sadness) or no-change (happiness/surprise) trial, repeated over two blocks. Infants' looking time and affective responsiveness were measured. Results revealed differential patterns of visual attention and affective responsiveness to each emotion. These results underscore the importance of contextual information for facilitating recognition of emotion expressions as well as the efficacy of using converging measures to assess such understanding. Infants as young as 4 months appear to discriminate and respond in meaningful ways to others' emotion expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
20.
In this study I used a temporal bisection task to test whether greater overestimation of time due to negative emotion is moderated by individual differences in negative emotionality. The effects of fearful facial expressions on time perception were also examined. After a training phase, participants estimated the duration of facial expressions (anger, happiness, fearfulness) and a neutral-baseline facial expression. Consistent with the operation of an arousal-based process, the duration of angry expressions was consistently overestimated relative to other expressions and the baseline condition. In support of a role for individual differences in negative emotionality on time perception, temporal bias due to angry and fearful expressions was positively correlated with individual differences in self-reported negative emotionality. The results are discussed in relation both to the literature on attentional bias to facial expressions in anxiety and fearfulness, and to the hypothesis that angry expressions evoke a fear-specific response. (PsycINFO Database Record (c) 2010 APA, all rights reserved)