Similar Articles
Found 20 similar articles (search time: 15 ms)
1.
Facial expressions are crucial to human social communication, but the extent to which they are innate and universal versus learned and culture dependent is a subject of debate. Two studies explored the effect of culture and learning on facial expression understanding. In Experiment 1, Japanese and U.S. participants interpreted facial expressions of emotion. Each group was better than the other at classifying facial expressions posed by members of the same culture. In Experiment 2, this reciprocal in-group advantage was reproduced by a neurocomputational model trained in either a Japanese cultural context or an American cultural context. The model demonstrates how each of us, interacting with others in a particular cultural context, learns to recognize a culture-specific facial expression dialect. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
The authors used connectionist modeling to extend previous research on emotion overgeneralization effects. Study 1 demonstrated that neutral expression male faces objectively resemble angry expressions more than female faces do, female faces objectively resemble surprise expressions more than male faces do, White faces objectively resemble angry expressions more than Black or Korean faces do, and Black faces objectively resemble happy and surprise expressions more than White faces do. Study 2 demonstrated that objective resemblance to emotion expressions influences trait impressions even when statistically controlling possible confounding influences of attractiveness and babyfaceness. It further demonstrated that emotion overgeneralization is moderated by face race and that racial differences in emotion resemblance contribute to White perceivers’ stereotypes of Blacks and Asians. These results suggest that intergroup relations may be strained not only by cultural stereotypes but also by adaptive responses to emotion expressions that are overgeneralized to groups whose faces subtly resemble particular emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders that affect both working memory and emotion perception are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Despite the fact that facial expressions of emotion have signal value, there is surprisingly little research examining how that signal can be detected under various conditions, because most judgment studies utilize full-face, frontal views. We remedy this by obtaining judgments of frontal and profile views of the same expressions displayed by the same expressors. We predicted that recognition accuracy when viewing faces in profile would be lower than when judging the same faces from the front. Contrary to this prediction, there were no differences in recognition accuracy as a function of view, suggesting that emotions are judged equally well regardless of the angle from which they are viewed. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

5.
Efficient navigation of our social world depends on the generation, interpretation, and combination of social signals within different sensory systems. However, the influence of healthy adult aging on multisensory integration of emotional stimuli remains poorly explored. This article comprises 2 studies that directly address issues of age differences on cross-modal emotional matching and explicit identification. The first study compared 25 younger adults (19–40 years) and 25 older adults (60–80 years) on their ability to match cross-modal congruent and incongruent emotional stimuli. The second study looked at performance of 20 younger (19–40) and 20 older adults (60–80) on explicit emotion identification when information was presented congruently in faces and voices or only in faces or in voices. In Study 1, older adults performed as well as younger adults on tasks in which congruent auditory and visual emotional information were presented concurrently, but there were age-related differences in matching incongruent cross-modal information. Results from Study 2 indicated that though older adults were impaired at identifying emotions from 1 modality (faces or voices alone), they benefited from congruent multisensory information as age differences were eliminated. The findings are discussed in relation to social, emotional, and cognitive changes with age. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
7.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Categorical perception (CP) occurs when continuously varying stimuli are perceived as belonging to discrete categories. As a result, perceivers are more accurate at discriminating between stimuli of different categories than between stimuli within the same category (Harnad, 1987; Goldstone, 1994). The current experiments investigated whether the structural information in the face is sufficient for CP to occur. Alternatively, a perceiver's conceptual knowledge, by virtue of expertise or verbal labeling, might contribute. In two experiments, people who differed in their conceptual knowledge (in the form of expertise, Experiment 1; or verbal label learning, Experiment 2) categorized chimpanzee facial expressions. Expertise alone did not facilitate CP. Only when perceivers first explicitly learned facial expression categories with a label were they more likely to show CP. Overall, the results suggest that the structural information in the face alone is often insufficient for CP; CP is facilitated by verbal labeling. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
P. Rozin and A. B. Cohen (see record 2003-02341-009) contend that confusion is an emotion because it is valenced, it has a distinct facial expression, and it has a distinct internal state. On the basis of these criteria, they call for further study of this unstudied state and challenge emotion researchers to consider "confusion" to be an emotion. The author agrees with Rozin and Cohen (2003) that confusion is an affective state, is valenced, has an (internal) object, may be expressed facially, and that laypersons may, under certain circumstances, consider it an emotion. However, its expression is likely to be an expressive component of emotions for which goal obstruction is central. Further, confusion may not be as commonly considered an emotion by laypersons as Rozin and Cohen contend. Finally, confusion is not unstudied; it is simply that, most of the time, the researchers studying it are not emotion researchers. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Face recognition is thought to rely on configural visual processing. Where face recognition impairments have been identified, qualitatively delayed or anomalous configural processing has also been found. A group of women with Turner syndrome (TS) with monosomy for a single maternal X chromosome (45, Xm) showed an impairment in face recognition skills compared with normally developing women. However, normal configural face-processing abilities were apparent. The ability to recognize facial expressions of emotion, particularly fear, was also impaired in this TS subgroup. Face recognition and fear recognition accuracy were significantly correlated in the female control group but not in women with TS. The authors therefore suggest that anomalies in amygdala function may be a neurological feature of TS of this karyotype. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Objective: Difficulties in communication and social relationships present a formidable challenge for many people after traumatic brain injury (TBI). These difficulties are likely to be partially attributable to problems with emotion perception. Mounting evidence shows facial affect recognition to be particularly difficult after TBI. However, no attempt has been made to systematically estimate the magnitude of this problem or the frequency with which it occurs. Method: A meta-analysis is presented examining the magnitude of facial affect recognition difficulties after TBI. From this, the frequency of these impairments in the TBI population is estimated. Effect sizes were calculated from 13 studies that compared adults with moderate to severe TBI to matched healthy controls on static measures of facial affect recognition. Results: The studies collectively presented data from 296 adults with TBI and 296 matched controls. The overall weighted mean effect size for the 13 studies was −1.11, indicating people with TBI on average perform about 1.1 SD below healthy peers on measures of facial affect recognition. Based on estimation of the TBI population standard deviation and modeling of likely distribution shape, it is estimated that between 13% and 39% of people with moderate to severe TBI may have significant difficulties with facial affect recognition, depending on the cut-off criterion used. Conclusion: This is clearly an area that warrants attention, particularly examining techniques for the rehabilitation of these deficits. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
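The prevalence estimate in this abstract can be sketched numerically. In the sketch below, the per-study effect sizes and sample sizes are hypothetical placeholders (the abstract does not list the 13 individual study values), the weighting is a simple sample-size scheme rather than the meta-analysis's actual method, and TBI scores are assumed normally distributed in control-SD units:

```python
import math

# Hypothetical per-study (effect size d, group sample size n) pairs;
# the real 13 study values are not reported in the abstract.
studies = [(-1.0, 20), (-1.2, 25), (-1.1, 30)]

# Weighted mean effect size, weighting each study by its sample size.
total_n = sum(n for _, n in studies)
weighted_d = sum(d * n for d, n in studies) / total_n

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Prevalence estimate: assume TBI scores are normal with mean shifted
# by weighted_d (in control-SD units) and SD 1. The proportion falling
# below a clinical cut-off is the normal CDF at (cutoff - weighted_d).
cutoff = -2.0  # "significant difficulty" = 2 SD below the control mean
prevalence = normal_cdf(cutoff - weighted_d)

print(f"weighted mean d = {weighted_d:.2f}")
print(f"estimated prevalence below cut-off = {prevalence:.1%}")
```

Moving the cut-off criterion (e.g., 1.5 SD vs. 2.5 SD below the control mean) shifts the estimated proportion, which is how a range such as the abstract's 13%–39% can arise from a single mean effect size.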

12.
Reports an error in "Facial expressions of emotion influence memory for facial identity in an automatic way" by Arnaud D'Argembeau and Martial Van der Linden (Emotion, 2007[Aug], Vol 7[3], 507-515). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum. (The following abstract of the original article appeared in record 2007-11660-005.) Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Previous choice reaction time studies have provided consistent evidence for faster recognition of positive (e.g., happy) than negative (e.g., disgusted) facial expressions. A predominance of positive emotions in normal contexts may partly explain this effect. The present study used pleasant and unpleasant odors to test whether emotional context affects the happy face advantage. Results from 2 experiments indicated that happiness was recognized faster than disgust in a pleasant context, but this advantage disappeared in an unpleasant context because of the slow recognition of happy faces. Odors may modulate the functioning of those emotion-related brain structures that participate in the formation of the perceptual representations of the facial expressions and in the generation of the conceptual knowledge associated with the signaled emotion. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
The anteromedial temporal lobe has been found to participate in processing emotion, but there are unresolved discrepancies in the literature. To address this issue, the authors investigated recognition of emotion from faces and from prosody in 26 participants with unilateral temporal lobectomy (15 left, 11 right) and in 50 brain-damaged controls. Participants with right, but not left, temporal lobectomy did significantly worse in recognizing fear from facial expressions. There were no group differences in recognizing emotional prosody. Neither IQ nor basic perceptual function accounted for task performance; however, there was a moderate negative correlation between extent of amygdala damage and overall performance. Consistent with some prior studies, these findings support a role for the right anteromedial temporal lobe (including amygdala) in recognizing emotion from faces but caution in drawing conclusions from small group samples. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Two studies tested the hypothesis that in judging people's emotions from their facial expressions, Japanese, more than Westerners, incorporate information from the social context. In Study 1, participants viewed cartoons depicting a happy, sad, angry, or neutral person surrounded by other people expressing the same emotion as the central person or a different one. The surrounding people's emotions influenced Japanese but not Westerners' perceptions of the central person. These differences reflect differences in attention, as indicated by eye-tracking data (Study 2): Japanese looked at the surrounding people more than did Westerners. Previous findings on East-West differences in contextual sensitivity generalize to social contexts, suggesting that Westerners see emotions as individual feelings, whereas Japanese see them as inseparable from the feelings of the group. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
In this set of studies, we examine the perceptual similarities between emotions that share either a valence or a motivational direction. Determination is a positive approach-related emotion, whereas anger is a negative approach-related emotion. Thus, determination and anger share a motivational direction but are opposite in valence. An implemental mind-set has previously been shown to produce high-approach-motivated positive affect. Thus, in Study 1, participants were asked to freely report the strongest emotion they experienced during an implemental mind-set. The most common emotion reported was determination. On the basis of this result, we compared the facial expression of determination with that of anger. In Study 2, naive judges were asked to identify photographs of facial expressions intended to express determination, along with photographs intended to express basic emotions (joy, anger, sadness, fear, disgust, neutral). Correct identifications of intended determination expressions were correlated with misidentifications of the expressions as anger but not with misidentifications as any other emotion. This suggests that determination, a high-approach-motivated positive affect, is perceived as similar to anger. In Study 3, naive judges quantified the intensity of joy, anger, and determination expressed in photographs. The intensity of perceived determination was directly correlated with the intensity of perceived anger (a high-approach-motivated negative affect) and was inversely correlated with the intensity of perceived joy (a low-approach-motivated positive affect). These results demonstrate perceptual similarity between emotions that share a motivational direction but differ in valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
P. Rozin and A. B. Cohen's (see record 2003-02341-009) method of sending students out to observe each other in familiar circumstances undoubtedly exaggerated the apparent prevalence of confusion, concentration, and worry. The expressions they observed probably ranged from regulatory feedback and communicative signals to expressions of the "intellectual emotions" described by C. Darwin (1872/1965). Appraisal theories can easily accommodate these affective states; there is no need to postulate new "basic emotions" unless one adheres to a rigid categorical view of emotion. Finally, Rozin and Cohen have made a valuable contribution by reminding us of the importance of emotions related to interest. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Collaboration with a local newspaper "yielded a new set of pictures of facial expressions and enabled us to collect judgments on these expressions from 189 newspaper readers." Data based on responses of the 189 readers to each of the 16 posed pictures and data based on responses of 96 college students are presented in a table. The "agreement between the newspaper readers and the students is striking for both medians and quartiles." Collaboration with both local and national newspapers "might supply a very useful population for a wide variety of research problems." (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
We present an overview of a new multidisciplinary research program that focuses on haptic processing of human facial identity and facial expressions of emotion. A series of perceptual and neuroscience experiments with live faces and/or rigid three-dimensional facemasks is outlined. To date, several converging methodologies have been adopted: behavioural experimental studies with neurologically intact participants, neuropsychological behavioural research with prosopagnosic individuals, and neuroimaging studies using fMRI techniques. In each case, we have asked what would happen if the hands were substituted for the eyes. We confirm that humans can haptically determine both identity and facial expressions of emotion in facial displays at levels well above chance. Clearly, face processing is a bimodal phenomenon. The processes and representations that underlie such patterns of behaviour are also considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
In this article, the authors elaborate on 3 ideas advanced in P. Rozin and A. B. Cohen's (see record 2003-02341-009) innovative study of facial expression. Taking a cue from their discovery of new expressive behaviors (e.g., the narrowed eyebrows), the authors review recent studies showing that emotions are conveyed in more channels than usually studied, including posture, gaze patterns, voice, and touch. Building on their claim that confusion has a distinct display, the authors review evidence showing distinct displays for 3 self-conscious emotions (embarrassment, shame, and pride), 5 positive emotions (amusement, desire, happiness, love, interest), and sympathy and compassion. Finally, the authors offer a functional definition of emotion to integrate these findings on "new" displays and emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © Beijing Qinyun Technology Development Co., Ltd. | 京ICP备09084417号