Similar Articles (20 results)
1.
Preschool children, 2 to 5 years of age, and adults posed the six facial expressions of happiness, surprise, anger, fear, sadness, and disgust before a videotape camera. Their poses were subsequently scored using the MAX system. The frequency of poses that included all components of the target expression (complete expressions) and of poses that included only some of those components (partial expressions) was analyzed. Results indicated that 2-year-olds as a group failed to pose any face. Three-year-olds were a transitional group, posing happiness and surprise expressions but none of the remaining faces to any degree. Four- and 5-year-olds were similar to one another and differed from adults only on surprise and anger expressions, both of which adults were able to pose. No group, including adults, posed fear and disgust well. Posing of happiness showed no change after 3 years of age. Consistent differences between partial and complete poses were observed, particularly for the negative expressions of sadness, fear, and disgust. Implications of these results for socialization theories of emotion are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
The authors compared the accuracy of emotion decoding for nonlinguistic affect vocalizations, speech-embedded vocal prosody, and facial cues representing 9 different emotions. Participants (N = 121) decoded 80 stimuli from 1 of the 3 channels. Accuracy scores for nonlinguistic affect vocalizations and facial expressions were generally equivalent, and both were higher than scores for speech-embedded prosody. In particular, affect vocalizations showed superior decoding over the speech stimuli for anger, contempt, disgust, fear, joy, and sadness. Further, specific emotions that were decoded relatively poorly through speech-embedded prosody were more accurately identified through affect vocalizations, suggesting that emotions that are difficult to communicate in running speech can still be expressed vocally through other means. Affect vocalizations also showed superior decoding over faces for anger, contempt, disgust, fear, sadness, and surprise. Facial expressions showed superior decoding scores over both types of vocal stimuli for joy, pride, embarrassment, and “neutral” portrayals. Results are discussed in terms of the social functions served by various forms of nonverbal emotion cues and the communicative advantages of expressing emotions through particular channels.

3.
Studied the development of the recognition of emotional facial expressions in children and of the factors influencing recognition accuracy. 80 elementary school students (aged 5–8 yrs) were asked to identify the emotions expressed in a series of facial photographs. Recognition performances were analyzed in relation to the type of emotion expressed (i.e., happiness, fear, anger, surprise, sadness, or disgust) and the intensity of the emotional expression. Age differences were determined. (English abstract)

4.
Recent research has shown that pride, like the "basic" emotions of anger, disgust, fear, happiness, sadness, and surprise, has a distinct, nonverbal expression that can be recognized by adults (J. L. Tracy & R. W. Robins, 2004b). In 2 experiments, the authors examined whether young children can identify the pride expression and distinguish it from expressions of happiness and surprise. Results suggest that (a) children can recognize pride at above-chance levels by age 4 years; (b) children recognize pride as well as they recognize happiness; (c) pride recognition, like happiness and surprise recognition, improves from age 3 to 7 years; and (d) children's ability to recognize pride cannot be accounted for by the use of a process of elimination (i.e., an exclusion rule) to identify an unknown entity. These findings have implications for the development of emotion recognition and children's ability to perceive and communicate pride.

5.
The ability to perceive and interpret facial expressions of emotion improves throughout childhood. Although newborns have rudimentary perceptive abilities allowing them to distinguish several facial expressions, it is only at the end of the first year that infants seem to be able to assign meaning to emotional signals. The meaning infants assign to facial expressions is very broad, as it is limited to the judgment of emotional valence. Meaning becomes more specific between the second and the third year of life, as children begin to categorize facial signals in terms of discrete emotions. While the facial expressions of happiness, anger and sadness are accurately categorized by the third year, the categorization of expressions of fear, surprise and disgust shows a much slower developmental pattern. Moreover, the ability to judge the sincerity of facial expressions shows a slower developmental pattern, probably because of the subtle differences between genuine and non-genuine expressions. The available evidence indicates that school age children can distinguish genuine smiles from masked smiles and false smiles.

6.
Facial expression is heralded as a communication system common to all human populations, and thus is generally accepted as a biologically based, universal behavior. Happiness, sadness, fear, anger, surprise, and disgust are universally recognized and produced emotions, and communication of these states is deemed essential in order to navigate the social environment. It is puzzling, however, how individuals are capable of producing similar facial expressions when facial musculature is known to vary greatly among individuals. Here, the authors show that although some facial muscles are not present in all individuals, and often exhibit great asymmetry (larger or absent on one side), the facial muscles that are essential in order to produce the universal facial expressions exhibited 100% occurrence and showed minimal gross asymmetry in 18 cadavers. This explains how universal facial expression production is achieved, implies that facial muscles have been selected for essential nonverbal communicative function, and yet also accommodate individual variation.

7.
We investigated adults' voluntary control of 20 facial action units theoretically associated with 6 basic emotions (happiness, fear, anger, surprise, sadness, and disgust). Twenty young adults were shown video excerpts of facial action units and asked to reproduce them as accurately as possible. Facial Action Coding System (FACS; Ekman & Friesen, 1978a) coding of the facial productions showed that young adults succeeded in activating 18 of the 20 target action units, although they often coactivated other action units. Voluntary control was clearly better for some action units than for others, with a pattern of differences between action units consistent with previous work in children and adolescents.

8.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed.

9.
Very few large-scale studies have focused on emotional facial expression recognition (FER) in 3-year-olds, an age of rapid social and language development. We studied FER in 808 healthy 3-year-olds using verbal and nonverbal computerized tasks for four basic emotions (happiness, sadness, anger, and fear). Three-year-olds showed differential performance on the verbal and nonverbal FER tasks, especially with respect to fear. That is to say, fear was one of the most accurately recognized facial expressions as matched nonverbally and the least accurately recognized facial expression as labeled verbally. Sex did not influence emotion-matching or emotion-labeling performance after adjusting for basic matching or labeling ability. Three-year-olds made systematic errors in emotion-labeling. Namely, happy expressions were often confused with fearful expressions, whereas negative expressions were often confused with other negative expressions. Together, these findings suggest that 3-year-olds' FER skills strongly depend on task specifications. Importantly, fear was the most sensitive facial expression in this regard. Finally, in line with previous studies, we found that recognized emotion categories are initially broad, including emotions of the same valence, as reflected in the nonrandom errors of 3-year-olds.

10.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication.

11.
Photographs (study 1) or line-drawing representations (study 2) of posed facial expressions and a list of emotion words (happiness, surprise, fear, disgust, anger, sadness, neutral) were presented to two groups of observers who were asked to match the photographs or line drawings, respectively, with the emotion categories provided. A multidimensional-scaling procedure was applied to the judgment data. Two dimensions were revealed: pleasantness–unpleasantness and upper-face–lower-face dominance. Furthermore, the similarity shown by the two-dimensional structures derived first from the judgments of photographs and second from the line drawings suggests that line drawings are a viable alternative to photographs in facial-expression research.

12.
Emotion theorists assume certain facial displays to convey information about the expresser's emotional state. In contrast, behavioral ecologists assume them to indicate behavioral intentions or action requests. To test these contrasting positions, over 2,000 online participants were presented with facial expressions and asked what they revealed: feeling states, behavioral intentions, or action requests. The majority of the observers chose feeling states as the message of facial expressions of disgust, fear, sadness, happiness, and surprise, supporting the emotions view. Only the anger display tended to elicit more choices of behavioral intention or action request, partially supporting the behavioral ecology view. The results support the view that facial expressions communicate emotions, with emotions being multicomponential phenomena that comprise feelings, intentions, and wishes.

13.
Studied differences between dimensional and categorical judgments of static and dynamic spontaneous facial expressions of emotion. In the 1st part of the study, 25 university students were presented with either static or dynamic facial expressions of emotions (i.e., joy, fear, anger, surprise, disgust, and sadness) and asked to evaluate the similarity of 21 pairs of stimuli on a 7-point scale. Results were analyzed using a multidimensional scaling procedure. In the 2nd part of the study, Ss were asked to categorize the expressed emotions according to their intensity. Differences in the categorization of static and dynamic stimuli were analyzed. Results from the similarity rating task and the categorization task were compared. (English abstract)

14.
In an experiment with 20 undergraduates, video recordings of actors' faces covered with black makeup and white spots were played back to the Ss so that only the white spots were visible. The results demonstrate that moving displays of happiness, sadness, fear, surprise, anger, and disgust were recognized more accurately than static displays of the white spots at the apex of the expressions. This indicates that facial motion, in the absence of information about the shape and position of facial features, is informative about these basic emotions. Normally illuminated dynamic displays of these expressions, however, were recognized more accurately than displays of moving spots. The relative effectiveness of upper and lower facial areas for the recognition of the 6 emotions was also investigated using normally illuminated and spots-only displays. In both instances, the results indicate that different facial regions are more informative for different emotions. (20 ref)

15.
Five studies investigated the young infant's ability to produce identifiable emotion expressions as defined in differential emotions theory. Trained judges applied emotion-specific criteria in selecting expression stimuli from videotape recordings of 54 1–9 mo old infants' responses to a variety of incentive events, ranging from playful interactions to the pain of inoculations. Four samples of untrained Ss (130 undergraduates and 62 female health service professionals) confirmed the social validity of infants' emotion expressions by reliably identifying expressions of interest, joy, surprise, sadness, anger, disgust, contempt, and fear. Brief training resulted in significant increases in the accuracy of discrimination of infants' negative emotion expressions for low-accuracy Ss. Construct validity for the 8 emotion expressions identified by untrained Ss and for a consistent pattern of facial responses to unanticipated pain was provided by expression identifications derived from an objective, theoretically structured, anatomically based facial movement coding system. (21 ref)

16.
Investigated the hypothesis that reading difficulties of learning disabled children are attributable to deficiencies in verbal encoding. Adopting a probe-type serial memory task, 60 normal and learning disabled readers matched on CA (9 yrs old), IQ, and sex were compared on recall performance after pretraining of named and unnamed stimulus conditions. The named condition for normal readers was superior in terms of recall performance. Consistent with the findings of F. Vellutino et al (1972, 1973, 1975), no difference was found in recall of nonverbal stimuli between normal and learning disabled readers. These data suggest that primary reading deficits in learning disabled children are related to verbal encoding deficiencies (visual–verbal integration) and not to deficiencies of visual memory, as suggested by the perceptual deficit hypothesis. (19 ref)

17.
Two studies provided direct support for a recently proposed dialect theory of communicating emotion, positing that expressive displays show cultural variations similar to linguistic dialects, thereby decreasing accurate recognition by out-group members. In Study 1, 60 participants from Quebec and Gabon posed facial expressions. Dialects, in the form of activating different muscles for the same expressions, emerged most clearly for serenity, shame, and contempt and also for anger, sadness, surprise, and happiness, but not for fear, disgust, or embarrassment. In Study 2, Quebecois and Gabonese participants judged these stimuli and stimuli standardized to erase cultural dialects. As predicted, an in-group advantage emerged for nonstandardized expressions only and most strongly for expressions with greater regional dialects, according to Study 1.

18.
A total of 74 Ss were induced to adopt expressions of fear, anger, disgust, and sadness in Experiment 1. Each expression significantly increased feelings of its particular emotion compared with at least two of the others, a result that cannot be explained by a single dimension. Postures are assumed to play the same role in emotional experience as facial expressions; however, the demonstrated effects of postures (Riskind, 1984) could also represent a single dimension of variation. In Experiment 2, subjects were induced to adopt postures characteristic of fear, anger, and sadness. Again, the effects were specific to the postures. These two studies indicate that emotional behavior produces changes in feelings that specifically match the behavior.

19.
Prior studies provide consistent evidence of deficits for psychopaths in processing verbal emotional material but are inconsistent regarding nonverbal emotional material. To examine whether psychopaths exhibit general versus specific deficits in nonverbal emotional processing, 34 psychopaths and 33 nonpsychopaths identified with Hare's (R. D. Hare, 1991) Psychopathy Checklist-Revised were asked to complete a facial affect recognition test. Slides of prototypic facial expressions were presented. Three hypotheses regarding hemispheric lateralization anomalies in psychopaths were also tested (right-hemisphere dysfunction, reduced lateralization, and reversed lateralization). Psychopaths were less accurate than nonpsychopaths at classifying facial affect under conditions promoting reliance on right-hemisphere resources and displayed a specific deficit in classifying disgust. These findings demonstrate that psychopaths exhibit specific deficits in nonverbal emotional processing.

20.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed.
