Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
The ability to perceive and interpret facial expressions of emotion improves throughout childhood. Although newborns have rudimentary perceptual abilities allowing them to distinguish several facial expressions, it is only at the end of the first year that infants seem able to assign meaning to emotional signals. The meaning infants assign to facial expressions is very broad, as it is limited to judgments of emotional valence. Meaning becomes more specific between the second and third year of life, as children begin to categorize facial signals in terms of discrete emotions. While the facial expressions of happiness, anger and sadness are accurately categorized by the third year, the categorization of expressions of fear, surprise and disgust shows a much slower developmental pattern. The ability to judge the sincerity of facial expressions develops more slowly still, probably because of the subtle differences between genuine and non-genuine expressions. The available evidence indicates that school-age children can distinguish genuine smiles from masked smiles and false smiles. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
In Exp I, photos of 15 female target persons posing happy, neutral, and sad facial expressions were rated by 257 undergraduates for facial attractiveness using paired comparisons and Likert scales. Half of the raters were instructed to compensate for the effects of facial expression. Paired comparisons and Likert ratings were highly correlated. Target persons were less attractive when posing sad expressions than when posing neutral or happy expressions, which did not differ. In addition, independent ratings of 4 dimensions of the target persons' facial expression were obtained: pleasantness, surprise, intensity, and naturalness. Changes in these dimensions from the neutral to the happy and sad expressions and the corresponding changes in attractiveness were consistently related only to pleasantness, supporting the reinforcement-affect theory of attraction. Exp II, with 21 male undergraduates, related overall attractiveness to facial and bodily attractiveness. Both facial and bodily attractiveness were predictive of overall attractiveness, but the face was a slightly more powerful predictor. Results are discussed with respect to the stability of physical attractiveness, and alternative explanations of the mental-illness/physical-unattractiveness relation are proposed. (54 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Preschool children, 2 to 5 years of age, and adults posed the six facial expressions of happiness, surprise, anger, fear, sadness, and disgust before a videotape camera. Their poses were scored subsequently using the MAX system. The number of poses that included all components of the target expression (complete expressions) as well as the frequency of those that included only some of the components of the target expressions (partial expressions) were analyzed. Results indicated that 2-year-olds as a group failed to pose any face. Three-year-olds were a transitional group, posing happiness and surprise expressions but none of the remaining faces to any degree. Four- and 5-year-olds were similar to one another and differed from adults only on surprise and anger expressions. Adults were able to pose both these expressions. No group, including adults, posed fear and disgust well. Posing of happiness showed no change after 3 years of age. Consistent differences between partial and complete poses were observed particularly for the negative expressions of sadness, fear, and disgust. Implications of these results for socialization theories of emotion are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
The ability to imitate facial expressions was examined in 26 newborns. Each subject observed a model showing tongue protrusion or a happy, sad, or surprised face. The frequencies of reproduction of a modeled act were compared with the average frequencies of the act during periods when other actions were modeled. A trials-to-criterion design was used. When infants observed the emotional facial expressions (happy, sad, surprise), they often responded by opening their mouths or showing lip pouts, but did not show imitative-like matching of these modeled expressions. However, when tongue protrusion was modeled, the infants did reproduce the modeled gesture. These data raise the question of whether the infants' responses to modeled facial expressions reflect true imitation, stimulus-evoked elicitation, or a stereotyped "facial gesture." (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
In this study, the authors demonstrated that 6-month-old infants are able to categorize natural, 650 Hz low-pass filtered infant-directed utterances. In Experiment 1, 24 male and 24 female infants heard 7 different tokens from 1 class of utterance (comforting or approving). Then, some infants heard a novel test stimulus from the familiar class of tokens; others heard a test stimulus from the unfamiliar class. Infants categorized these tokens as evidenced by response recovery to tokens from the unfamiliar class but not to novel tokens from the familiar class. Experiment 2 confirmed that the infants were able to discriminate between closely matched tokens from within each category, supporting the conclusion that the results of Experiment 1 indicated categorization. The authors discuss both a mechanism that might explain the development of this ability and the mutual adaptation seen in parent–infant communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
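Low-pass filtering speech at 650 Hz, as in the study above, strips most phonetic detail while preserving prosodic contour. The study's exact filtering procedure is not given here; as an illustration only, a crude single-pole low-pass filter can be sketched in pure Python (sampling rate and test frequencies are arbitrary choices for the demo):

```python
import math

def lowpass(samples, cutoff_hz, fs):
    """Single-pole RC low-pass filter: passes frequencies well below
    cutoff_hz, attenuates those above. A stand-in for the 650 Hz
    low-pass filtering described in the abstract, not the study's method."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / fs
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)   # exponential smoothing step
        out.append(y)
    return out

fs = 16000                                   # 16 kHz sampling rate
t = [i / fs for i in range(fs // 10)]        # 100 ms of signal
low = [math.sin(2 * math.pi * 200 * x) for x in t]    # 200 Hz: below cutoff
high = [math.sin(2 * math.pi * 5000 * x) for x in t]  # 5 kHz: above cutoff

def amp(s):
    """Steady-state amplitude: peak of the second half of the signal."""
    return max(abs(v) for v in s[len(s) // 2:])

print(amp(lowpass(low, 650, fs)) > 5 * amp(lowpass(high, 650, fs)))  # prints True
```

The 200 Hz component passes nearly unattenuated while the 5 kHz component is strongly suppressed, which is the sense in which low-pass speech keeps melody but loses words.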

7.
Three studies tested whether infant facial expressions selected to fit Max formulas (C. E. Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress, but Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, the facial muscle components of faces shown in Studies 1 and 2 were coded with the Facial Action Coding System (FACS; P. Ekman and W. V. Friesen, 1978) and Baby FACS (H. Oster and D. Rosenstein, in press). Only 3 of 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes. Results indicate that negative affect expressions are not fully differentiated in infants and that empirical studies of infant facial expressions are needed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

9.
Emotion researchers often categorize angry and fearful face stimuli as "negative" or "threatening". Perception of fear and anger, however, appears to be mediated by dissociable neural circuitries, and the two expressions often elicit distinguishable behavioral responses. The authors sought to elucidate whether viewing anger and fear expressions produces dissociable psychophysiological responses (i.e., the startle reflex). The results of two experiments using different facial stimulus sets (representing anger, fear, neutral, and happy) indicated that viewing anger was associated with a significantly heightened startle response (p

10.
The literature on infants' perception of facial and vocal expressions, combined with data from studies on infant-directed speech, mother-infant interaction, and social referencing, supports the view that infants come to recognize the affective expressions of others through a perceptual differentiation process. Recognition of affective expressions changes from a reliance on multimodally presented information to the recognition of vocal expressions and then of facial expressions alone. Face or voice properties become differentiated and discriminated from the whole, standing for the entire emotional expression. Initially, infants detect information that potentially carries the meaning of emotional expressions; only later do infants discriminate and then recognize those expressions. The author reviews data supporting this view and draws parallels between the perceptions of affective expressions and of speech. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Examined intermodal perception of vocal and facial expressions in 2 experiments with 16 5- and 16 7-mo-olds. Two filmed facial expressions were presented with a single vocal expression characteristic of 1 of the facial expressions (angry or happy). The lower third of each face was obscured, so Ss could not simply match lip movements to the voice. Overall findings indicate that only 7-mo-olds increased their fixation to a facial expression when it was sound-specified. Older infants evidently detected information that was invariant across the presentations of a single affective expression, despite degradation of temporal synchrony information. The 5-mo-olds' failure to look differentially is explained by the possibilities that (1) 5-mo-olds may need to see the whole face for any discrimination of expressions to occur; (2) they cannot discriminate films of happy and angry facial expressions even with the full face available; or (3) they rely heavily on temporal information for the discrimination of facial expressions and/or the intermodal perception of bimodally presented expressions, although not for articulatory patterns. Preferences for a particular expression were not found: Infants did not look longer at the happy or the angry facial expression, independent of the sound manipulation, suggesting that preferences for happy expressions found in prior studies may rest on attention to the "toothy" smile. (25 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
Studied the effect of maternal facial expressions of emotion on 108 12-mo-old infants in 4 studies. The deep side of a visual cliff was adjusted to a height that produced no clear avoidance and much referencing of the mother. In Study 1, 19 Ss viewed a facial expression of joy, while 17 Ss viewed one of fear. In Study 2, 15 Ss viewed interest, while 18 Ss viewed anger. In Study 3, 19 Ss viewed sadness. In Study 4, 23 Ss were used to determine whether the expressions influenced Ss' evaluation of an ambiguous situation or whether they were effective in controlling behavior merely because of their discrepancy or unexpectedness. Results show that Ss used facial expressions to disambiguate situations. If a mother posed joy or interest while S referenced, most Ss crossed the deep side. If a mother posed fear or anger, few Ss crossed. In the absence of any depth whatsoever, few Ss referenced the mother and those who did, while the mother was posing fear, hesitated but crossed nonetheless. It is suggested that facial expressions regulate behavior most clearly in contexts of uncertainty. (17 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Previous research has demonstrated that particular facial expressions more readily acquire excitatory strength when paired with a congruent unconditioned stimulus than when paired with an incongruent outcome. The present study with a total of 36 undergraduates extends these findings on the excitatory/inhibitory role of facial expressions by demonstrating that particular facial expressions (fear and happy), when paired with a neutral cue (tone), can influence conditioning to the neutral conditioned stimulus (CS). Ss who had a fear expression paired with the neutral CS responded more to the fear expression than to the neutral CS, whereas Ss who had a happy expression paired with the neutral CS responded more to the neutral cue than to the happy expression. These findings strongly support predictions from "overshadowing" or "blocking" models of classical conditioning. (12 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
The amygdala is thought to play a crucial role in emotional and social behaviour. Animal studies implicate the amygdala in both fear conditioning and face perception. In humans, lesions of the amygdala can lead to selective deficits in the recognition of fearful facial expressions and impaired fear conditioning, and direct electrical stimulation evokes fearful emotional responses. Here we report direct in vivo evidence of a differential neural response in the human amygdala to facial expressions of fear and happiness. Positron-emission tomography (PET) measures of neural activity were acquired while subjects viewed photographs of fearful or happy faces, varying systematically in emotional intensity. The neuronal response in the left amygdala was significantly greater to fearful as opposed to happy expressions. Furthermore, this response showed a significant interaction with the intensity of emotion (increasing with increasing fearfulness, decreasing with increasing happiness). The findings provide direct evidence that the human amygdala is engaged in processing the emotional salience of faces, with a specificity of response to fearful facial expressions.

15.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Examined the influence of changes in facial expression on physiological and self-report measures of emotion. In Exp I, 27 undergraduates portrayed facial expressions associated with being afraid, calm, and normal. Portraying fear produced increases in pulse rate and skin conductance relative to portraying either calm or normal, but posing had no effect on subjective reports of anxiety (Affect Adjective Check List). In Exp II, 38 Ss listened to loud or soft noise while changing their expressions to portray fear, happiness, or calmness. Portraying either fear or happiness produced greater arousal than remaining calm. Changes in facial expression failed to affect self-reports of noise loudness. Results suggest that changes in facial expression influence physiological responses through the movement involved in posing and may not influence self-reports of emotion at all. (18 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Responses to mothers' presentations of happy, sad, and angry faces were studied in a sample of 12 infants, 6 boys and 6 girls at age 10 weeks ± 5 days. Each infant's mother displayed noncontingent, practiced facial and vocal expressions of the 3 emotions. Each expression occurred 4 times, with a 20-s head-turn-away between presentations. The orders of presentation were randomly assigned within sex of infant. Mothers' and infants' facial behaviors were coded using the Maximally Discriminative Facial Movement Coding System. The data indicated that (a) the infants discriminated each emotion, (b) apparent matching responses may occur under some conditions but not all, and (c) these apparent matching responses were only a part of nonrandom behavior patterns indicating induced emotional or affective responses of infants to mothers' expressions. (36 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Infants' responsiveness to others' affective expressions was investigated in the context of a peekaboo game. Forty 4-month-olds participated in a peekaboo game in which the typical happy/surprised expression was systematically replaced with a different emotion, depending on group assignment. Infants viewed three typical peekaboo trials followed by a change (anger, fear, or sadness) or no-change (happiness/surprise) trial, repeated over two blocks. Infants' looking time and affective responsiveness were measured. Results revealed differential patterns of visual attention and affective responsiveness to each emotion. These results underscore the importance of contextual information for facilitating recognition of emotion expressions as well as the efficacy of using converging measures to assess such understanding. Infants as young as 4 months appear to discriminate and respond in meaningful ways to others' emotion expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
A new model of mental representation is applied to social cognition: the attractor field model. Using the model, the authors predicted and found a perceptual advantage but a memory disadvantage for faces displaying evaluatively congruent expressions. In Experiment 1, participants completed a same/different perceptual discrimination task involving morphed pairs of angry-to-happy Black and White faces. Pairs of faces displaying evaluatively incongruent expressions (i.e., happy Black, angry White) were more likely to be labeled as similar and were less likely to be accurately discriminated from one another than faces displaying evaluatively congruent expressions (i.e., angry Black, happy White). Experiment 2 replicated this finding and showed that objective discriminability of stimuli moderated the impact of attractor field effects on perceptual discrimination accuracy. In Experiment 3, participants completed a recognition task for angry and happy Black and White faces. Consistent with the attractor field model, memory accuracy was better for faces displaying evaluatively incongruent expressions. Theoretical and practical implications of these findings are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
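Morphed angry-to-happy face pairs like those in Experiment 1 are typically built by warping facial geometry and then cross-dissolving pixel intensities between the two expression endpoints. As a sketch of the cross-dissolve component only, with hypothetical toy pixel values (the study's stimulus pipeline is not described here):

```python
def cross_dissolve(img_a, img_b, t):
    """Linearly blend two equal-size grayscale images: t=0 returns img_a,
    t=1 returns img_b, intermediate t gives a morph-continuum step.
    Real face morphing also warps geometry; this is only the intensity blend."""
    if len(img_a) != len(img_b):
        raise ValueError("images must be the same size")
    return [(1 - t) * a + t * b for a, b in zip(img_a, img_b)]

angry = [10, 40, 200]   # toy pixel intensities for the "angry" endpoint
happy = [30, 80, 100]   # toy pixel intensities for the "happy" endpoint
print(cross_dissolve(angry, happy, 0.5))  # midpoint morph: [20.0, 60.0, 150.0]
```

Stepping t in small increments yields the graded angry-to-happy continuum used in same/different discrimination tasks.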

20.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
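The signal detection analyses mentioned above separate recognition sensitivity from response bias; the standard sensitivity index is d′ = z(hit rate) − z(false-alarm rate), where z is the inverse of the standard normal CDF. A minimal sketch with made-up rates (not the study's data), using only the Python standard library:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index d' = z(H) - z(FA) from signal detection theory.
    Rates must be strictly between 0 and 1 (corrections such as the
    log-linear adjustment are applied in practice for rates of 0 or 1)."""
    z = NormalDist().inv_cdf   # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical example: 80% hits and 20% false alarms on a surprise
# recognition task give a sensitivity of about 1.68.
print(round(d_prime(0.80, 0.20), 3))  # → 1.683
```

Higher d′ for angry faces in one group than another, with bias held apart, is the kind of comparison reported above.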


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号