Similar Documents
20 similar documents found (search time: 31 ms)
1.
Examined intermodal perception of vocal and facial expressions in 2 experiments with 16 5- and 16 7-mo-olds. Two filmed facial expressions were presented with a single vocal expression characteristic of 1 of the facial expressions (angry or happy). The lower third of each face was obscured, so Ss could not simply match lip movements to the voice. Overall findings indicate that only 7-mo-olds increased their fixation to a facial expression when it was sound-specified. Older infants evidently detected information that was invariant across the presentations of a single affective expression, despite degradation of temporal synchrony information. The 5-mo-olds' failure to look differentially is explained by the possibilities that (1) 5-mo-olds may need to see the whole face for any discrimination of expressions to occur; (2) they cannot discriminate films of happy and angry facial expressions even with the full face available; or (3) they rely heavily on temporal information for the discrimination of facial expressions and/or the intermodal perception of bimodally presented expressions, although not for articulatory patterns. Preferences for a particular expression were not found: Infants did not look longer at the happy or the angry facial expression, independent of the sound manipulation, suggesting that preferences for happy expressions found in prior studies may rest on attention to the "toothy" smile. (25 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
We investigated age differences in biased recognition of happy, neutral, or angry faces in 4 experiments. Experiment 1 revealed increased true and false recognition for happy faces in older adults, which persisted even when changing each face’s emotional expression from study to test in Experiment 2. In Experiment 3, we examined the influence of reduced memory capacity on the positivity-induced recognition bias, which showed the absence of emotion-induced memory enhancement but a preserved recognition bias for positive faces in patients with amnestic mild cognitive impairment compared with older adults with normal memory performance. In Experiment 4, we used semantic differentials to measure the connotations of happy and angry faces. Younger and older participants regarded happy faces as more familiar than angry faces, but the older group showed a larger recognition bias for happy faces. This finding indicates that older adults use a gist-based memory strategy based on a semantic association between positive emotion and familiarity. Moreover, older adults’ judgments of valence were more positive for both angry and happy faces, supporting the hypothesis of socioemotional selectivity. We propose that the positivity-induced recognition bias might be based on fluency, which in turn is based on both positivity-oriented emotional goals and on preexisting semantic associations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

4.
An extensive literature credits the right hemisphere with dominance for processing emotion. Conflicting literature finds left hemisphere dominance for positive emotions. This conflict may be resolved by attending to processing stage. A divided output (bimanual) reaction time paradigm in which response hand was varied for emotion (angry; happy) in Experiments 1 and 2 and for gender (male; female) in Experiment 3 focused on response to emotion rather than perception. In Experiments 1 and 2, reaction time was shorter when right-hand responses indicated a happy face and left-hand responses an angry face, as compared to reversed assignment. This dissociation did not obtain with incidental emotion (Experiment 3). Results support the view that response preparation to positive emotional stimuli is left lateralized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Event-related potentials were used to examine the recognition of happy and angry faces by 4- to 6-year-old children. In 2 experiments, Ss viewed 100-ms presentations of a happy face and an angry face posed by a single model. The frequency with which these expressions were presented varied across experiments, and which face served as the target or nontarget stimulus varied within experiments. In Experiment 1, an early negative component (N400) was observed that distinguished between the 2 expressions, and a 2nd, later positive component (P700) was observed that distinguished between target and nontarget events. In Experiment 2, these components were again observed, although both now distinguished only between low- and high-probability events. Both were absent at posterior scalp, were most prominent at parietal and central scalp, and were minimal at frontal scalp. These results are discussed in the context of children's allocation of attentional and memory resources for briefly presented affective stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
We investigated the ability of 7-month-olds to categorize the facial expressions happy, fear, and surprise when these expressions varied both by the model depicting the expression and by how intensely the expression was portrayed in a series of three experiments. In Experiment 1, infants successfully discriminated a single model posing a mild versus an extreme version of happy and fear. In Experiment 2, infants categorized happy when depicted by four different models posing mild and extreme versions and discriminated happy from fear. In Experiment 3, infants categorized both happy and surprise posed by five models varying in degree of expressiveness and discriminated these expressions from fear. In both Experiments 2 and 3, there was no evidence that infants could also (a) categorize the fear expressions and discriminate fear from happy or from surprise or (b) discriminate surprise from happy after habituating to surprise. These results are discussed in the context of the importance of experience in recognizing facial expressions and of how such experience influences the ease with which various expressions can be encoded and discriminated from other expressions in the laboratory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Converging data suggest that human facial behavior has an evolutionary basis. Combining these data with M. E. Seligman's (1970) preparedness theory, it was predicted that facial expressions of anger should be more readily associated with aversive events than should expressions of happiness. Two experiments involving differential electrodermal conditioning to pictures of faces, with electric shock as the unconditioned stimulus, were performed. In the 1st experiment, 32 undergraduates were exposed to 2 pictures of the same person, 1 with an angry and 1 with a happy expression. For half of the Ss, the shock followed the angry face, and for the other half, it followed the happy face. In the 2nd experiment, 3 groups of 48 undergraduates differentiated between pictures of male and female faces, both showing angry, neutral, and happy expressions. Responses to angry CSs showed significant resistance to extinction in both experiments, with a larger effect in Exp II. Responses to happy or neutral CSs, on the other hand, extinguished immediately when the shock was withheld. Results are related to conditioning to phobic stimuli and to the preparedness theory. (22 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The holistic hypothesis in face processing was tested in 3 experiments. Holistic processing was conceptualized as interactive influence of facial features on the perceptual representation of faces. In Experiment 1, 3 facial features (eye distance, width of nose, size of mouth) were varied on 3 values per feature. Photographs and blurred versions were used. Participants assigned each stimulus face to 1 of 2 target faces according to similarity. The data were evaluated by the logit model that provides a direct test of interactive influence of the features on participants' performance. The interactive-processing hypothesis was not confirmed. The results were replicated in Experiment 2, in which 2 features with 5 values each were used and data of individual participants were evaluated, and in Experiment 3, in which a reduced presentation time of 250 ms was used. It is concluded that facial features are processed and represented independently.

9.
J. E. Cutting et al (see record 1993-00237-001) criticized the paradigm for inquiry and the fuzzy logical model of perception (FLMP) presented in D. W. Massaro (see record 1989-14292-001). In this reply to their remarks, it is shown that (1) the properties of the paradigm are ideal for inquiry; (2) models are best tested against the results of individual Ss and not average group data; (3) model fitting and ANOVA do not give contradictory results; (4) the FLMP can be proven false and does not have a superpower to predict a plethora of functions or to absorb random variability; and (5) various extraneous characteristics of a model, such as equation length, cannot account for the success of the FLMP. On the other hand, the empirical findings of Cutting et al give important new properties of pattern recognition. Finally, Cutting's theory of directed perception is compared with the FLMP. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
The anger-superiority hypothesis states that angry faces are detected more efficiently than friendly faces. Previous research used schematized stimuli, which minimizes perceptual confounds but violates ecological validity. The authors argue that a confounding of appearance and meaning is unavoidable and even unproblematic if real faces are presented. Four experiments tested carefully controlled photos in a search-asymmetry design. Experiments 1 and 2 revealed more efficient detection of an angry face among happy faces than vice versa. Experiment 3 indicated that the advantage was due to the mouth, but not to the eyes, and Experiment 4, using upright and inverted thatcherized faces, suggests a perceptual basis. The results are in line with a sensory-bias hypothesis that facial expressions evolved to exploit extant capabilities of the visual system. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

12.
A new model of mental representation is applied to social cognition: the attractor field model. Using the model, the authors predicted and found a perceptual advantage but a memory disadvantage for faces displaying evaluatively congruent expressions. In Experiment 1, participants completed a same/different perceptual discrimination task involving morphed pairs of angry-to-happy Black and White faces. Pairs of faces displaying evaluatively incongruent expressions (i.e., happy Black, angry White) were more likely to be labeled as similar and were less likely to be accurately discriminated from one another than faces displaying evaluatively congruent expressions (i.e., angry Black, happy White). Experiment 2 replicated this finding and showed that objective discriminability of stimuli moderated the impact of attractor field effects on perceptual discrimination accuracy. In Experiment 3, participants completed a recognition task for angry and happy Black and White faces. Consistent with the attractor field model, memory accuracy was better for faces displaying evaluatively incongruent expressions. Theoretical and practical implications of these findings are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
The authors examined face perception models with regard to the functional and temporal organization of facial identity and expression analysis. Participants performed a manual 2-choice go/no-go task to classify faces, where response hand depended on facial familiarity (famous vs. unfamiliar) and response execution depended on facial expression (happy vs. angry). Behavioral and electrophysiological markers of information processing—in particular, the lateralized readiness potential (LRP)—were recorded to assess the time course of facial identity and expression processing. The duration of facial identity and expression processes was manipulated in separate experiments, which allowed testing the differential predictions of alternative face perception models. Together, the reaction time and LRP findings indicate a parallel architecture of facial identity and expression analysis in which the analysis of facial expression relies on information about identity. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
General action and inaction concepts have been shown to produce broad, goal-mediated effects on cognitive and motor activity irrespective of the type of activity. The current research tested a model in which action and inaction goals interact with the valence of incidental moods to guide behavior. Over four experiments, participants' moods were manipulated to be positive (happy), neutral, or negative (angry or sad), and then general action, inaction, and neutral concepts were primed. In Experiment 1, action primes increased intellectual performance when participants experienced a positive (happy) or neutral mood, whereas inaction primes increased performance when participants experienced a negative (angry) mood. Including a control-prime condition, Experiments 2 and 3 replicated these results measuring the number of general interest articles participants were willing to read and participants' memory for pictures of celebrities. Experiment 4 replicated the results comparing happiness with sadness and suggested that the effect of the prime's adoption was automatic. Overall, the findings supported an interactive model by which action concepts and positive affect produce the same increases in active behavior as inaction concepts and negative affect. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

16.
The perception of visual aftereffects has long been recognized, and these aftereffects reveal a relationship between perceptual categories. Thus, emotional expression aftereffects can be used to map the categorical relationships among emotion percepts. One might expect a symmetric relationship among categories, but an evolutionary, functional perspective predicts an asymmetrical relationship. In a series of 7 experiments, the authors tested these predictions. Participants fixated on a facial expression, then briefly viewed a neutral expression, then reported the apparent facial expression of the 2nd image. Experiment 1 revealed that happy and sad are opposites of one another; each evokes the other as an aftereffect. The 2nd and 3rd experiments reveal that fixating on any negative emotion yields an aftereffect perceived as happy, whereas fixating on a happy face results in the perception of a sad aftereffect. This suggests an asymmetric relationship among categories. Experiments 4-7 explored the mechanism driving this effect. The evolutionary and functional explanations for the category asymmetry are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Many research reports have concluded that emotional information can be processed without observers being aware of it. The case for perception without awareness has almost always been made with the use of facial expressions. In view of the similarities between facial and bodily expressions for rapid perception and communication of emotional signals, we conjectured that perception of bodily expressions may also not necessarily require visual awareness. Our study investigates the role of visual awareness in the perception of bodily expressions using a backward masking technique in combination with confidence ratings on a trial-by-trial basis. Participants had to detect in three separate experiments masked fearful, angry and happy bodily expressions among masked neutral bodily actions as distractors, and subsequently the participants had to indicate their confidence. The onset between target and mask (Stimulus Onset Asynchrony, SOA) varied from −50 to +133 ms. Sensitivity measurements (d-prime) as well as the confidence of the participants showed that the bodies could be detected reliably in all SOA conditions. In an important finding, a lack of covariance was observed between the objective and subjective measurements when the participants had to detect fearful bodily expressions, yet this was not the case when participants had to detect happy or angry bodily expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

18.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together these results seem to suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Two experiments investigated the effects of sadness, anger, and happiness on 4- to 6-year-old children's memory and suggestibility concerning story events. In Experiment 1, children were presented with 3 interactive stories on a video monitor. The stories included protagonists who wanted to give the child a prize. After each story, the child completed a task to try to win the prize. The outcome of the child's effort was manipulated in order to elicit sadness, anger, or happiness. Children's emotions did not affect story recall, but children were more vulnerable to misleading questions about the stories when sad than when angry or happy. In Experiment 2, a story was presented and emotions were elicited using an autobiographical recall task. Children responded to misleading questions and then recalled the story for a different interviewer. Again, children's emotions did not affect the amount of story information recalled correctly, but sad children incorporated more information from misleading questions during recall than did angry or happy children. Sad children's greater suggestibility is discussed in terms of the differing problem-solving strategies associated with discrete emotions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
It has been suggested that despite explicit recognition difficulties, implicit processing of facial expressions may be preserved in older adulthood. To directly test this possibility, the authors used facial electromyography to assess older (N = 40) and young (N = 46) adults’ mimicry responses to angry and happy facial expressions, which were presented subliminally via a backward masking technique. The results indicated that despite not consciously perceiving the facial emotion stimuli, both groups mimicked the angry and happy facial expressions. Implications for emotion recognition difficulties in late adulthood are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号