Similar Articles
 Found 20 similar articles (search time: 921 ms)
1.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

2.
Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and mirror-system-based perspectives on imitation is discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Two studies examined the general prediction that one's emotional expression should facilitate memory for material that matches the expression. The authors focused on specific facial expressions of surprise. In the first study, participants who were mimicking a surprised expression showed better recall for surprising words and worse recall for neutral words, relative to those who were mimicking a neutral expression. Study 2 replicated the results of Study 1, showing that participants who mimicked a surprised expression recalled more words spoken in a surprising manner compared with words that sounded neutral or sad. Conversely, participants who mimicked sad facial expressions showed greater recall for sad than neutral or surprising words. The results provide evidence of the importance of matching the emotional valence of the recall content to the facial expression of the recaller during the memorization period. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Although it was proposed over a century ago that feedback from facial expressions influences emotional experience, tests of this hypothesis have been equivocal. Here we directly tested this facial feedback hypothesis (FFH) by comparing the impact on self-reported emotional experience of BOTOX injections (which paralyze muscles of facial expression) and a control Restylane injection (a cosmetic filler that does not affect facial muscles). When examined alone, BOTOX participants showed no pre- to posttreatment changes in emotional responses to our most positive and negative video clips. Between-groups comparisons, however, showed that relative to controls, BOTOX participants exhibited an overall significant decrease in the strength of emotional experience. This result was attributable to (a) a pre- versus postdecrease in responses to mildly positive clips in the BOTOX group and (b) an unexpected increase in responses to negative clips in the Restylane control group. These data suggest that feedback from facial expressions is not necessary for emotional experience, but may influence emotional experience in some circumstances. These findings point to specific directions for future work clarifying the expression-experience relationship. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual-task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders that affect both working memory and emotion perception are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

7.
Reports an error in "Facial expressions of emotion influence memory for facial identity in an automatic way" by Arnaud D'Argembeau and Martial Van der Linden (Emotion, 2007[Aug], Vol 7[3], 507-515). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum. (The following abstract of the original article appeared in record 2007-11660-005.) Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
[Correction Notice: An erratum for this article was reported in Vol 7(4) of Emotion (see record 2007-17748-022). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum.] Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
The appraisal process consists of the subjective evaluation that occurs during an individual's encounter with significant events in the environment, determining the nature of the emotional reaction and experience. Placed in the context of appraisal theories of emotion-elicitation and differentiation, the aim of the present research was to test empirically the hypothesis that the intrinsic pleasantness evaluation occurs before the goal conduciveness evaluation. In two studies, intrinsically pleasant and unpleasant images were used to manipulate pleasantness, and a specific event in a Pacman-type videogame was used to manipulate goal conduciveness. Facial EMG was used to measure facial reactions to each evaluation. As predicted, facial reactions to the intrinsic pleasantness manipulation were faster than facial reactions to the goal conduciveness manipulation. These results provide good empirical support for the sequential nature of the appraisal process. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
J. B. Halberstadt and P. M. Niedenthal (2001) reported that explanations of target individuals' emotional states biased memory for their facial expressions in the direction of the explanation. The researchers argued for, but did not test, a 2-stage model of the explanation effect, such that verbal explanation increases attention to facial features at the expense of higher level featural configuration, making the faces vulnerable to conceptual reintegration in terms of available emotion categories. The current 4 experiments provided convergent evidence for the "featural shift" hypothesis by examining memory for both faces and facial features following verbal explanation. Featural attention was evidenced by verbalizers' better memory for features relative to control participants and reintegration by a weaker explanation bias for features and configurally altered faces than for whole, unaltered faces. The results have implications for emotion, attribution, language, and the interaction of implicit and explicit processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
In a test of the effects of cortisol on emotional memory, 90 men were orally administered placebo or 20 or 40 mg cortisol and presented with emotionally arousing and neutral stimuli. On memory tests administered within 1 hr of stimulus presentation, cortisol elevations caused a reduction in the number of errors committed on free-recall tasks. Two evenings later, when cortisol levels were no longer manipulated, inverted-U quadratic trends were found for recognition memory tasks, reflecting memory facilitation in the 20-mg group for both negative and neutral information. Results suggest that the effects of cortisol on memory do not differ substantially for emotional and neutral information. The study provides evidence of beneficial effects of acute cortisol elevations on explicit memory in humans. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
This study examined slow wave (SW) event-related brain potential (ERP) amplitudes in response to happy, neutral, and sad faces during a working memory task to further identify the associated component processes and physiological changes of mood-congruent memory biases in individuals with and without major depression. The results suggest that individuals with and without a diagnosis of major depressive disorder (MDD) differentially maintain valenced facial information in their working memory. Specifically, the nondepressed individuals displayed a marked reduction in SW amplitude to the negative faces. Individuals with MDD exhibited equivalent SW amplitudes for positive and negative facial stimuli. Results are discussed in terms of avoidance coping, previous ERP studies of working memory, and facial recognition deficits in individuals with MDD. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
14.
Previous studies have shown that visual attention can be captured by stimuli matching the contents of working memory (WM). Here, the authors assessed the nature of the representation that mediates the guidance of visual attention from WM. Observers were presented with either verbal or visual primes (to hold in memory, Experiment 1; to verbalize, Experiment 2; or merely to attend, Experiment 3) and subsequently were required to search for a target among different distractors, each embedded within a colored shape. In half of the trials, an object in the search array matched the prime, but this object never contained the target. Despite this, search was impaired relative to a neutral baseline in which the prime and search displays did not match. An interesting finding is that verbal primes were effective in generating the effects, and verbalization of visual primes elicited similar effects to those elicited when primes were held in WM. However, the effects were absent when primes were only attended. The data suggest that there is automatic encoding into WM when items are verbalized and that verbal as well as visual WM can guide visual attention. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Prior studies provide consistent evidence of deficits for psychopaths in processing verbal emotional material but are inconsistent regarding nonverbal emotional material. To examine whether psychopaths exhibit general versus specific deficits in nonverbal emotional processing, 34 psychopaths and 33 nonpsychopaths identified with Hare's (R. D. Hare, 1991) Psychopathy Checklist-Revised were asked to complete a facial affect recognition test. Slides of prototypic facial expressions were presented. Three hypotheses regarding hemispheric lateralization anomalies in psychopaths were also tested (right-hemisphere dysfunction, reduced lateralization, and reversed lateralization). Psychopaths were less accurate than nonpsychopaths at classifying facial affect under conditions promoting reliance on right-hemisphere resources and displayed a specific deficit in classifying disgust. These findings demonstrate that psychopaths exhibit specific deficits in nonverbal emotional processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
The authors tested source memory across three conditions, one in which 3 strongly associated primes of a target word were presented in the same source as the target, one in which primes were presented in a different source than the target, and one in which no associates of targets were encoded. In the first 2 experiments, target source memory increased in the same-prime condition and decreased in the different-prime condition relative to the no-prime condition. In Experiment 3, the different-prime condition created the illusion that target words had been presented in both sources at encoding. The MINERVA 2 model (D. L. Hintzman, 1988) was able to predict these effects by basing source decisions on the global match of source-specific retrieval probes to all of the items in the memory set. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Results from 5 experiments provide converging evidence that automatic evaluation of faces in sequential priming paradigms reflects affective responses to phenotypic features per se rather than evaluation of the racial categories to which the faces belong. Experiment 1 demonstrates that African American facial primes with racially prototypic physical features facilitate more automatic negative evaluations than do other Black faces that are unambiguously categorizable as African American but have less prototypic features. Experiments 2, 3, and 4 further support the hypothesis that these differences reflect direct affective responses to physical features rather than differential categorization. Experiment 5 shows that automatic responses to facial primes correlate with cue-based but not category-based explicit measures of prejudice. Overall, these results suggest the existence of 2 distinct types of prejudice. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
We investigated the effect of subliminally presented happy or angry faces on evaluative judgments when the facial muscles of participants were free to mimic or blocked. We hypothesized and showed that subliminally presented happy expressions lead to more positive judgments of cartoons compared to angry expressions only when facial muscles were not blocked. These results reveal the influence of socially driven embodied processes on affective judgments and also have potential implications for phenomena such as emotional contagion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

19.
Emotional-neutral pairs of visual scenes were presented peripherally (with their inner edges 5.2° away from fixation) as primes for 150 to 900 ms, followed by a centrally presented recognition probe scene, which was either identical in specific content to one of the primes or related in general content and affective valence. Results indicated that (a) if no foveal fixations on the primes were allowed, the false alarm rate for emotional probes was increased; (b) hit rate and sensitivity (A') were higher for emotional than for neutral probes only when a fixation was possible on only one prime; and (c) emotional scenes were more likely to attract the first fixation than neutral scenes. It is concluded that the specific content of emotional or neutral scenes is not processed in peripheral vision. Nevertheless, a coarse impression of emotional scenes may be extracted, which then leads to selective attentional orienting or, in the absence of overt attention, causes false alarms for related probes. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
The present study was designed to explore serial position and suffix effects in the short-term retention of nonverbal sounds. In contrast with previous studies of these effects, a probe recognition paradigm was used to minimize the possibility that participants would use a verbal labelling strategy. On each trial, participants heard a memory set consisting of three pure tones, followed 5 seconds later by a probe tone. Participants were required to indicate whether or not the probe tone had been a member of the memory set. On most trials, a suffix sound was presented 1 second following the third sound in the memory set. Results revealed that tones presented in the first and last positions of the memory set were recognized more accurately than were tones presented in the middle position. Furthermore, recognition of sounds presented in the last position was compromised when the memory set was followed by a postlist suffix of similar pitch, spectral composition, and spatial location. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号