Similar Literature
20 similar records found (search time: 375 ms)
1.
Traditional models of face processing posit independent pathways for the processing of facial identity and facial expression (e.g., Bruce & Young, 1986). However, such models have been questioned by recent reports that suggest positive expressions may facilitate recognition (e.g., Baudouin et al., 2000), although little attention has been paid to the role of negative expressions. The current study used eye movement indicators to examine the influence of emotional expression (angry, happy, neutral) on the recognition of famous and novel faces. In line with previous research, the authors found some evidence that only happy expressions facilitate the processing of famous faces. However, the processing of novel faces was enhanced by the presence of an angry expression. Contrary to previous findings, this paper suggests that angry expressions also have an important role in the recognition process, and that the influence of emotional expression is modulated by face familiarity. The implications of this finding are discussed in relation to (1) current models of face processing, and (2) theories of oculomotor control in the viewing of facial stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Visual short-term memory (VSTM) is limited, especially for complex objects. Its capacity, however, is greater for faces than for other objects; this advantage may stem from the holistic nature of face processing. If holistic processing explains this advantage, object expertise--which also relies on holistic processing--should endow experts with a VSTM advantage. The authors compared VSTM for cars among car experts and car novices. Car experts, but not car novices, demonstrated a VSTM advantage similar to that for faces; this advantage was orientation specific and was correlated with an individual's level of car expertise. Control experiments ruled out accounts based solely on verbal- or long-term memory representations. These findings suggest that the processing advantages afforded by visual expertise result in domain-specific increases in VSTM capacity, perhaps by allowing experts to maximize the use of an inherently limited VSTM system. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Reports an error in "Facial expressions of emotion influence memory for facial identity in an automatic way" by Arnaud D'Argembeau and Martial Van der Linden (Emotion, 2007[Aug], Vol 7[3], 507-515). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum. (The following abstract of the original article appeared in record 2007-11660-005.) Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
[Correction Notice: An erratum for this article was reported in Vol 7(4) of Emotion (see record 2007-17748-022). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum.] Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
The authors examined face perception models with regard to the functional and temporal organization of facial identity and expression analysis. Participants performed a manual 2-choice go/no-go task to classify faces, where response hand depended on facial familiarity (famous vs. unfamiliar) and response execution depended on facial expression (happy vs. angry). Behavioral and electrophysiological markers of information processing—in particular, the lateralized readiness potential (LRP)—were recorded to assess the time course of facial identity and expression processing. The duration of facial identity and expression processes was manipulated in separate experiments, which allowed testing the differential predictions of alternative face perception models. Together, the reaction time and LRP findings indicate a parallel architecture of facial identity and expression analysis in which the analysis of facial expression relies on information about identity. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
We investigated age differences in biased recognition of happy, neutral, or angry faces in 4 experiments. Experiment 1 revealed increased true and false recognition for happy faces in older adults, which persisted even when changing each face’s emotional expression from study to test in Experiment 2. In Experiment 3, we examined the influence of reduced memory capacity on the positivity-induced recognition bias, which showed the absence of emotion-induced memory enhancement but a preserved recognition bias for positive faces in patients with amnestic mild cognitive impairment compared with older adults with normal memory performance. In Experiment 4, we used semantic differentials to measure the connotations of happy and angry faces. Younger and older participants regarded happy faces as more familiar than angry faces, but the older group showed a larger recognition bias for happy faces. This finding indicates that older adults use a gist-based memory strategy based on a semantic association between positive emotion and familiarity. Moreover, older adults’ judgments of valence were more positive for both angry and happy faces, supporting the hypothesis of socioemotional selectivity. We propose that the positivity-induced recognition bias might be based on fluency, which in turn is based on both positivity-oriented emotional goals and on preexisting semantic associations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The authors examined the organization of visual short-term memory (VSTM). Using a change-detection task, they reported that VSTM stores relational information between individual items. This relational processing is mediated by the organization of items into spatial configurations. The spatial configuration of visual objects is important for VSTM of spatial locations, colors, and shapes. When color VSTM is compared with location VSTM, spatial configuration plays an integral role because configuration is important for color VSTM, whereas color is not important for location VSTM. The authors also examined the role of attention and found that the formation of configuration is modulated by both top-down and bottom-up attentional factors. In summary, the authors proposed that VSTM stores the relational information of individual visual items on the basis of global spatial configuration. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented either with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants’ categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

10.
Visual short-term memory (VSTM) has received intensive study over the past decade, with research focused on VSTM capacity and representational format. Yet, the function of VSTM in human cognition is not well understood. Here, the authors demonstrate that VSTM plays an important role in the control of saccadic eye movements. Intelligent human behavior depends on directing the eyes to goal-relevant objects in the world, yet saccades are very often inaccurate and require correction. The authors hypothesized that VSTM is used to remember the features of the current saccade target so that it can be rapidly reacquired after an errant saccade, a task faced by the visual system thousands of times each day. In 4 experiments, memory-based gaze correction was accurate, fast, automatic, and largely unconscious. In addition, a concurrent VSTM load interfered with memory-based gaze correction, but a verbal short-term memory load did not. These findings demonstrate that VSTM plays a direct role in a fundamentally important aspect of visually guided behavior, and they suggest the existence of previously unknown links between VSTM representations and the oculomotor system. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
12.
Controversy surrounding dissociative identity disorder (DID) has focused on conflicting findings regarding the validity and nature of interidentity amnesia, illustrating the need for objective methods of examining amnesia that can discriminate between explicit and implicit memory transfer. In the present study, the authors used a cross-modal manipulation designed to mitigate implicit memory effects. Explicit memory transfer between identities was examined in 7 DID participants and 34 matched control participants. After words were presented to one identity auditorily, the authors tested another identity for memory of those words in the visual modality using an exclusion paradigm. Despite self-reported interidentity amnesia, memory for experimental stimuli transferred between identities. DID patients showed no superior ability to compartmentalize information, as would be expected with interidentity amnesia. The cross-modal nature of the test makes it unlikely that memory transfer was implicit. These findings demonstrate that subjective reports of interidentity amnesia are not necessarily corroborated by objective tests of explicit memory transfer. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as “threat-related,” because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
This study examined the impact of perceptual load on the processing of unattended threat-relevant faces. Participants performed a central letter-classification task while ignoring irrelevant face distractors, which appeared above or below the central task. The face distractors were graded for affective salience by means of aversive fear conditioning, with a conditioned angry face (CS+), an unconditioned angry face (CS−), and a neutral control face. The letter-classification task was presented under conditions of both low and high perceptual load. Results showed that fear conditioned (i.e., CS+) angry face distractors interfered with task performance more than CS− angry or neutral face distractors but that this interference was completely eliminated by high perceptual load. These findings demonstrate that aversively conditioned face distractors capture attention only under conditions of low perceptual load. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Decision making is influenced by social cues, but there is little understanding of how social information interacts with other cues that determine decisions. To address this quantitatively, participants were asked to learn which of two faces was associated with a higher probability of reward. They were repeatedly presented with two faces, each with a different, unknown probability of reward, and participants attempted to maximize gains by selecting the face that was most often rewarded. Both faces had the same identity, but one face had a happy expression and the other had either an angry or a sad expression. Ideal observer models predict that the facial expressions should not affect the decision-making process. Our results, however, showed that participants had a prior disposition to select the happy face when it was paired with the angry but not the sad face and overweighted the positive outcomes associated with happy faces and underweighted positive outcomes associated with either angry or sad faces. Nevertheless, participants also integrated the feedback information. As such, their decisions were a composite of social and utilitarian factors. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
This study investigated whether the White racial identity statuses proposed by J. E. Helms (1984, 1990, 1995) could explain individual differences in how racial stereotypes influence memory for race-related information as measured by memory sensitivity and response bias on a recognition memory task. Participants were 197 White undergraduate and graduate students who read 3 stimulus paragraphs embedded with Black and White stereotypical items. The race of the target character in the stimulus was randomly reported to be Black or White. After a 1-week interval, participants completed a measure of recognition memory, as well as a measure of White racial identity attitudes. Results offer support for the hypothesis that the White racial identity statuses influence how racial stereotypes affect information processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
In visual search tasks, if a set of items is presented for 1 s before another set of new items (containing the target) is added, search can be restricted to the new set. The process that eliminates old items from search is visual marking. This study investigates the kind of memory that distinguishes the old items from the new items during search. Using an accuracy paradigm in which perfect marking results in 100% accuracy and lack of marking results in near chance performance, the authors show that search can be restricted to new items not by visual short-term memory (VSTM) of old locations but by a limited capacity and slow-decaying VSTM of new locations and a high capacity and fast-decaying memory for asynchrony. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Event-related potentials were used to examine the recognition of happy and angry faces by 4- to 6-year-old children. In 2 experiments, Ss viewed 100-ms presentations of a happy face and an angry face posed by a single model. The frequency with which these expressions were presented varied across experiments, and which face served as the target or nontarget stimulus varied within experiments. In Experiment 1, an early negative component (N400) was observed that distinguished between the 2 expressions, and a 2nd, later positive component (P700) was observed that distinguished between target and nontarget events. In Experiment 2, these components were again observed, although both now distinguished only between low- and high-probability events. Both were absent at posterior scalp, were most prominent at parietal and central scalp, and were minimal at frontal scalp. These results are discussed in the context of children's allocation of attentional and memory resources for briefly presented affective stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Neuroimaging data suggest that emotional information, especially threatening faces, automatically captures attention and receives rapid processing. While this is consistent with the majority of behavioral data, behavioral studies of the attentional blink (AB) additionally reveal that aversive emotional first target (T1) stimuli are associated with prolonged attentional engagement or “dwell” time. One explanation for this difference is that few AB studies have utilized manipulations of facial emotion as the T1. To address this, schematic faces varying in expression (neutral, angry, happy) served as the T1 in the current research. Results revealed that the blink associated with an angry T1 face was, primarily, of greater magnitude than that associated with either a neutral or happy T1 face, and also that initial recovery from this processing bias was faster following angry, compared with happy, T1 faces. The current data therefore provide important information regarding the time-course of attentional capture by angry faces: Angry faces are associated with both the rapid capture and rapid release of attention. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

20.
In 6 experiments, the authors investigated whether attention orienting by gaze direction is modulated by the emotional expression (neutral, happy, angry, or fearful) on the face. The results showed a clear spatial cuing effect by gaze direction but no effect by facial expression. In addition, it was shown that the cuing effect was stronger with schematic faces than with real faces, that gaze cuing could be achieved at very short stimulus onset asynchronies (14 ms), and that there was no evidence for a difference in the strength of cuing triggered by static gaze cues and by cues involving apparent motion of the pupils. In sum, the results suggest that in normal, healthy adults, eye direction processing for attention shifts is independent of facial expression analysis. (PsycINFO Database Record (c) 2010 APA, all rights reserved)


Copyright©北京勤云科技发展有限公司  京ICP备09084417号