Similar Articles
1.
Accuracy in the incidental recall of photographed human faces can be predicted from the S's cognitive styles and biases: (a) Ss who were field dependent on an embedded-figures test recalled more faces correctly than did field-independent Ss; (b) Ss who were narrow categorizers on the Pettigrew Category-Width Scale had better recall than broad categorizers; and (c) Ss who thought the photographed persons were relatively young did better than those who thought they were older. These 3 kinds of stylistic consistency were mutually independent. Some of these styles may determine memory for all sorts of stimuli and some may be relatively specific to memory for faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Event-related potentials were used to determine whether infants, like adults, show differences in spatial and temporal characteristics of brain activation during face and object recognition. Three aspects of visual processing were identified: (a) differentiation of face vs. object (the P400 at occipital electrodes had a shorter latency for faces), (b) recognition of familiar identity (the Nc, or negative component, at frontotemporal electrodes [FTEs] was of larger amplitude for familiar stimuli), and (c) encoding novelty (the slow wave at FTEs was larger for unfamiliar stimuli). The topography of the Nc was influenced by category type: Effects of familiarity were limited to the midline and right anterior temporal electrodes for faces but extended to all temporal electrodes for objects. Results show that infants' experience with specific examples within categories and their general category knowledge influence the neural correlates of visual processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Visual search has been studied extensively, yet little is known about how its constituent processes affect subsequent emotional evaluation of searched-for and searched-through items. In 3 experiments, the authors asked observers to locate a colored pattern or tinted face in an array of other patterns or faces. Shortly thereafter, either the target or a distractor was rated on an emotional scale (patterns, cheerfulness; faces, trustworthiness). In general, distractors were rated more negatively than targets. Moreover, distractors presented near the target during search were rated significantly more negatively than those presented far from the target. Target-distractor proximity affected distractor ratings following both simple-feature and difficult-conjunction search, even when items appeared at different locations during evaluation than during search and when faces previously tinted during search were presented in grayscale at evaluation. An attentional inhibition account is offered to explain these effects of attention on emotional evaluation. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
Visual short-term memory (VSTM) is limited, especially for complex objects. Its capacity, however, is greater for faces than for other objects; this advantage may stem from the holistic nature of face processing. If holistic processing explains this advantage, object expertise--which also relies on holistic processing--should endow experts with a VSTM advantage. The authors compared VSTM for cars among car experts and car novices. Car experts, but not car novices, demonstrated a VSTM advantage similar to that for faces; this advantage was orientation specific and was correlated with an individual's level of car expertise. Control experiments ruled out accounts based solely on verbal- or long-term memory representations. These findings suggest that the processing advantages afforded by visual expertise result in domain-specific increases in VSTM capacity, perhaps by allowing experts to maximize the use of an inherently limited VSTM system. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Previous binocular rivalry studies with younger adults have shown that emotional stimuli dominate perception over neutral stimuli. Here we investigated the effects of age on patterns of emotional dominance during binocular rivalry. Participants performed a face/house rivalry task where the emotion of the face (happy, angry, neutral) and orientation (upright, inverted) of the face and house stimuli were varied systematically. Age differences were found with younger adults showing a general emotionality effect (happy and angry faces were more dominant than neutral faces) and older adults showing inhibition of anger (neutral faces were more dominant than angry faces) and positivity effects (happy faces were more dominant than both angry and neutral faces). Age differences in dominance patterns were reflected by slower rivalry rates for both happy and angry compared to neutral face/house pairs in younger adults, and slower rivalry rates for happy compared to both angry and neutral face/house pairs in older adults. Importantly, these patterns of emotional dominance and slower rivalry rates for emotional-face/house pairs disappeared when the stimuli were inverted. This suggests that emotional valence, and not low-level image features, was responsible for the emotional bias in both age groups. Given that binocular rivalry has a limited role for voluntary control, the findings imply that anger suppression and positivity effects in older adults may extend to more automatic tasks. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

6.
Seven experiments investigated the finding that threatening schematic faces are detected more quickly than nonthreatening faces. Threatening faces with v-shaped eyebrows (angry and scheming expressions) were detected more quickly than nonthreatening faces with A-shaped eyebrows (happy and sad expressions). In contrast to the hypothesis that these effects were due to perceptual features unrelated to the face, no advantage was found for v-shaped eyebrows presented in a nonfacelike object. Furthermore, the addition of internal facial features (the eyes, or the nose and mouth) was necessary to produce the detection advantage for faces with v-shaped eyebrows. Overall, the results are interpreted as showing that the v-shaped eyebrow configuration affords easy detection, but only when other internal facial features are present. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
Adult-like attentional biases toward fearful faces can be observed in 7-month-old infants. It is possible, however, that infants merely allocate attention to simple features such as enlarged fearful eyes. In the present study, 7-month-old infants (n = 15) were first shown individual emotional faces to determine their visual scanning patterns of the expressions. Second, an overlap task was used to examine the latency of attention disengagement from centrally presented faces. In both tasks, the stimuli were fearful, happy, and neutral facial expressions, and a neutral face with fearful eyes. Eye-tracking data from the first task showed that infants scanned the eyes more than other regions of the face; however, there were no differences in scanning patterns across expressions. In the overlap task, infants were slower in disengaging attention from fearful faces as compared to happy faces, neutral faces, and neutral faces with fearful eyes. Together, these results provide evidence that threat-related stimuli tend to hold attention preferentially in 7-month-old infants and that the effect does not reflect a simple response to differentially salient eyes in fearful faces. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Magnetoencephalography was used to examine the effect of individual differences on the temporal dynamics of emotional face processing by grouping subjects based on their ability to detect masked valence-laden stimuli. Receiver operating characteristic curves and a nonparametric sensitivity measure were used to categorize subjects into those that could and could not reliably detect briefly presented fearful faces that were backward-masked by neutral faces. Results showed that, in a cluster of face-responsive sensors, the strength of the M170 response was modulated by valence only when subjects could reliably detect the masked fearful faces. Source localization of the M170 peak using synthetic aperture magnetometry identified sources in face processing areas such as right middle occipital gyrus and left fusiform gyrus that showed the valence effect for those target durations at which subjects were sensitive to the fearful stimulus. Subjects who were better able to detect fearful faces also showed higher trait anxiety levels. These results suggest that individual differences between subjects, such as trait anxiety levels and sensitivity to fearful stimuli, may be an important factor to consider when studying emotion processing. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
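The abstract above splits subjects into reliable and unreliable detectors of the masked fearful faces using a nonparametric sensitivity measure, but does not name the statistic. The sketch below only illustrates one common choice, A′ computed from hit and false-alarm rates; the rates and the cutoff are made up for illustration and are not taken from the study.

```python
# Illustrative only: A' (a common nonparametric sensitivity index) computed
# from hit and false-alarm rates, then used to split subjects into groups.
# The actual measure, data, and cutoff in the study above may differ.

def a_prime(hit_rate: float, fa_rate: float) -> float:
    """A' ranges from ~0.5 (chance detection) to 1.0 (perfect detection)."""
    h, f = hit_rate, fa_rate
    if h >= f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))

# Hypothetical (hit rate, false-alarm rate) pairs for three subjects.
subjects = {"S1": (0.82, 0.20), "S2": (0.55, 0.50), "S3": (0.90, 0.15)}
CUTOFF = 0.60  # assumed criterion for "reliable" detection, not from the paper
groups = {s: ("detector" if a_prime(h, f) > CUTOFF else "non-detector")
          for s, (h, f) in subjects.items()}
print(groups)  # e.g. {'S1': 'detector', 'S2': 'non-detector', 'S3': 'detector'}
```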

9.
The purpose of this investigation was to determine if the relations among the primitives used in face identification and in basic-level object recognition are represented using coordinate or categorical relations. In 2 experiments the authors used photographs of famous people's faces as stimuli in which each face had been altered to have either 1 of its eyes moved up from its normal position or both of its eyes moved up. Participants performed either a face identification task or a basic-level object recognition task with these stimuli. In the face identification task, 1-eye-moved faces were easier to recognize than 2-eyes-moved faces, whereas the basic-level object recognition task showed the opposite pattern of results. Results suggest that face identification involves a coordinate shape representation in which the precise locations of visual primitives are specified, whereas basic-level object recognition uses categorically coded relations. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Identification of other people's emotion from quickly presented stimuli, including facial expressions, is fundamental to many social processes, including rapid mimicry and empathy. This study examined extraction of valence from brief emotional expressions in adults with autism spectrum disorder (ASD), a condition characterized by impairments in understanding and sharing of emotions. Control participants were individuals with reading disability and typical individuals. Participants were shown images for durations in the range of microexpressions (15 ms and 30 ms), thus reducing the reliance on higher level cognitive skills. Participants detected whether (a) emotional faces were happy or angry, (b) neutral faces were male or female, and (c) neutral images were animals or objects. Individuals with ASD performed selectively worse on emotion extraction, with no group differences for gender or animal–object tasks. The emotion extraction deficit remains even when controlling for gender, verbal ability, and age and is not accounted for by speed-accuracy tradeoffs. The deficit in rapid emotional processing may contribute to ASD difficulties in mimicry, empathy, and related processes. The results highlight the role of rapid early emotion processing in adaptive social–emotional functioning. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
An attempt was made to predict from imaginative measures of the affiliation motive the frequency with which S selects human faces from similar but nonhuman figures in a perceptual task. The Ss were 93 male undergraduates who responded to pictures with imaginative stories scored for n Affiliation. A month later they were introduced to the perceptual task which required that they state which of 4 figures flashed on a screen was clearest, all stimuli being below the recognition threshold. On each trial 1 of the 4 stimuli was a face and the others were similar but affiliation-neutral. Ss high in n Affiliation recognized faces significantly more frequently than those low in n Affiliation. Thus, the predicted relationship between motivation and the perceptual selection of motive-relevant stimuli was supported. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
A central question in perception is how stimuli are selected for access to awareness. This study investigated the impact of emotional meaning on detection of faces using the attentional blink paradigm. Experiment 1 showed that fearful faces were detected more frequently than neutral faces, and Experiment 2 revealed preferential detection of fearful faces compared with happy faces. To rule out image artifacts as a cause for these results, Experiment 3 manipulated the emotional meaning of neutral faces through fear conditioning and showed a selective increase in detection of conditioned faces. These results extend previous reports of preferential detection of emotional words or schematic objects and suggest that fear conditioning can modulate detection of formerly neutral stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
The authors tested the role of individual items in recognition memory using a forced-choice paradigm with face stimuli. They used morphing procedures to construct distractor stimuli that were similar to two parent faces and then compared a studied morph against an unstudied morph that was similar to two studied parents. The similarity of the parent faces was carefully balanced so that the choosing rates for the studied and unstudied morphs were approximately equal. Despite being equally likely to choose the studied and the unstudied morph, participants were systematically more confident when choosing the studied morph. This result is incompatible with Gaussian signal detection theory, even with unequal variances for targets and distractors. The authors propose an extension of an extant sampling model, SimSample, which provides a qualitative and quantitative account of the confidence and recognition dissociation. The results suggest that observers make contact with individual items when making recognition judgments with faces and that the structure of the sampling and decision process naturally leads to this dissociation of confidence and recognition. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
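The incompatibility claim above follows from a property of Gaussian models that is easy to check by simulation: if memory strengths are Gaussian (even with unequal variances) and the two morphs are matched so that each is chosen about half the time, the strength difference is symmetric around zero, so average confidence cannot differ between studied and unstudied choices. The sketch below is a minimal illustration of that point under assumed parameter values, not a reconstruction of the authors' modeling.

```python
# Minimal Monte Carlo check of the unequal-variance Gaussian signal-detection
# account. All parameters are illustrative assumptions. With means matched so
# the studied morph is chosen on ~50% of trials, mean confidence (absolute
# strength difference) is the same whichever item is chosen, in contrast to
# the empirical dissociation reported in the abstract above.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
studied = rng.normal(loc=1.0, scale=1.25, size=n)    # higher variance for targets
unstudied = rng.normal(loc=1.0, scale=0.80, size=n)  # matched means -> ~equal choosing

diff = studied - unstudied
chose_studied = diff > 0
print("P(choose studied):           ", chose_studied.mean())
print("confidence | chose studied:  ", np.abs(diff[chose_studied]).mean())
print("confidence | chose unstudied:", np.abs(diff[~chose_studied]).mean())
```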

14.
The communicative abilities of infants, revealed by the still-face (SF) procedure, were examined in 2 studies comparing behavior toward people and "interactive" objects. In Exp 1, 32 3- and 6-mo-olds were presented with an object and a person (mother or female stranger). The SF effect was produced only by mothers and strangers. Positive affect clearly established person–object differentiation; infants smiled at people but rarely smiled at the object. In Exp 2, 12 3-mo-olds were presented with 4 stimuli: a female stranger and 3 objects with features varying in similarity to an abstract, smiling face. Again, infants reserved their smiles for the person. Positive affect appears to be a primary index of young infants' social–perceptual competence and person–object differentiation. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
The anger-superiority hypothesis states that angry faces are detected more efficiently than friendly faces. Previous research used schematized stimuli, which minimizes perceptual confounds, but violates ecological validity. The authors argue that a confounding of appearance and meaning is unavoidable and even unproblematic if real faces are presented. Four experiments tested carefully controlled photos in a search-asymmetry design. Experiments 1 and 2 revealed more efficient detection of an angry face among happy faces than vice versa. Experiment 3 indicated that the advantage was due to the mouth, but not to the eyes, and Experiment 4, using upright and inverted thatcherized faces, suggests a perceptual basis. The results are in line with a sensory-bias hypothesis that facial expressions evolved to exploit extant capabilities of the visual system. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Distinctiveness contributes strongly to the recognition and rejection of faces in memory tasks. In four experiments we examine the role played by local and relational information in the distinctiveness of upright and inverted faces. In all experiments subjects saw one of three versions of a face: original faces, which had been rated as average in distinctiveness in a previous study (Hancock, Burton, & Bruce, 1996), a more distinctive version in which local features had been changed (D-local), and a more distinctive version in which relational features had been changed (D-rel). An increase in distinctiveness was found for D-local and D-rel faces in Experiments 1 (complete faces) and 3 and 4 (face internals only) when the faces had to be rated in upright presentation, but the distinctiveness of the D-rel faces was reduced much more than that of the D-local versions when the ratings were given to the faces presented upside-down (Experiments 1 and 3). Recognition performance showed a similar pattern: presented upright, both D-local and D-rel revealed higher performance compared to the originals, but in upside-down presentation the D-local versions showed a much stronger distinctiveness advantage. When only internal features of faces were used (Experiments 3 and 4), the D-rel faces lost their advantage over the Original versions in inverted presentation. The results suggest that at least two dimensions of facial information contribute to a face's apparent distinctiveness, but that these sources of information are differentially affected by turning the face upside-down. These findings are in accordance with a face processing model in which face inversion effects occur because a specific type of information processing is disrupted, rather than because of a general disruption of performance.

17.
This research examined the relationship between facial immaturity and the perception of youthfulness, helplessness, and cuteness. In the first study, college students rated 16 faces for youthfulness. Faces varied along four dimensions (eye position, eye size, nose length, and shape of chin) representing either a mature or immature feature. College students rated faces with immature features as more youthful than those without those features. In the second study, three groups of children (5 to 8, 9 to 12, and 13 to 16 years old) rated the same 16 faces with respect to cuteness, helplessness, and youthfulness. Children were similar with respect to their attention to immature features when evaluating faces for youthful qualities, although older children were more sensitive to eye position than younger children when rating faces for youthfulness and helplessness. Older children were more consistent in their attention to immature features when rating faces.

18.
The “face in the crowd effect” refers to the finding that threatening or angry faces are detected more efficiently among a crowd of distractor faces than happy or nonthreatening faces. Work establishing this effect has primarily utilized schematic stimuli, and efforts to extend the effect to real faces have yielded inconsistent results. The failure to consistently translate the effect from schematic to human faces raises questions about its ecological validity. The present study assessed the face in the crowd effect using a visual search paradigm that placed veridical faces, verified to exemplify prototypical emotional expressions, within heterogeneous crowds. Results confirmed that angry faces were found more quickly and accurately than happy expressions in crowds of both neutral and emotional distractors. These results are the first to extend the face in the crowd effect beyond homogeneous crowds to more ecologically valid conditions and thus provide compelling evidence for its legitimacy as a naturalistic phenomenon. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Three experiments examined whether the naming of famous faces would be affected by the age at which knowledge of the face was first acquired (AoA). Using a multiple regression design, Experiment 1 showed that rated familiarity and AoA were significant predictors of the time required to name pictures of celebrities' faces and the accuracy of producing their names. Experiment 2 replicated an effect of AoA using a factorial design in which other attributes of the celebrities were matched. In both Experiments 1 and 2, several ratings had been collected from participants before naming latency data were collected. Experiment 3 investigated the accuracy and latency of naming celebrities without any prior exposure to the stimuli. An advantage for naming early acquired celebrities was observed even on the first presentation. The participants named the same celebrities in three subsequent presentations of the stimuli. The effect of AoA was not significant on the fourth presentation. The implications of these results for models of face naming and directions for future research are discussed.
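Experiment 1's analysis is a standard multiple regression of naming performance on rated familiarity and AoA. As a rough sketch of that analysis shape (with fabricated placeholder values, since the study's data are not reproduced here), ordinary least squares with the two predictors might look like this:

```python
# Hedged illustration of a multiple regression predicting naming latency from
# rated familiarity and age of acquisition (AoA). The numbers are placeholders
# and do not come from the study.
import numpy as np

# One row per celebrity face: [familiarity rating, AoA rating]
predictors = np.array([[4.2, 6.0], [3.1, 12.0], [4.8, 5.0],
                       [2.9, 14.0], [3.7, 9.0], [4.5, 7.0]])
naming_latency_ms = np.array([820.0, 1010.0, 790.0, 1080.0, 930.0, 845.0])

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(len(predictors)), predictors])
coefs, *_ = np.linalg.lstsq(X, naming_latency_ms, rcond=None)
intercept, b_familiarity, b_aoa = coefs
print(f"intercept={intercept:.1f} ms, familiarity slope={b_familiarity:.1f}, "
      f"AoA slope={b_aoa:.1f}")
# A positive AoA slope would mean later-acquired faces are named more slowly,
# the direction of the effect reported in the abstract above.
```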

20.
Neuroimaging data suggest that emotional information, especially threatening faces, automatically captures attention and receives rapid processing. While this is consistent with the majority of behavioral data, behavioral studies of the attentional blink (AB) additionally reveal that aversive emotional first target (T1) stimuli are associated with prolonged attentional engagement or “dwell” time. One explanation for this difference is that few AB studies have utilized manipulations of facial emotion as the T1. To address this, schematic faces varying in expression (neutral, angry, happy) served as the T1 in the current research. Results revealed that the blink associated with an angry T1 face was, primarily, of greater magnitude than that associated with either a neutral or happy T1 face, and also that initial recovery from this processing bias was faster following angry, compared with happy, T1 faces. The current data therefore provide important information regarding the time-course of attentional capture by angry faces: Angry faces are associated with both the rapid capture and rapid release of attention. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
