Similar Articles
20 similar articles found (search time: 11 ms)
1.
According to theories of emotion and attention, we are predisposed to orient rapidly toward threat. However, previous examination of attentional cueing by threat showed no enhanced capture at brief durations, a finding that may be related to the sensitivity of the manual response measure used. Here we investigated the time course of orienting attention toward fearful faces in the exogenous cueing task. Cue duration (20 ms or 100 ms) and response mode (saccadic or manual) were manipulated. In the saccade mode, both enhanced attentional capture and impaired disengagement from fearful faces were evident and limited to 20 ms, suggesting that saccadic cueing effects emerge rapidly and are short lived. In the manual mode, fearful faces impacted only upon the disengagement component of attention at 100 ms, suggesting that manual cueing effects emerge over longer periods of time. Importantly, saccades could reveal threat biases at brief cue durations consistent with current theories of emotion and attention. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Reports an error in "Facial expressions of emotion influence memory for facial identity in an automatic way" by Arnaud D'Argembeau and Martial Van der Linden (Emotion, 2007[Aug], Vol 7[3], 507-515). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum. (The following abstract of the original article appeared in record 2007-11660-005.) Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

4.
[Correction Notice: An erratum for this article was reported in Vol 7(4) of Emotion (see record 2007-17748-022). The image printed for Figure 3 was incorrect. The correct image is provided in the erratum.] Previous studies indicate that the encoding of new facial identities in memory is influenced by the type of expression displayed by the faces. In the current study, the authors investigated whether or not this influence requires attention to be explicitly directed toward the affective meaning of facial expressions. In a first experiment, the authors found that facial identity was better recognized when the faces were initially encountered with a happy rather than an angry expression, even when attention was oriented toward facial features other than expression. Using the Remember/Know/Guess paradigm in a second experiment, the authors found that the influence of facial expressions on the conscious recollection of facial identity was even more pronounced when participants' attention was not directed toward expressions. It is suggested that the affective meaning of facial expressions automatically modulates the encoding of facial identity in memory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Three experiments tested the hypothesis that explaining emotional expressions using specific emotion concepts at encoding biases perceptual memory for those expressions. In Experiment 1, participants viewed faces expressing blends of happiness and anger and created explanations of why the target people were expressing one of the two emotions, according to concepts provided by the experimenter. Later, participants attempted to identify the facial expressions in computer movies, in which the previously seen faces changed continuously from anger to happiness. Faces conceptualized in terms of anger were remembered as angrier than the same faces conceptualized in terms of happiness, regardless of whether the explanations were told aloud or imagined. Experiments 2 and 3 showed that explanation is necessary for the conceptual biases to emerge fully and extended the finding to anger-sad expressions, an emotion blend more common in real life. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
There is evidence that specific regions of the face such as the eyes are particularly relevant for the decoding of emotional expressions, but it has not been examined whether scan paths of observers vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor scanning behavior of healthy participants while looking at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. Especially in sad facial expressions, participants more frequently issued the initial fixation to the eyes compared with all other expressions. In happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that both the eyes and mouth are equally important. However, in sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that not all facial expressions with different emotional content are decoded equally. Our data suggest that people look at regions that are most characteristic for each emotion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

7.
Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders that affect both working memory and emotion perception are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
10.
Despite the fact that facial expressions of emotion have signal value, there is surprisingly little research examining how that signal can be detected under various conditions, because most judgment studies utilize full-face, frontal views. We remedy this by obtaining judgments of frontal and profile views of the same expressions displayed by the same expressors. We predicted that recognition accuracy when viewing faces in profile would be lower than when judging the same faces from the front. Contrary to this prediction, there were no differences in recognition accuracy as a function of view, suggesting that emotions are judged equally well regardless of the angle from which they are viewed. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

11.
Collaboration with a local newspaper "yielded a new set of pictures of facial expressions and enabled us to collect judgments on these expressions from 189 newspaper readers." Data based on responses of the 189 readers to each of the 16 posed pictures and data based on responses of 96 college students are presented in a table. The "agreement between the newspaper readers and the students is striking for both medians and quartiles." Collaboration with both local and national newspapers "might supply a very useful population for a wide variety of research problems." (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
The Chimpanzee Facial Action Coding System (ChimpFACS) is an objective, standardized observational tool for measuring facial movement in chimpanzees based on the well-known human Facial Action Coding System (FACS; P. Ekman & W. V. Friesen, 1978). This tool enables direct structural comparisons of facial expressions between humans and chimpanzees in terms of their common underlying musculature. Here the authors provide data on the first application of the ChimpFACS to validate existing categories of chimpanzee facial expressions using discriminant functions analyses. The ChimpFACS validated most existing expression categories (6 of 9) and, where the predicted group memberships were poor, the authors discuss potential problems with ChimpFACS and/or existing categorizations. The authors also report the prototypical movement configurations associated with these 6 expression categories. For all expressions, unique combinations of muscle movements were identified, and these are illustrated as peak intensity prototypical expression configurations. Finally, the authors suggest a potential homology between these prototypical chimpanzee expressions and human expressions based on structural similarities. These results contribute to our understanding of the evolution of emotional communication by suggesting several structural homologies between the facial expressions of chimpanzees and humans and facilitating future research. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
There is considerable evidence indicating that people are primed to monitor social signals of disapproval. Thus far, studies on selective attention have concentrated predominantly on the spatial domain, whereas the temporal consequences of identifying socially threatening information have received only scant attention. Therefore, this study focused on temporal attention costs and examined how the presentation of emotional expressions affects subsequent identification of task-relevant information. High (n = 30) and low (n = 31) socially anxious women were exposed to a dual-target rapid serial visual presentation (RSVP) paradigm. Emotional faces (neutral, happy, angry) were presented as the first target (T1) and neutral letter stimuli (p, q, d, b) as the second target (T2). Irrespective of social anxiety, the attentional blink was relatively large when angry faces were presented as T1. This apparent prioritized processing of angry faces is consistent with evolutionary models, stressing the importance of being especially attentive to potential signals of social threat. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Two studies tested the hypothesis that in judging people's emotions from their facial expressions, Japanese, more than Westerners, incorporate information from the social context. In Study 1, participants viewed cartoons depicting a happy, sad, angry, or neutral person surrounded by other people expressing the same emotion as the central person or a different one. The surrounding people's emotions influenced Japanese but not Westerners' perceptions of the central person. These differences reflect differences in attention, as indicated by eye-tracking data (Study 2): Japanese looked at the surrounding people more than did Westerners. Previous findings on East-West differences in contextual sensitivity generalize to social contexts, suggesting that Westerners see emotions as individual feelings, whereas Japanese see them as inseparable from the feelings of the group. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
High- and low-trait socially anxious individuals classified the emotional expressions of photographic quality continua of interpolated ("morphed") facial images that were derived from combining 6 basic prototype emotional expressions to various degrees, with the 2 adjacent emotions arranged in an emotion hexagon. When fear was 1 of the 2 component emotions, the high-trait group displayed enhanced sensitivity for fear. In a 2nd experiment, which incorporated a mood manipulation, the high-trait group again exhibited enhanced sensitivity for fear. The low-trait group was sensitive for happiness in the control condition. The mood-manipulated group had increased sensitivity for anger expressions, and trait anxiety did not moderate these effects. Interpretations of the results related to the classification of fearful expressions are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Children's performance on free labeling of prototypical facial expressions of basic emotions is modest and improves only gradually. In 3 data sets (N=80, ages 4 or 5 years; N=160, ages 2 to 5 years; N=80, ages 3 to 4 years), errors remained even when method factors (poor stimuli, unavailability of an appropriate label, or the difficulty of a production task) were controlled. Children's use of emotion labels increased with age in a systematic order: Happy, angry, and sad emerged early and in that order, were more accessible, and were applied broadly (overgeneralized) but systematically. Scared, surprised, and disgusted emerged later and often in that order, were less accessible, and were applied narrowly. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
The ability to allocate attention to emotional cues in the environment is an important feature of adaptive self-regulation. Existing data suggest that physically abused children overattend to angry expressions, but the attentional mechanisms underlying such behavior are unknown. The authors tested 8-11-year-old physically abused children to determine whether they displayed specific information-processing problems in a selective attention paradigm using emotional faces as cues. Physically abused children demonstrated delayed disengagement when angry faces served as invalid cues. Abused children also demonstrated increased attentional benefits on valid angry trials. Results are discussed in terms of the influence of early adverse experience on children's selective attention to threat-related signals as a mechanism in the development of psychopathology. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Although many psychological models suggest that human beings are invariably motivated to avoid negative stimuli, more recent theories suggest that people are frequently motivated to approach angering social challenges in order to confront and overcome them. To examine these models, the current investigation sought to determine whether angry facial expressions potentiate approach-motivated motor behaviors. Across 3 studies, individuals were faster to initiate approach movements toward angry facial expressions than to initiate avoidance movements away from such facial expressions. This approach advantage differed significantly from participants' responses to both emotionally neutral (Studies 1 & 3) and fearful (Study 2) facial expressions. Furthermore, this pattern was most apparent when physical approach appeared to be effective in overcoming the social challenge posed by angry facial expressions (Study 3). The results are discussed in terms of the processes underlying anger-related approach motivation and the conditions under which they are likely to arise. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
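The recognition-accuracy analysis in the abstract above relies on signal detection theory. As a minimal sketch of the standard sensitivity index d′ (z-transformed hit rate minus z-transformed false-alarm rate), with hypothetical rates not taken from the study:

```python
# Sketch of the signal detection sensitivity index d' used in recognition
# memory analyses. Hit/false-alarm rates below are hypothetical examples.
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """d' = z(hit rate) - z(false-alarm rate); higher means better discrimination."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Example: 80% hits, 20% false alarms on a surprise recognition task
print(round(d_prime(0.80, 0.20), 3))  # -> 1.683
```

A group with a memory bias for angry faces would show a higher d′ for angry than for neutral expressions, independent of any overall tendency to respond "old."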

20.
Individuals with borderline personality disorder (BPD) have been hypothesized to exhibit significant problems associated with emotional sensitivity. The current study examined emotional sensitivity (i.e., low threshold for recognition of emotional stimuli) in BPD by comparing 20 individuals with BPD and 20 normal controls on their accuracy in identifying emotional expressions. Results demonstrated that, as facial expressions morphed from neutral to maximum intensity, participants with BPD correctly identified facial affect at an earlier stage than did healthy controls. Participants with BPD were more sensitive than healthy controls in identifying emotional expressions in general, regardless of valence. These findings could not be explained by participants with BPD responding faster with more errors. Overall, results appear to support the contention that heightened emotional sensitivity may be a core feature of BPD. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
