Similar Documents
20 similar documents found (search time: 15 ms)
1.
Converging data suggest that human facial behavior has an evolutionary basis. Combining these data with M. E. Seligman's (1970) preparedness theory, it was predicted that facial expressions of anger should be more readily associated with aversive events than should expressions of happiness. Two experiments involving differential electrodermal conditioning to pictures of faces, with electric shock as the unconditioned stimulus, were performed. In the 1st experiment, 32 undergraduates were exposed to 2 pictures of the same person, 1 with an angry and 1 with a happy expression. For half of the Ss, the shock followed the angry face, and for the other half, it followed the happy face. In the 2nd experiment, 3 groups of 48 undergraduates differentiated between pictures of male and female faces, both showing angry, neutral, and happy expressions. Responses to angry CSs showed significant resistance to extinction in both experiments, with a larger effect in Exp II. Responses to happy or neutral CSs, on the other hand, extinguished immediately when the shock was withheld. Results are related to conditioning to phobic stimuli and to the preparedness theory. (22 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
Many research reports have concluded that emotional information can be processed without observers being aware of it. The case for perception without awareness has almost always been made with the use of facial expressions. In view of the similarities between facial and bodily expressions for rapid perception and communication of emotional signals, we conjectured that perception of bodily expressions may also not necessarily require visual awareness. Our study investigates the role of visual awareness in the perception of bodily expressions using a backward masking technique in combination with confidence ratings on a trial-by-trial basis. Participants had to detect, in three separate experiments, masked fearful, angry and happy bodily expressions among masked neutral bodily actions as distractors, and subsequently indicate their confidence. The interval between target and mask onsets (Stimulus Onset Asynchrony, SOA) varied from -50 to +133 ms. Sensitivity measurements (d-prime) as well as the confidence of the participants showed that the bodies could be detected reliably in all SOA conditions. In an important finding, a lack of covariance was observed between the objective and subjective measurements when the participants had to detect fearful bodily expressions, yet this was not the case when participants had to detect happy or angry bodily expressions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
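The sensitivity measure reported above, d-prime, is the standard signal-detection index z(hit rate) minus z(false-alarm rate). The following is a minimal sketch of how such a score could be computed for one SOA condition; the function name, the log-linear correction, and the trial counts in the example are illustrative assumptions rather than the authors' analysis code.

```python
# Minimal d-prime sketch (assumed, not the authors' code): z(hit rate) - z(false-alarm rate),
# with a log-linear correction so that perfect or empty cells do not yield infinite z-scores.
from scipy.stats import norm

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    # Add 0.5 to each cell (log-linear correction) before converting to rates.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Illustrative block of 40 target-present and 40 target-absent trials (invented numbers).
print(d_prime(hits=32, misses=8, false_alarms=6, correct_rejections=34))
```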

3.
4.
5.
We investigated facial recognition memory (for previously unfamiliar faces) and facial expression perception with functional magnetic resonance imaging (fMRI). Eight healthy, right-handed volunteers participated. For the facial recognition task, subjects made a decision as to the familiarity of each of 50 faces (25 previously viewed; 25 novel). We detected signal increase in the right middle temporal gyrus and left prefrontal cortex during presentation of familiar faces, and in several brain regions, including bilateral posterior cingulate gyri, bilateral insulae and right middle occipital cortex during presentation of unfamiliar faces. Standard facial expressions of emotion were used as stimuli in two further tasks of facial expression perception. In the first task, subjects were presented with alternating happy and neutral faces; in the second task, subjects were presented with alternating sad and neutral faces. During presentation of happy facial expressions, we detected a signal increase predominantly in the left anterior cingulate gyrus, bilateral posterior cingulate gyri, medial frontal cortex and right supramarginal gyrus, brain regions previously implicated in visuospatial and emotion processing tasks. No brain regions showed increased signal intensity during presentation of sad facial expressions. These results provide evidence for a distinction between the neural correlates of facial recognition memory and perception of facial expression but, whilst highlighting the role of limbic structures in perception of happy facial expressions, do not allow the mapping of a distinct neural substrate for perception of sad facial expressions.

6.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
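The mediation result above (breadth of attentional focus accounting for enhanced recall of angry faces) follows a product-of-coefficients logic: the indirect effect is the path from group to attentional breadth multiplied by the path from attentional breadth to recall, controlling for group. A minimal sketch of that logic is given below; the variable names, the simulated data, and the use of ordinary least squares are assumptions for illustration, not the authors' analysis.

```python
# Simple mediation sketch (assumed): X = dysphoria group, M = breadth of attentional
# focus (distance between fixations), Y = recognition accuracy for angry faces.
import numpy as np
import statsmodels.api as sm

def indirect_effect(x: np.ndarray, m: np.ndarray, y: np.ndarray) -> float:
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]  # path a: X -> M
    xm = sm.add_constant(np.column_stack([x, m]))
    b = sm.OLS(y, xm).fit().params[2]                  # path b: M -> Y, controlling for X
    return a * b

# Illustrative simulated data (not data from the study).
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 100).astype(float)   # 0 = nondysphoric, 1 = dysphoric
m = 0.5 * x + rng.normal(size=100)          # attentional breadth
y = 0.4 * m + rng.normal(size=100)          # recall accuracy
print(indirect_effect(x, m, y))
```

In practice the indirect effect would be tested with a bootstrap confidence interval rather than reported as a point estimate alone.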

7.
Examined the influence of changes in facial expression on physiological and self-report measures of emotion. In Exp I, 27 undergraduates portrayed facial expressions associated with being afraid, calm, and normal. Portraying fear produced increases in pulse rate and skin conductance relative to portraying either calm or normal, but posing had no effect on subjective reports of anxiety (Affect Adjective Check List). In Exp II, 38 Ss listened to loud or soft noise while changing their expressions to portray fear, happiness, or calmness. Portraying either fear or happiness produced greater arousal than remaining calm. Changes in facial expression failed to affect self-reports of noise loudness. Results suggest that changes in facial expression influence physiological responses through the movement involved in posing and may not influence self-reports of emotion at all. (18 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression, participants diagnosed with social phobia, and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions, and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

9.
The decrease in recognition performance after face inversion has been taken to suggest that faces are processed holistically. Three experiments, 1 with schematic and 2 with photographic faces, were conducted to assess whether face inversion also affected visual search for and implicit evaluation of facial expressions of emotion. The 3 visual search experiments yielded the same differences in detection speed between different facial expressions of emotion for upright and inverted faces. Threat superiority effects, faster detection of angry than of happy faces among neutral background faces, were evident in 2 experiments. Face inversion did not affect explicit or implicit evaluation of face stimuli as assessed with verbal ratings and affective priming. Happy faces were evaluated as more positive than angry, sad, or fearful/scheming ones regardless of orientation. Taken together these results seem to suggest that the processing of facial expressions of emotion is not impaired if holistic processing is disrupted. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

10.
Examined whether spontaneous facial expressions provide observers with sufficient information to distinguish accurately which of 7 affective states (6 emotional and 1 neutral) is being experienced by another person. Six undergraduate senders' facial expressions were covertly videotaped as they watched emotionally loaded slides. After each slide, senders nominated the emotion term that best described their affective reaction and also rated the pleasantness and strength of that reaction. Similar nominations of emotion terms and ratings were later made by 53 undergraduate receivers who viewed the senders' videotaped facial expressions. The central measure of communication accuracy was the match between senders' and receivers' emotion nominations. Overall accuracy was significantly greater than chance, although it was not impressive in absolute terms. Only happy, angry, and disgusted expressions were recognized at above-chance rates, whereas surprised expressions were recognized at rates that were significantly worse than chance. Female Ss were significantly better senders than were male Ss. Although neither sex was found to be better at receiving facial expressions, female Ss were better receivers of female senders' expressions than of male senders' expressions. Female senders' neutral and surprised expressions were more accurately recognized than were those of male senders. The only sex difference found for decoding emotions was a tendency for male Ss to be more accurate at recognizing anger. (25 ref) (PsycINFO Database Record (c) 2011 APA, all rights reserved)

11.
Previous research has demonstrated that particular facial expressions more readily acquire excitatory strength when paired with a congruent unconditioned stimulus than when paired with an incongruent outcome. The present study with a total of 36 undergraduates extends these findings on the excitatory/inhibitory role of facial expressions by demonstrating that particular facial expressions (fear and happy), when paired with a neutral cue (tone), can influence conditioning to the neutral conditioned stimulus (CS). Ss who had a fear expression paired with the neutral CS responded more to the fear expression than to the neutral CS, whereas Ss who had a happy expression paired with the neutral CS responded more to the neutral cue than to the happy expression. These findings strongly support predictions from "overshadowing" or "blocking" models of classical conditioning. (12 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)

12.
The facial expressions of adults with Down's syndrome (DS; n = 15) as they watched happy, sad, and neutral videotapes were compared with those of a healthy age-matched control group (n = 20). Facial movements were analyzed with the Facial Action Coding System (P. E. Ekman & W. V. Friesen, 1978). While watching happy stimuli, the 10 DS adults who were able to appropriately rate their reactions smiled with a cheek raise as frequently as control adults, suggesting that the expression of positive affect in these individuals is normal. Contrary to predictions, however, the DS group exhibited fewer smiles without cheek raises than did control adults and were more likely not to smile. Neither group showed prototypic sad facial expressions in response to sad stimuli. Independent of emotion, DS participants made more facial movements, including more tongue shows, than did control participants. Differences in facial expression in DS adults may confuse others' interpretations of their emotional responses and may be important for understanding the development of abnormal emotional processes. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

13.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

14.
In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
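The computationally modeled visual saliency mentioned above comes from a bottom-up saliency model applied to the face images. The sketch below uses the spectral-residual method as a simple stand-in for such a model (it is not the model used in the study), so it only illustrates the general idea of deriving a saliency map and comparing face regions; the function and the region-of-interest comparison at the end are assumptions.

```python
# Bottom-up saliency sketch using the spectral-residual method (a stand-in model,
# not the authors' implementation).
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def spectral_residual_saliency(gray_image: np.ndarray) -> np.ndarray:
    """Return a saliency map (values in [0, 1]) for a 2-D grayscale image."""
    spectrum = np.fft.fft2(gray_image)
    log_amplitude = np.log1p(np.abs(spectrum))
    phase = np.angle(spectrum)
    # Spectral residual = log amplitude minus its local average.
    residual = log_amplitude - uniform_filter(log_amplitude, size=3)
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    saliency = gaussian_filter(saliency, sigma=3)
    return (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)

# Usage sketch (hypothetical loader and region indices):
# face = load_grayscale_face("happy_01.png")
# smap = spectral_residual_saliency(face)
# mouth_vs_eyes = smap[mouth_rows, mouth_cols].mean() / smap[eye_rows, eye_cols].mean()
```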

15.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

16.
Three studies explored the role of hedonic contingency theory as an explanation for the link between positive mood and cognitive flexibility. Study 1 examined the determinants of activity choice for participants in happy, sad, or neutral moods. Consistent with hedonic contingency theory, happy participants weighted potential for creativity as well as the pleasantness of the task more heavily in their preference ratings. In Study 2, participants were given either a neutral or mood-threatening item generation task to perform. Results illustrated that happy participants exhibited greater cognitive flexibility in all cases; when confronted with a potentially mood-threatening task, happy participants were able to creatively transform the task so as to maintain positive mood and interest. Finally, Study 3 manipulated participants' beliefs that moods could or could not be altered. Results replicated the standard finding that positive mood increases cognitive flexibility in the non-mood-freezing condition, but no effects of mood on creativity were found in the mood-freezing condition. These studies indicate that hedonic contingency theory may be an important contributing mechanism behind the link between positive mood and cognitive flexibility. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Socioemotional selectivity theory postulates that with age, people are motivated to derive emotional meaning from life, leading them to pay more attention to positive relative to negative/neutral stimuli. The authors argue that cultures that differ in what they consider to be emotionally meaningful may show this preference to different extents. Using eye-tracking techniques, the authors compared visual attention toward emotional (happy, fearful, sad, and angry) and neutral facial expressions among 46 younger and 57 older Hong Kong Chinese. In contrast to prior Western findings, older but not younger Chinese looked away from happy facial expressions, suggesting that they do not show attentional preferences toward positive stimuli. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
The present electromyographic study is a first step toward shedding light on the involvement of affective processes in congruent and incongruent facial reactions to facial expressions. Further, empathy was investigated as a potential mediator underlying the modulation of facial reactions to emotional faces in a competitive, a cooperative, and a neutral setting. Results revealed less congruent reactions to happy expressions and even incongruent reactions to sad and angry expressions in the competition condition, whereas virtually no differences between the neutral and the cooperation condition occurred. Effects on congruent reactions were found to be mediated by cognitive empathy, indicating that the state of empathy plays an important role in the situational modulation of congruent reactions. Further, incongruent reactions to sad and angry faces in a competition setting were mediated by the emotional reaction of joy, supporting the assumption that incongruent facial reactions are mainly based on affective processes. Additionally, strategic processes (specifically, the goal to create and maintain a smooth, harmonious interaction) were found to influence facial reactions while being in a cooperative mindset. Further studies are now needed to test the generalizability of these effects. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

19.
The ability to imitate facial expressions was examined in 26 newborns. Each subject observed a model showing tongue protrusion or a happy, sad, or surprised face. The frequencies of reproduction of a modeled act were compared with the average frequencies of the act during periods when other actions were modeled. A trials-to-criterion design was used. When infants observed the emotional facial expressions (happy, sad, surprised), they often responded by opening their mouths or showing lip pouts, but did not show imitative-like matching of these modeled expressions. However, when tongue protrusion was modeled, the infants did reproduce the modeled gesture. These data raise the question of whether the infants' responses to modeled facial expressions reflect true imitation, stimulus-evoked elicitation, or a stereotyped "facial gesture." (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
There is evidence that specific regions of the face such as the eyes are particularly relevant for the decoding of emotional expressions, but it has not been examined whether scan paths of observers vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor scanning behavior of healthy participants while looking at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. Especially in sad facial expressions, participants more frequently directed the initial fixation to the eyes compared with all other expressions. In happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that both the eyes and mouth are equally important. However, in sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that not all facial expressions with different emotional content are decoded equally. Our data suggest that people look at regions that are most characteristic for each emotion. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
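The dominance ratio described above compares fixation on the eyes and mouth with fixation on the rest of the face. One plausible operationalization is sketched below; the exact formula, the region labels, and the example durations are assumptions, not details taken from the study.

```python
# Dominance ratio sketch (assumed operationalization): total fixation time on the
# eyes and mouth divided by total fixation time on the rest of the face.
def dominance_ratio(fixations: list[tuple[str, float]]) -> float:
    """fixations: (region_label, duration_ms) pairs for one trial."""
    feature_time = sum(d for region, d in fixations if region in {"eyes", "mouth"})
    rest_time = sum(d for region, d in fixations if region not in {"eyes", "mouth"})
    return feature_time / rest_time if rest_time > 0 else float("inf")

# Example trial dominated by eye fixations, as reported for sad expressions (invented values).
print(dominance_ratio([("eyes", 420.0), ("eyes", 310.0), ("mouth", 150.0), ("cheek", 200.0)]))
```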
