Similar Literature
20 similar records found (search time: 0 ms)
1.
The authors examined regulation of the discrete emotions anger and sadness, from adolescence through older adulthood, in the context of describing everyday problem situations. The results support previous work: in comparison with younger age groups, older adults reported experiencing less anger and using more passive and fewer proactive emotion-regulation strategies in interpersonal situations. The experience of anger partially mediated age differences in the use of proactive emotion regulation, suggesting that at least part of the reason older adults use fewer proactive emotion-regulation strategies is their decreased experience of anger. Results are discussed in the context of lifespan theories of emotional development. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
The authors conducted 2 studies to identify the vocal acoustical correlates of unresolved anger and sadness among women reporting unresolved anger toward an attachment figure. In Study 1, participants (N = 17) were induced to experience and express anger then sadness or sadness then anger. In Study 2, a 2nd group of participants (N = 22) underwent a relationship-oriented, emotion-focused analogue therapy session. Results from both studies showed that, relative to emotionally neutral speech, anger evoked an increase in articulation rate and in mean fundamental frequency (F0) and F0-range, whereas sadness evoked an increase in F0-perturbation. Both F0 and F0-range were larger for anger than for sadness. In addition, results from the mood-induction-procedure study revealed 2 Emotion×Order interactions. Whereas variations in amplitude range suggested that anger evoked less physiological activation when induced after sadness, variations in F0-perturbation suggested that sadness evoked more physiological activation when induced after anger. These findings illustrate the feasibility of using acoustical measures to identify clients' personally and clinically meaningful emotional experiences, and shifts between such emotional experiences, in the context of psychotherapy. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
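As a concrete illustration of the kinds of acoustic measures mentioned in this abstract, the following is a minimal Python sketch (not the authors' actual pipeline) of how mean F0, F0 range, and a simple jitter-style F0-perturbation index could be estimated from a speech recording with librosa. The file name, the F0 search bounds, and the specific perturbation formula are assumptions for illustration only.

```python
# Minimal sketch of estimating mean F0, F0 range, and a jitter-style
# F0-perturbation index from a speech recording (illustrative only).
import numpy as np
import librosa

y, sr = librosa.load("speech_sample.wav", sr=None)            # hypothetical file name
f0, voiced_flag, _ = librosa.pyin(                             # frame-wise F0 via pYIN
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
f0_voiced = f0[voiced_flag & ~np.isnan(f0)]                    # keep voiced frames only

mean_f0 = float(np.mean(f0_voiced))                            # mean fundamental frequency
f0_range = float(np.max(f0_voiced) - np.min(f0_voiced))        # F0 range
# Rough perturbation proxy: mean absolute frame-to-frame F0 change, scaled by mean F0.
f0_perturbation = float(np.mean(np.abs(np.diff(f0_voiced))) / mean_f0)

print(f"mean F0: {mean_f0:.1f} Hz, F0 range: {f0_range:.1f} Hz, "
      f"F0 perturbation: {f0_perturbation:.4f}")
```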

3.
This study was designed to identify physiological correlates of unresolved anger and sadness, and the shift between these emotions, in a context similar to that of emotion-focused, experiential psychotherapy. Twenty-seven university students reporting unresolved anger toward an attachment figure were induced to experience and express unresolved anger and sadness. Simultaneously, their heart rate variability, finger temperature, and skin conductance levels were monitored. The sequence of emotion induction was counterbalanced. Sympathetic activation, as reflected by finger temperature, increased significantly from anger to sadness, but not from sadness to anger. A follow-up study (N = 36) of participants induced to experience and express either anger or sadness in both the 1st and 2nd inductions ruled out an Anger×Time interaction and a sadness-sadness effect, suggesting that the increase in sympathetic activation from anger to sadness was a function of the unique sequence of emotions. These findings represent a first step toward using physiological measures to capture shifts from unresolved anger to vulnerable primary emotions during a therapy-like task and provide evidence for the purported mechanism underlying unresolved anger. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
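For readers unfamiliar with heart rate variability indices of the sort monitored in this study, the sketch below shows one common metric (RMSSD) computed separately for an anger phase and a sadness phase. The R-R interval values are invented, and the choice of RMSSD and the phase segmentation are assumptions, not the measures reported by the authors.

```python
# Minimal sketch: RMSSD (root mean square of successive R-R differences)
# computed per emotion-induction phase, using made-up interval data.
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD of an R-R interval series given in milliseconds."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return np.sqrt(np.mean(diffs ** 2))

# Hypothetical R-R interval series (ms) for each induction phase.
rr_anger = [812, 798, 805, 790, 801, 795, 788]
rr_sadness = [845, 860, 838, 852, 841, 857, 849]

print(f"RMSSD anger:   {rmssd(rr_anger):.1f} ms")
print(f"RMSSD sadness: {rmssd(rr_sadness):.1f} ms")
```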

4.
The current series of studies provides converging evidence that facial expressions of fear and anger may have co-evolved to mimic mature and babyish faces in order to enhance their communicative signal. In Studies 1 and 2, fearful and angry facial expressions were manipulated to have enhanced babyish features (larger eyes) or enhanced mature features (smaller eyes); in a speeded categorization task (Study 1) and a visual noise paradigm (Study 2), larger eyes facilitated the recognition of fearful facial expressions, while smaller eyes facilitated the recognition of angry facial expressions. Study 3 manipulated facial roundness, a stable structure that does not vary systematically with expressions, and found that congruency between maturity and expression (narrow face-anger; round face-fear) facilitated expression recognition accuracy. Results are discussed as representing a broad co-evolutionary relationship between facial maturity and fearful and angry facial expressions. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
Three experiments tested the hypothesis that explaining emotional expressions using specific emotion concepts at encoding biases perceptual memory for those expressions. In Experiment 1, participants viewed faces expressing blends of happiness and anger and created explanations of why the target people were expressing one of the two emotions, according to concepts provided by the experimenter. Later, participants attempted to identify the facial expressions in computer movies, in which the previously seen faces changed continuously from anger to happiness. Faces conceptualized in terms of anger were remembered as angrier than the same faces conceptualized in terms of happiness, regardless of whether the explanations were told aloud or imagined. Experiments 2 and 3 showed that explanation is necessary for the conceptual biases to emerge fully and extended the finding to anger-sad expressions, an emotion blend more common in real life. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

6.
In this set of studies, we examine the perceptual similarities between emotions that share either a valence or a motivational direction. Determination is a positive approach-related emotion, whereas anger is a negative approach-related emotion. Thus, determination and anger share a motivational direction but are opposite in valence. An implemental mind-set has previously been shown to produce high-approach-motivated positive affect. Thus, in Study 1, participants were asked to freely report the strongest emotion they experienced during an implemental mind-set. The most common emotion reported was determination. On the basis of this result, we compared the facial expression of determination with that of anger. In Study 2, naive judges were asked to identify photographs of facial expressions intended to express determination, along with photographs intended to express basic emotions (joy, anger, sadness, fear, disgust, neutral). Correct identifications of intended determination expressions were correlated with misidentifications of the expressions as anger but not with misidentifications as any other emotion. This suggests that determination, a high-approach-motivated positive affect, is perceived as similar to anger. In Study 3, naive judges quantified the intensity of joy, anger, and determination expressed in photographs. The intensity of perceived determination was directly correlated with the intensity of perceived anger (a high-approach-motivated negative affect) and was inversely correlated with the intensity of perceived joy (a low-approach-motivated positive affect). These results demonstrate perceptual similarity between emotions that share a motivational direction but differ in valence. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The appraisal process consists of the subjective evaluation that occurs during an individual's encounter with significant events in the environment, determining the nature of the emotional reaction and experience. Placed in the context of appraisal theories of emotion-elicitation and differentiation, the aim of the present research was to test empirically the hypothesis that the intrinsic pleasantness evaluation occurs before the goal conduciveness evaluation. In two studies, intrinsically pleasant and unpleasant images were used to manipulate pleasantness, and a specific event in a Pacman-type videogame was used to manipulate goal conduciveness. Facial EMG was used to measure facial reactions to each evaluation. As predicted, facial reactions to the intrinsic pleasantness manipulation were faster than facial reactions to the goal conduciveness manipulation. These results provide good empirical support for the sequential nature of the appraisal process. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

8.
Functional magnetic resonance imaging (fMRI) of the human brain was used to compare changes in amygdala activity associated with viewing facial expressions of fear and anger. Pictures of human faces bearing expressions of fear or anger, as well as faces with neutral expressions, were presented to 8 healthy participants. The blood oxygen level-dependent (BOLD) fMRI signal within the dorsal amygdala was significantly greater to Fear versus Anger, in a direct contrast. Significant BOLD signal changes in the ventral amygdala were observed in contrasts of Fear versus Neutral expressions and, in a more spatially circumscribed region, to Anger versus Neutral expressions. Thus, activity in the amygdala is greater to fearful facial expressions when contrasted with either neutral or angry faces. Furthermore, directly contrasting fear with angry faces highlighted involvement of the dorsal amygdaloid region. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
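To make the idea of a direct Fear-versus-Anger contrast concrete, the sketch below runs a paired t-test on region-averaged BOLD signal change across 8 participants. This is not the study's analysis pipeline, and the signal values are invented for illustration.

```python
# Minimal sketch of a within-subject Fear vs. Anger contrast on
# region-averaged BOLD signal change (illustrative data only).
import numpy as np
from scipy import stats

# Hypothetical mean dorsal-amygdala BOLD signal change (%) per participant.
fear = np.array([0.42, 0.38, 0.51, 0.33, 0.47, 0.40, 0.36, 0.45])
anger = np.array([0.28, 0.31, 0.35, 0.25, 0.33, 0.29, 0.27, 0.30])

t_stat, p_value = stats.ttest_rel(fear, anger)   # paired (within-subject) contrast
print(f"Fear vs. Anger: t({len(fear) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```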

9.
Young children's temper tantrums offer a unique window into the expression and regulation of strong emotions. Previous work, largely based on parental report, suggests that two emotions, anger and sadness, have different behavioral manifestations and different time courses within tantrums. Individual motor and vocal behaviors, reported by parents, have been interpreted as representing different levels of intensity within each emotion category. The present study used high-fidelity audio recordings to capture the acoustic features of children's vocalizations during tantrums. Results indicated that perceptually categorized screaming, yelling, crying, whining, and fussing each have distinct acoustic features. Screaming and yelling form a group with similar acoustic features while crying, whining, and fussing form a second acoustically related group. Within these groups, screaming may reflect a higher intensity of anger than yelling while fussing, whining, and crying may reflect an increasing intensity of sadness. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

10.
Evidence for A. J. Fridlund's (e.g., 1994) "behavioral ecology view" of human facial expression comes primarily from studies of smiling in response to positive emotional stimuli. Smiling may be a special case because it clearly can, and often does, serve merely communicative functions. The present study was designed (a) to assess the generalizability of social context effects to facial expressions in response to negative emotional stimuli and (b) to examine whether these effects are mediated by social motives, as suggested by the behavioral ecology view. Pairs of friends or strangers viewed film clips that elicited different degrees of sad affect, in either the same or a different room; a control group participated alone. Dependent variables included facial activity, subjective emotion, and social motives. Displays of sadness were influenced by stimulus intensity and were lower in all social conditions than in the alone condition. Unexpectedly, social context effects were also found for smiling. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

11.
"Two experiments were performed to study the effect of subliminal and supraliminal suggestion on productivity… in describing a picture… . [A control] group was given a task of describing in writing a TAT picture presented tachistoscopically 10 times at increasing exposure levels. In the second condition a subliminal stimulus, the words WRITE MORE overlapped with the TAT picture for .02 seconds. In the third condition the subliminal suggestion was DON'T WRITE… . A second experiment was performed to see the effect on productivity of the same suggestions at supraliminal levels… . subliminal suggestion may produce some effect in the region just below threshold. When the suggestion becomes supraliminal its distracting effects causes a contrasuggestive response in some Ss." (PsycINFO Database Record (c) 2010 APA, all rights reserved)  相似文献   

12.
13.
The authors experimentally examined the effects of anger suppression on pain perception. On the basis of ironic process theory, they proposed that efforts to suppress experiential or expressive components of anger may paradoxically enhance cognitive accessibility of anger-related thoughts and feelings, thereby contaminating perception of succeeding pain in an anger-congruent manner. Participants were randomly assigned to nonsuppression or experiential or expressive suppression conditions during mental arithmetic with or without harassment. A cold-pressor task followed. Results revealed that participants instructed to suppress experiential or expressive components of emotion during harassment not only reported the greatest pain levels, but also rated the anger-specific dimensions of pain uniquely strong. Results suggest that attempts to suppress anger may amplify pain sensitivity by ironically augmenting perception of the irritating and frustrating qualities of pain. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

14.
Facial autonomic responses may contribute to emotional communication and reveal individual affective style. In this study, the authors examined how observed pupillary size modulates processing of facial expression, extending the finding that incidentally perceived pupils influence ratings of sadness but not those of happy, angry, or neutral facial expressions. Healthy subjects rated the valence and arousal of photographs depicting facial muscular expressions of sadness, surprise, fear, and disgust. Pupil sizes within the stimuli were experimentally manipulated. Subjects themselves were scored with an empathy questionnaire. Diminishing pupil size linearly enhanced intensity and valence judgments of sad expressions (but not fear, surprise, or disgust). At debriefing, subjects were unaware of differences in pupil size across stimuli. These observations complement an earlier study showing that pupil size directly influences processing of sadness but not other basic emotional facial expressions. Furthermore, across subjects, the degree to which pupil size influenced sadness processing correlated with individual differences in empathy score. Together, these data demonstrate a central role of sadness processing in empathetic emotion and highlight the salience of implicit autonomic signals in affective communication. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
[Correction Notice: An erratum for this article was reported in Vol 11(4) of Emotion (see record 2011-18271-001). There were several errors in Table 1, and in Table 4 spaces were omitted from the rows between data for anger, fear, sadness, joy, and relief. All versions of this article have been corrected, and the corrections to Table 1 are provided in the erratum.] Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
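As a rough illustration of how FACS-coded action units might be tabulated against intended emotions when examining emotion differentiation, the sketch below cross-tabulates a handful of invented codes with pandas. The AU labels and rows are assumptions, not data from this study.

```python
# Minimal sketch: frequency of FACS action units by intended emotion
# (invented codes, for illustration only).
import pandas as pd

coded = pd.DataFrame({
    "emotion": ["anger", "anger", "fear", "sadness", "joy", "relief", "joy", "fear"],
    "action_unit": ["AU4", "AU23", "AU1+2", "AU15", "AU12", "AU43", "AU6", "AU5"],
})

# Count how often each action unit appears within each emotion category.
au_by_emotion = pd.crosstab(coded["action_unit"], coded["emotion"])
print(au_by_emotion)
```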

16.
Two experiments examined how different frustration contexts affect the instrumental and emotional responses of 4- to 5-month-old infants. Three different frustrating contexts were investigated: loss of stimulation (extinction), reduction in contingent stimulation (partial reinforcement), and loss of stimulus control (noncontingency). In both experiments, changes in arm activity and facial expressions of anger and sadness coded according to the Maximally Discriminative Facial Movement Coding System (MAX) were the measures of frustration. Both experiments showed that (a) arm responses increased when the contingent stimulus was lost or reduced but decreased when control of the stimulus was lost under noncontingency, (b) MAX-coded anger, but not MAX-coded sad or blends of anger and sad, was associated with frustration, and (c) the pattern of anger and arm responses varied with the frustration context. When contingent stimulation was lost or reduced, both anger and arm responses increased, but when expected control was lost under noncontingency, arm responses decreased while anger increased. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
Efficient navigation of our social world depends on the generation, interpretation, and combination of social signals within different sensory systems. However, the influence of healthy adult aging on multisensory integration of emotional stimuli remains poorly explored. This article comprises 2 studies that directly address issues of age differences in cross-modal emotional matching and explicit identification. The first study compared 25 younger adults (19–40 years) and 25 older adults (60–80 years) on their ability to match cross-modal congruent and incongruent emotional stimuli. The second study looked at performance of 20 younger (19–40) and 20 older adults (60–80) on explicit emotion identification when information was presented congruently in faces and voices or only in faces or in voices. In Study 1, older adults performed as well as younger adults on tasks in which congruent auditory and visual emotional information were presented concurrently, but there were age-related differences in matching incongruent cross-modal information. Results from Study 2 indicated that though older adults were impaired at identifying emotions from 1 modality (faces or voices alone), they benefited from congruent multisensory information, as age differences were eliminated. The findings are discussed in relation to social, emotional, and cognitive changes with age. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual-task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders that affect both working memory and emotion perception are considered. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

19.
Although many psychological models suggest that human beings are invariably motivated to avoid negative stimuli, more recent theories suggest that people are frequently motivated to approach angering social challenges in order to confront and overcome them. To examine these models, the current investigation sought to determine whether angry facial expressions potentiate approach-motivated motor behaviors. Across 3 studies, individuals were faster to initiate approach movements toward angry facial expressions than to initiate avoidance movements away from such facial expressions. This approach advantage differed significantly from participants’ responses to both emotionally neutral (Studies 1 & 3) and fearful (Study 2) facial expressions. Furthermore, this pattern was most apparent when physical approach appeared to be effective in overcoming the social challenge posed by angry facial expressions (Study 3). The results are discussed in terms of the processes underlying anger-related approach motivation and the conditions under which they are likely to arise. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

20.
The authors examined whether facial expressions of emotion would predict changes in heart function. One hundred fifteen male patients with coronary artery disease underwent the Type A Structured Interview, during which time measures of transient myocardial ischemia (wall motion abnormality and left ventricular ejection fraction) were obtained. Facial behavior exhibited during the ischemia measurement period was videotaped and later coded by using the Facial Action Coding System (P. Ekman & W. V. Friesen, 1978). Those participants who exhibited ischemia showed significantly more anger expressions and nonenjoyment smiles than nonischemics. Cook–Medley Hostility scores did not vary with ischemic status. The findings have implications for understanding how anger and hostility differentially influence coronary heart disease risk. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
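One plausible way to test a group difference of the kind reported here (more anger expressions among ischemic than nonischemic patients) is a chi-square test of independence on expression counts, sketched below. The cell counts are invented, and this is not necessarily the analysis the authors used.

```python
# Minimal sketch: chi-square test comparing how many ischemic vs. nonischemic
# patients showed at least one coded anger expression (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

#                  anger shown   no anger shown
table = np.array([[22,           18],    # ischemic patients (hypothetical)
                  [25,           50]])   # nonischemic patients (hypothetical)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```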
