Similar Documents
Found 20 similar documents (search time: 46 ms)
1.
Machine emotion is realized by incorporating intelligent agents endowed with affective capabilities. Although the field of human-computer interaction has already produced a large body of results, research on affective computing for intelligent agents is still in its infancy, and pursuing it in depth has significant scientific and practical value for advancing human-computer interaction. This survey selects representative literature retrieved from the Scopus database and focuses on the bidirectional flow of emotion between agent and user, analyzing and summarizing both the agent's perception of the user's emotions and its regulation of them. We first review methods for recognizing user emotion, i.e., analyzing the user's affective state from multi-channel information such as facial expression, speech, posture, physiological signals, and text, and summarize the machine-learning methods used in emotion recognition. We then analyze, from a user-experience perspective, how emotionally expressive agents affect users, survey techniques for generating and expressing agent emotion, and note that beyond facial expressions an agent can convey emotion through non-verbal behaviors such as gaze, posture, head movement, and gesture. We also review typical agent emotion architectures and illustrate the role of reinforcement learning in designing agent emotions. To verify model accuracy, existing affective evaluation methods and metrics are compared. Finally, we identify the open problems that affective computing for agents urgently needs to solve. Summarizing the existing work shows agent affective computing to be a promising research direction, and we hope this survey provides a useful reference for further research.

2.
Affectively intelligent and adaptive car interfaces   Total citations: 1 (self-citations: 0, cited by others: 1)
Fatma Nasoz 《Information Sciences》2010,180(20):3817-3836
In this article, we describe a new approach to enhance driving safety via multi-media technologies by recognizing and adapting to drivers’ emotions with multi-modal intelligent car interfaces. The primary objective of this research was to build an affectively intelligent and adaptive car interface that could facilitate natural communication with its user (i.e., the driver). This objective was achieved by recognizing drivers’ affective states (i.e., emotions experienced by the drivers) and by responding to those emotions by adapting to the current situation via an affective user model created for each individual driver. A controlled experiment was designed and conducted in a virtual reality environment to collect physiological signals (galvanic skin response, heart rate, and temperature) from participants who experienced driving-related emotions and states (neutrality, panic/fear, frustration/anger, and boredom/sleepiness). k-Nearest Neighbor (KNN), Marquardt-Backpropagation (MBP), and Resilient Backpropagation (RBP) algorithms were implemented to analyze the collected signals and to find unique physiological patterns of emotions. RBP was the best of the three classifiers, with 82.6% accuracy, followed by MBP with 73.26% and KNN with 65.33%. Adaptation of the interface was designed to provide multi-modal feedback to the users about their current affective state and to respond to users’ negative emotional states in order to decrease the possible negative impacts of those emotions. A Bayesian Belief Network formalization was employed to develop the user model, enabling the intelligent system to adapt appropriately to the current context and situation by considering user-dependent factors such as personality traits and preferences.
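The k-Nearest-Neighbor step described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the three features stand for galvanic skin response, heart rate, and skin temperature, and all training values are invented.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """train: list of ((gsr, hr, temp), label); return majority label of the k nearest."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Invented physiological samples for the four driving-related states.
train = [
    ((0.20, 62, 33.1), "neutral"),     ((0.30, 65, 33.0), "neutral"),
    ((0.90, 95, 31.2), "panic"),       ((0.80, 98, 31.5), "panic"),
    ((0.60, 88, 32.0), "frustration"), ((0.70, 85, 31.9), "frustration"),
    ((0.10, 55, 33.4), "boredom"),     ((0.15, 57, 33.3), "boredom"),
]
print(knn_classify(train, (0.85, 96, 31.4)))  # → panic
```

In practice the features would be normalized first, since Euclidean distance is otherwise dominated by the heart-rate scale.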

3.
Emotion plays a central role in perception, decision making, logical reasoning, social interaction, and other intelligent activities, making it a key element of human-computer interaction and machine intelligence. In recent years, with the explosive growth of multimedia data and the rapid development of artificial intelligence, affective computing and understanding has attracted broad attention. It aims to endow computing systems with the ability to recognize, understand, express, and adapt to human emotions, thereby building a harmonious human-machine environment and giving computers higher and more comprehensive intelligence. Depending on the input signals, affective computing and understanding spans several research directions. This paper comprehensively reviews decades of progress in multimodal emotion recognition, emotion recognition in autism, affective image content analysis, and facial expression recognition, and looks ahead to future trends. For each direction, we first introduce the research background, problem definition, and significance; we then survey the international and domestic state of the art from multiple angles, including affective data annotation, feature extraction, learning algorithms, performance comparison and analysis of representative methods, and representative research groups; next we systematically compare domestic research with research abroad, analyzing its strengths and weaknesses; finally we discuss open problems and future trends, such as accounting for individual differences in emotional expression and protecting user privacy.

4.
The ability to recognize emotion is one of the hallmarks of emotional intelligence, an aspect of human intelligence that has been argued to be even more important than mathematical and verbal intelligences. This paper proposes that machine intelligence needs to include emotional intelligence and demonstrates results toward this goal: developing a machine's ability to recognize the human affective state given four physiological signals. We describe difficult issues unique to obtaining reliable affective data and collect a large set of data from a subject trying to elicit and experience each of eight emotional states, daily, over multiple weeks. This paper presents and compares multiple algorithms for feature-based recognition of emotional state from this data. We analyze four physiological signals that exhibit problematic day-to-day variations: The features of different emotions on the same day tend to cluster more tightly than do the features of the same emotion on different days. To handle the daily variations, we propose new features and algorithms and compare their performance. We find that the technique of seeding a Fisher Projection with the results of sequential floating forward search improves the performance of the Fisher Projection and provides the highest recognition rates reported to date for classification of affect from physiology: 81 percent recognition accuracy on eight classes of emotion, including neutral.
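The selection idea above can be sketched with a simplified greedy variant: score each feature with a Fisher-style criterion (between-class separation over within-class scatter) and keep the best ones. The paper's actual method, sequential floating forward search seeding a full Fisher Projection, adds and removes features iteratively; the data here is invented for illustration.

```python
def fisher_score(xs_a, xs_b):
    """1-D Fisher criterion: squared mean gap over summed within-class variance."""
    mean = lambda v: sum(v) / len(v)
    var = lambda v: sum((x - mean(v)) ** 2 for x in v) / len(v)
    denom = (var(xs_a) + var(xs_b)) or 1e-12  # guard against zero scatter
    return (mean(xs_a) - mean(xs_b)) ** 2 / denom

def forward_select(class_a, class_b, n_keep=2):
    """Greedily keep the n_keep features with the best individual Fisher scores."""
    n_feat = len(class_a[0])
    scores = []
    for j in range(n_feat):
        s = fisher_score([row[j] for row in class_a], [row[j] for row in class_b])
        scores.append((s, j))
    return sorted(j for _, j in sorted(scores, reverse=True)[:n_keep])

# Invented feature vectors for two emotion classes; feature 1 separates best.
class_a = [(0.10, 5.0, 2.0), (0.20, 5.1, 2.2)]
class_b = [(0.15, 9.0, 2.1), (0.12, 9.2, 2.3)]
print(forward_select(class_a, class_b))  # → [1, 2]
```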

5.
In human–computer interaction (HCI), electroencephalogram (EEG) signals can be added as an additional input to the computer. Integrating real-time EEG-based human emotion recognition algorithms into human–computer interfaces can make the user experience more complete, more engaging, and less or more emotionally stressful, depending on the target of the application. Currently, the most accurate EEG-based emotion recognition algorithms are subject-dependent, and a training session is needed for the user each time, right before running the application. In this paper, we propose a novel real-time subject-dependent algorithm based on the most stable features that gives better accuracy than other available algorithms when it is crucial to have only one training session for the user, with no re-training allowed subsequently. The proposed algorithm is tested on an affective EEG database containing five subjects. For each subject, four emotions (pleasant, happy, frightened, and angry) are induced, and the affective EEG is recorded for two sessions per day over eight consecutive days. Testing results show that the novel algorithm can be used in real-time emotion recognition applications without re-training, with adequate accuracy. The proposed algorithm is integrated with the real-time applications “Emotional Avatar” and “Twin Girls” to monitor the user's emotions in real time.

6.
In this article we describe the use of the mental-states approach, specifically the belief-desire-intention (BDI) model, to implement affective diagnosis in an educational environment. We use the psychological OCC model, which is grounded in the cognitive theory of emotions and can be implemented computationally, to infer the learner's emotions from his actions in the system interface. Our work profits from the reasoning capacity of the BDI model to infer the student's appraisal (the cognitive evaluation that elicits an emotion), which allows us to deduce the student's emotions. The system reasons about an emotion-generating situation and tries to infer the user's emotion using the OCC model. Moreover, the BDI model is well suited to inferring and modeling students' affective states, since emotions are dynamic in nature.
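The appraisal step described above can be sketched as a small rule set: an interface event is evaluated against the agent's beliefs about the student's goals (desires), and an OCC emotion label is deduced. The event names, goal set, and the four rules below are illustrative assumptions, not the paper's actual rules.

```python
def appraise(event, desires):
    """Map an interface event to an OCC-style emotion, given the student's desires."""
    goal, outcome = event            # e.g. ("pass_exercise", "achieved")
    if goal not in desires:
        return "neutral"             # event is irrelevant to the student's goals
    if outcome == "achieved":
        return "joy"                 # desirable event confirmed
    if outcome == "failed":
        return "distress"            # desirable event disconfirmed
    if outcome == "prospective":
        return "hope"                # desirable event merely anticipated
    return "neutral"

# Hypothetical BDI beliefs about what this student desires.
desires = {"pass_exercise", "finish_chapter"}
print(appraise(("pass_exercise", "failed"), desires))  # → distress
```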

7.
A growing body of research suggests that affective computing has many valuable applications in enterprise systems research and e-business. This paper explores affective computing techniques for a vital sub-area of enterprise systems: consumer satisfaction measurement. We propose a linguistic emotion analysis and recognition method for measuring consumer satisfaction. Using an annotated emotion corpus (Ren-CECps), we first present a general evaluation of customer satisfaction by comparing the linguistic characteristics of emotional expressions of positive and negative attitudes. The associations among four negative emotions are further investigated. We then build a fine-grained emotion recognition system based on machine-learning algorithms for measuring customer satisfaction; it can detect and recognize multiple emotions in customers' words or comments. The results indicate that blended emotion recognition can elicit rich feedback data from customers, enabling more appropriate follow-up in customer relationship management.
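The blended-emotion idea above, that one comment may carry several emotions at once, means recognition should return a set of labels rather than a single class. The paper trains on the Ren-CECps corpus; the toy lexicon below is an invented stand-in used only to illustrate the multi-label output shape.

```python
# Hypothetical cue-word lexicon; a real system would learn these associations.
LEXICON = {
    "love": "joy", "great": "joy",
    "refund": "anger", "broken": "anger",
    "waited": "anxiety", "worried": "anxiety",
    "sorry": "sorrow", "sad": "sorrow",
}

def detect_emotions(comment):
    """Return the set of emotion labels whose cue words appear in the comment."""
    words = comment.lower().split()
    return {LEXICON[w] for w in words if w in LEXICON}

print(detect_emotions("I love the screen but it arrived broken"))  # → {'joy', 'anger'}
```

A mixed-emotion comment like this one would be flattened to a single class by a conventional classifier, losing exactly the feedback the paper argues is useful for customer relationship management.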

8.
Research on robot emotion modeling based on personality and the OCC model   Total citations: 1 (self-citations: 3, cited by others: 1)
A robot should possess not only basic mechanical operation and logical reasoning abilities but also human-like emotional capabilities. This paper combines personality with emotion, mood, understanding, and expression, adopts the OCC model as the appraisal standard, and builds an emotion model that conforms to the regularities of human emotion and can be used in affective robots. A virtual-human affective interaction system built on this model verifies that the model simulates human emotion well and can be applied to affective robots, humanized computers, games, and many other domains.

9.
This article provides the first survey of computational models of emotion in reinforcement learning (RL) agents. The survey focuses on agent/robot emotions, and mostly ignores human user emotions. Emotions are recognized as functional in decision-making by influencing motivation and action selection. Therefore, computational emotion models are usually grounded in the agent’s decision making architecture, of which RL is an important subclass. Studying emotions in RL-based agents is useful for three research fields. For machine learning (ML) researchers, emotion models may improve learning efficiency. For the interactive ML and human–robot interaction community, emotions can communicate state and enhance user investment. Lastly, it allows affective modelling researchers to investigate their emotion theories in a successful AI agent class. This survey provides background on emotion theory and RL. It systematically addresses (1) from what underlying dimensions (e.g. homeostasis, appraisal) emotions can be derived and how these can be modelled in RL-agents, (2) what types of emotions have been derived from these dimensions, and (3) how these emotions may either influence the learning efficiency of the agent or be useful as social signals. We also systematically compare evaluation criteria, and draw connections to important RL sub-domains like (intrinsic) motivation and model-based RL. In short, this survey provides both a practical overview for engineers wanting to implement emotions in their RL agents, and identifies challenges and directions for future emotion-RL research.
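One pattern the survey covers, grounding an emotion signal in the agent's decision-making loop, can be sketched by adding a homeostatic "well-being" term (how far an internal variable sits from its setpoint) to the external reward as an intrinsic, emotion-like bonus inside a Q-learning update. All names and numbers below are illustrative assumptions, not any particular surveyed model.

```python
def wellbeing(energy, setpoint=1.0):
    """Valence from homeostasis: zero at the setpoint, negative when depleted."""
    return -abs(setpoint - energy)

def q_update(q, state, action, ext_reward, energy,
             alpha=0.5, gamma=0.9, next_best=0.0):
    """One Q-learning step with an intrinsic, emotion-derived reward term."""
    key = (state, action)
    intrinsic = wellbeing(energy)                    # emotion as reward shaping
    target = ext_reward + intrinsic + gamma * next_best
    q[key] = q.get(key, 0.0) + alpha * (target - q.get(key, 0.0))
    return q[key]

q = {}
print(q_update(q, "hungry", "eat", ext_reward=1.0, energy=0.8))
```

With energy 0.8 the intrinsic term is -0.2, so the update target is 0.8 and the new Q-value is about 0.4.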

10.
How we design and evaluate for emotions depends crucially on what we take emotions to be. In affective computing, affect is often taken to be another kind of information—discrete units or states internal to an individual that can be transmitted in a loss-free manner from people to computational systems and back. While affective computing explicitly challenges the primacy of rationality in cognitivist accounts of human activity, at a deeper level it often relies on and reproduces the same information-processing model of cognition. Drawing on cultural, social, and interactional critiques of cognition which have arisen in human–computer interaction (HCI), as well as anthropological and historical accounts of emotion, we explore an alternative perspective on emotion as interaction: dynamic, culturally mediated, and socially constructed and experienced. We demonstrate how this model leads to new goals for affective systems—instead of sensing and transmitting emotion, systems should support human users in understanding, interpreting, and experiencing emotion in its full complexity and ambiguity. In developing from emotion as objective, externally measurable unit to emotion as experience, evaluation, too, alters focus from externally tracking the circulation of emotional information to co-interpreting emotions as they are made in interaction.

11.
When used as an interface in the context of Ambient Assisted Living (AAL), a social robot should not just provide task-oriented support; it should also try to establish a social, empathic relation with the user. To this end, it is crucial to endow the robot with the capability of recognizing the user's affective state and reasoning on it to trigger the most appropriate communicative behavior. In this paper we describe how such affective reasoning has been implemented in the NAO robot to simulate empathic behaviors in the context of AAL. In particular, the robot recognizes the emotion of the user by analyzing communicative signals extracted from speech and facial expressions. The recognized emotion triggers the robot's affective state and, consequently, the most appropriate empathic behavior. The robot's empathic behaviors have been evaluated both by experts in communication and through a user study assessing the perception and interpretation of empathy by elderly users. Results are quite satisfactory and encourage us to further extend the social and affective capabilities of the robot.

12.
The mystery surrounding emotions, how they work and how they affect our lives, has not yet been unravelled. Scientists still debate the real nature of emotions; evolutionary, physiological, and cognitive accounts are just a few of the approaches used to explain affective states. Regardless of the paradigm, neurologists have made progress in demonstrating that emotion is as important as, or more important than, reason in the process of making decisions and deciding actions. The significance of these findings should not be overlooked in a world increasingly reliant on computers to accommodate user needs. In this paper, a novel approach for recognizing and classifying positive and negative emotional changes in real time using physiological signals is presented. Based on sequential analysis and autoassociative networks, the emotion detection system outlined here is potentially capable of operating on any individual, regardless of physical state and emotional intensity, without requiring an arduous adaptation or pre-analysis phase. Applying this methodology to real-time data collected from a single subject yielded a recognition level of 71.4%, which is comparable to the best results achieved by others through off-line analysis. It is suggested that the detection mechanism outlined in this paper has all the characteristics needed to perform emotion recognition in pervasive computing.

13.
Relaxation training is an application of affective computing with important implications for health and wellness. After detecting the user's affective state through physiological sensors, a relaxation training application can provide the user with explicit feedback about his/her detected affective state. This process (biofeedback) can enable an individual to learn over time how to change his/her physiological activity for the purposes of improving health and performance. In this paper, we provide three contributions to the field of affective computing for health and wellness. First, we propose a novel application for relaxation training that combines ideas from affective computing and games. The game detects the user's level of stress and uses it to influence the affective state and the behavior of a 3D virtual character as a form of embodied feedback. Second, we compare two algorithms for stress detection which follow two different approaches in the affective computing literature: a more practical and less costly approach that uses a single physiological sensor (skin conductance), and a potentially more accurate approach that uses four sensors (skin conductance, heart rate, muscle activity of corrugator supercilii and zygomaticus major). Third, as the central motivation of our research, we aim to improve the traditional methodology employed for comparisons in affective computing studies. To do so, we add to the study a placebo condition in which the user's stress level, unbeknown to him/her, is determined pseudo-randomly instead of taking into account his/her physiological sensor readings. The obtained results show that only the feedback presented by the single-sensor algorithm was perceived as significantly more accurate than the placebo. If the placebo condition had not been included in the study, the effectiveness of the two algorithms would instead have appeared similar. This outcome highlights the importance of using more thorough methodologies in future affective computing studies.
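The placebo control described above can be sketched side by side with a single-sensor detector: the placebo condition simply ignores the sensors and reports a pseudo-random stress level. The threshold, baseline, and units below are invented assumptions for illustration.

```python
import random

def stress_from_scl(scl_microsiemens, baseline=2.0):
    """Single-sensor detector: relative skin-conductance rise above a resting baseline."""
    rise = (scl_microsiemens - baseline) / baseline
    return "high" if rise > 0.25 else "low"

def stress_placebo(rng):
    """Placebo condition: the reported level never consults the sensors."""
    return rng.choice(["high", "low"])

rng = random.Random(7)  # seeded so the pseudo-random placebo is reproducible
print(stress_from_scl(2.8), stress_placebo(rng))
```

Comparing users' perceived feedback accuracy between the two conditions is what reveals whether the detector actually beats chance.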

14.

Recommender systems have become ubiquitous over the last decade, providing users with personalized search results, video streams, news excerpts, and purchasing hints. Human emotions are widely regarded as important predictors of behavior and preference. They are a crucial factor in decision making, but until recently, relatively little has been known about the effectiveness of using human emotions in personalizing real-world recommender systems. In this paper we introduce the Emotion Aware Recommender System (EARS), a large scale system for recommending news items using user’s self-assessed emotional reactions. Our original contribution includes the formulation of a multi-dimensional model of emotions for news item recommendations, introduction of affective item features that can be used to describe recommended items, construction of affective similarity measures, and validation of the EARS on a large corpus of real-world Web traffic. We collect over 13,000,000 page views from 2,700,000 unique users of two news sites and we gather over 160,000 emotional reactions to 85,000 news articles. We discover that incorporating pleasant emotions into collaborative filtering recommendations consistently outperforms all other algorithms. We also find that targeting recommendations by selected emotional reactions presents a promising direction for further research. As an additional contribution we share our experiences in designing and developing a real-world emotion-based recommendation engine, pointing to various challenges posed by the practical aspects of deploying emotion-based recommenders.

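An affective item feature and similarity measure in the spirit of the EARS system above can be sketched as follows: each news item is described by a vector of counts of readers' self-assessed emotional reactions, and items are compared by cosine similarity over those vectors. The reaction categories and counts are assumptions for illustration.

```python
import math

def cosine(u, v):
    """Cosine similarity between two reaction-count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Hypothetical reaction counts per item: (joy, sadness, anger, fear).
item_a = (120, 5, 2, 1)
item_b = (90, 8, 4, 2)
item_c = (3, 60, 80, 40)

# Items with similar emotional profiles score higher than dissimilar ones.
print(cosine(item_a, item_b) > cosine(item_a, item_c))  # → True
```

A collaborative-filtering recommender could then weight neighbors by this affective similarity in addition to rating similarity, in line with the paper's finding that pleasant-emotion signals improve recommendations.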

15.
This research applies an innovative way to measure and identify a user's emotion from color components. Finding an intuitive way to understand human emotion is the key point of this research. The RGB color system, widely used across computer systems, is an additive color system in which red, green, and blue light combine to produce the full range of colors. The study is based on Thayer's emotion model, which classifies emotions along two axes, valence and arousal, and gathers users' color responses as RGB input for computing and forecasting the user's emotion. In the experiment, 320 samples divided into four emotion groups were used to train the weights of a neural network, and 160 samples were used to verify its accuracy. The results show that the model can reliably estimate emotion from a user's color response. The experiment also found that trends in the individual color components, plotted on a Cartesian coordinate system, reveal distinguishing intensities within the RGB color system. Building on this emotion-detection model, an affective-computing intelligence framework is designed with the emotion component embedded in it.
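The mapping the study learns with a neural network can be illustrated schematically: reduce an RGB color response to Thayer's two axes (valence, arousal), then bin the result into one of the four emotion quadrants. The linear weights and quadrant labels below are invented stand-ins for the trained network, not values from the paper.

```python
def thayer_quadrant(r, g, b):
    """Map an RGB response (0-255 per channel) to a Thayer-model quadrant."""
    r, g, b = r / 255, g / 255, b / 255
    valence = 0.6 * g + 0.4 * r - 0.5 * b   # invented: warm/bright reads positive
    arousal = 0.7 * r + 0.3 * b - 0.4 * g   # invented: red/blue intensity reads active
    if valence >= 0.2:
        return "exuberance" if arousal >= 0.2 else "contentment"
    return "anxious" if arousal >= 0.2 else "depression"

print(thayer_quadrant(255, 40, 30))  # bright red → exuberance
print(thayer_quadrant(20, 20, 20))   # dark gray  → depression
```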

16.
Speech is an effective medium for expressing emotions and attitudes through language. Finding the emotional content of a speech signal and identifying the emotions in speech utterances is an important task for researchers. Speech emotion recognition has been considered an important research area over the last decade, and the automated analysis of human affective behaviour has attracted many researchers. A number of systems, algorithms, and classifiers have therefore been developed and outlined for identifying the emotional content of a person's speech. In this study, the available literature on various databases, features, and classifiers for speech emotion recognition across assorted languages is taken into consideration.

17.
Affective e-Learning in residential and pervasive computing environments   Total citations: 2 (self-citations: 1, cited by others: 1)
This article examines how emerging pervasive computing and affective computing technologies might enhance the adoption of ICT in e-Learning that takes place in the home and the wider city environment. In support of this vision we describe two cutting-edge ICT environments which combine to form a holistic, connected future learning environment. The first is the iSpace, a specialized digital-home test-bed representing the kind of high-tech, context-aware home-based learning environment we envisage future learners using; the second is a sophisticated pervasive e-Learning platform that typifies the educational delivery platform our research is targeting. After describing these environments we present our research exploring how emotion evolves during the learning process and how to leverage emotion feedback to provide an adaptive e-Learning system. The motivation driving this work is our desire to improve the educational experience by developing learning systems that recognize and respond appropriately to emotions exhibited by learners. Finally we report results on emotion recognition from physiological signals, which achieved a best-case accuracy of 86.5% for four types of learning emotion. To the best of our knowledge, this is the first report of emotion detection using data collected from close-to-real-world learning sessions. We also report some findings about emotion evolution during learning, which are not yet sufficient to validate Kort's learning spiral model.
Ruimin Shen

18.
With the growth of digital music, music recommendation helps users pick desirable pieces from a huge repository. Existing music recommendation approaches are based on a user's preferences for music; sometimes, however, recommending music according to emotion may better meet users' requirements. In this paper, we propose a novel framework for emotion-based music recommendation. The core of the recommendation framework is the construction of a music emotion model by affinity discovery from film music, which plays an important role in conveying emotion in film. We investigate music feature extraction and propose the Music Affinity Graph and Music Affinity Graph-Plus algorithms for constructing the music emotion model. Experimental results show the proposed emotion-based music recommendation achieves 85% accuracy on average.

19.
This paper deals in depth with some of the emotions that play a role in a group recommender system, which recommends sequences of items to a group of users. First, it describes algorithms to model and predict the satisfaction experienced by individuals. Satisfaction is treated as an affective state. In particular, we model the decay of emotion over time and assimilation effects, where the affective state produced by previous items influences the impact on satisfaction of the next item. We compare the algorithms with each other, and investigate the effect of parameter values by comparing the algorithms’ predictions with the results of an earlier empirical study. We discuss the difficulty of evaluating affective models, and present an experiment in a learning domain to show how some empirical evaluation can be done. Secondly, this paper proposes modifications to the algorithms to deal with the effect on an individual’s satisfaction of that of others in the group. In particular, we model emotional contagion and conformity, and consider the impact of different relationship types. Thirdly, this paper explores the issue of privacy (feeling safe, not accidentally disclosing private tastes to others in the group) which is related to the emotion of embarrassment. It investigates the effect on privacy of different group aggregation strategies and proposes to add a virtual member to the group to further improve privacy.
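The decay and assimilation effects described above can be sketched as a simple recurrence: satisfaction carried over from earlier items decays over time, and each new item's impact is blended toward that carried state before being added. The decay and assimilation factors, and the impact values, are illustrative assumptions rather than the paper's fitted parameters.

```python
def update_satisfaction(prev, item_impact, decay=0.8, assimilation=0.3):
    """New state = decayed previous state + item impact shifted toward that state."""
    carried = decay * prev                                   # emotion decays over time
    assimilated = (1 - assimilation) * item_impact + assimilation * carried
    return carried + assimilated

# Satisfaction of one user across a hypothetical sequence of three items.
s = 0.0
for impact in [0.5, 0.2, -0.4]:
    s = update_satisfaction(s, impact)
print(round(s, 3))
```

Group effects such as contagion could be layered on by mixing in the (decayed) satisfaction of other group members at each step.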

20.
An affective brain-computer interface (aBCI) is a direct communication pathway between the human brain and a computer, via which the computer tries to recognize the affective states of its user and respond accordingly. As aBCI introduces personal affective factors into human-computer interaction, it could potentially enrich the user's experience during the interaction. Successful emotion recognition plays a key role in such a system. State-of-the-art aBCIs leverage machine-learning techniques that consist in acquiring affective electroencephalogram (EEG) signals from the user and calibrating the classifier to the user's affective patterns. Many studies have reported satisfactory recognition accuracy using this paradigm. However, affective neural patterns are volatile over time, even for the same subject, and recognition accuracy cannot be maintained if use of the aBCI is prolonged without recalibration. Existing studies have overlooked evaluating aBCI performance during long-term use. In this paper, we propose SAFE, an EEG dataset for stable affective feature selection. The dataset includes multiple recording sessions spanning several days for each subject, so that the long-term recognition performance of an aBCI can be evaluated. Based on this dataset, we demonstrate that the recognition accuracy of aBCIs deteriorates when recalibration is ruled out during long-term usage. We then propose a stable feature selection method that chooses the most stable affective features, mitigating the accuracy deterioration and maximizing aBCI performance in the long run. We invite other researchers to test their aBCI algorithms on this dataset, and especially to evaluate the long-term performance of their methods.
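The stable-feature idea behind SAFE can be sketched by ranking candidate EEG features according to how consistent their values stay across recording sessions on different days, and keeping the most stable ones. The data layout, the feature names, and the variance-based stability score below are assumptions for illustration, not the paper's method.

```python
def stability(sessions):
    """Negated across-session variance of a feature's per-session mean (higher = stabler)."""
    means = [sum(s) / len(s) for s in sessions]
    grand = sum(means) / len(means)
    return -sum((m - grand) ** 2 for m in means) / len(means)

def most_stable(features, n_keep=1):
    """features: {name: [session1_values, session2_values, ...]}; keep the stablest."""
    ranked = sorted(features, key=lambda f: stability(features[f]), reverse=True)
    return ranked[:n_keep]

# Invented feature values recorded over three sessions on different days.
features = {
    "theta_power": [[0.50, 0.52], [0.51, 0.49], [0.50, 0.50]],  # drifts little
    "gamma_power": [[0.20, 0.22], [0.60, 0.58], [0.40, 0.45]],  # drifts day to day
}
print(most_stable(features))  # → ['theta_power']
```

A classifier calibrated once on such stable features should degrade less when recalibration is ruled out, which is exactly the long-term scenario the dataset is built to evaluate.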


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号