Similar Documents
20 similar documents retrieved.
1.
This paper presents a parallel real-time framework for extracting and recognizing emotions and mental states from video fragments of human movements. In the experimental setup, human hands are tracked by evaluating moving skin-colored objects. The tracking analysis demonstrates that the acceleration and frequency characteristics of the tracked objects are relevant for classifying the emotional expressiveness of human movements. The outcomes of the emotional and mental state recognition are cross-validated against the analyses of two independent certified movement analysts (CMAs) who use the Laban movement analysis (LMA) method. We argue that LMA-based computer analysis can serve as a common language for expressing and interpreting emotional movements between robots and humans, and in that way it resembles the common coding principle between action and perception in humans and primates that is embodied by the mirror neuron system. The solution is part of a larger project on interaction between a human and a humanoid robot, aimed at teaching social behavioral skills to autistic children with robots acting in a natural environment.

2.
The automatic tendency to anthropomorphize our interaction partners, and to make use of experience acquired in earlier interaction scenarios, suggests that social interaction with humanoid robots is more pleasant and intuitive than that with industrial robots. An objective method for evaluating the quality of human–robot interaction is based on the phenomenon of motor interference (MI): face-to-face observation of a different (incongruent) movement of another individual leads to a higher variance in one's own movement trajectory. In social interaction, MI is a consequence of the tendency to imitate the movements of other individuals and goes along with mutual rapport, a sense of togetherness, and sympathy. Although MI occurs while observing a human agent, it disappears in the case of an industrial robot moving with piecewise constant velocity. Using a robot with human-like appearance, a recent study revealed that its movements led to MI only if they were based on a human prerecording (biological velocity profile), not on a constant (artificial) velocity profile. However, it remained unclear which aspect of the prerecorded human movement triggered MI: the biological velocity profile or the variability in the movement trajectory. To investigate this issue, we applied a quasi-biological minimum-jerk velocity profile (excluding variability in the movement trajectory as an influencing factor of MI) to the motion of a humanoid robot, which was observed by subjects performing congruent or incongruent arm movements. The increase in variability of the subjects' movements occurred both for the observation of a human agent and for the robot performing incongruent movements, suggesting that an artificial human-like movement velocity profile is sufficient to facilitate the perception of humanoid robots as interaction partners.
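For context, the minimum-jerk profile mentioned above is a standard form in motor control: a bell-shaped velocity curve that starts and ends at zero velocity. The sketch below is illustrative only, not code from the study; function and parameter names are assumptions.

```python
import numpy as np

def minimum_jerk(x0, xf, duration, n_samples=200):
    """Position and velocity of a point-to-point movement with a minimum-jerk profile."""
    t = np.linspace(0.0, duration, n_samples)
    tau = t / duration  # normalized time in [0, 1]
    pos = x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    vel = (xf - x0) / duration * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
    return t, pos, vel

# Example: a 0.4 m arm movement executed over 1.2 s (illustrative values).
t, pos, vel = minimum_jerk(0.0, 0.4, 1.2)
```

Driving a robot joint or end-effector along such a profile reproduces the velocity shape of human reaching while keeping the spatial path perfectly repeatable, which is the property that lets the study separate velocity profile from trajectory variability.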

3.
Embodied robots are known to be preferable, in most cases, to their virtual agents for interaction with human subjects and for eliciting performance from them. This study compared the efficacy of an embodied robot coach and its virtual agent in involving preschool children in the performance of play-like motor tasks. The robot or its virtual agent demonstrated movements, asked the children to repeat them, and provided positive feedback on their performance. The difficulty of the motor tasks was increased over the course of the session. Two groups of children were studied, one with and one without previous experience of interaction with the embodied robot. In the experienced group, involvement in the motor tasks was successfully induced by both the embodied robot and its virtual agent, but the children interacted less well with the virtual agent than with the embodied robot. Children in the inexperienced group did not interact with the virtual agent at all during the experiment. Because the participants were preschool children in their natural environment, this study proposes the combined use of an embodied robot and its virtual agent for promoting motor involvement.

4.
Research on humanoid robots has produced various uses for their body properties in communication. In particular, mutual relationships of body movements between a robot and a human are considered important for smooth and natural communication, as they are in human–human communication. We have developed a semi-autonomous humanoid robot system that is capable of cooperative body movements with humans, using environment-based sensors and switching communicative units. Concretely, this system realizes natural communication by using typical behaviors such as “nodding,” “eye contact,” and “face-to-face” orientation. It is important to note that the robot parts are NOT operated directly; only the communicative units in the robot system are switched. We conducted an experiment using this robot system and verified the importance of cooperative behaviors in a route-guidance situation where a human gives directions to the robot. The task requires a human participant (the “speaker”) to teach a route to a “hearer” that is (1) a human, (2) the developed robot performing cooperative movements, or (3) a robot that does not move at all. The experiment is evaluated subjectively through a questionnaire and through an analysis of body movements using three-dimensional data from a motion capture system. The results indicate that the cooperative body movements greatly enhance the emotional impressions of human speakers in a route-guidance situation. We believe these results will allow us to develop interactive humanoid robots that communicate sociably with humans.

5.
This paper deals with spontaneous behavior for cooperation through interaction in a distributed autonomous robot system. Although a human gives the robots evaluation functions for the cooperative relations among them, each robot decides its behavior depending on its environment, its experience, and the behavior of the other robots. Each robot acquires a model of the other robots' behavior through learning. Taking inspiration from biological systems, the robots' behaviors can be interpreted as emotional by an observer of the system. In psychology, emotions have been considered to play important roles in the generation of motivation and in behavior selection. In this paper, the robots' behaviors are interpreted as follows: each robot feels frustration when its behavioral decision does not fit its environment, and it then changes its behavior to alter its situation actively and spontaneously. The results show the potential of intelligent behavior driven by emotions. This work was presented, in part, at the International Symposium on Artificial Life and Robotics, Oita, Japan, February 18–20, 1996.

6.
An attentive robot needs to exhibit a plethora of different visual behaviors, including free viewing, detecting visual onsets, searching, remaining fixated, and tracking, depending on the vision task at hand. The robot's associated camera movements—ranging from saccades to smooth pursuit—direct its optical axis in a manner that depends on the current visual behavior. This paper proposes a closed-loop dynamical-systems approach to the generation of camera movements based on a family of artificial potential functions. Each movement from the current fixation point to the next is associated with an artificial potential function that encodes saliency and, possibly, inhibition, depending on the visual behavior the robot is engaged in. The novelty of this approach is that, since the nature of the resulting motion can vary from saccadic to smooth pursuit, the full repertoire of visual behaviors becomes possible within the same framework. The robot can switch its visual behavior simply by changing the parameters of the constructed artificial potential functions appropriately. Furthermore, automated reflexive changes among the different visual behaviors can be achieved via a simple switching automaton. Experimental results with the APES robot demonstrate the performance of the robot engaged in each of the different visual behaviors.
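The abstract does not give the form of the potential functions, so the following is only a schematic sketch of the idea: the fixation point descends the gradient of a potential that combines attraction toward salient points with an optional inhibition-of-return term, and the step gain controls whether the resulting motion looks saccadic or pursuit-like. All names and parameters are assumptions.

```python
import numpy as np

def gaze_step(gaze, salient_points, weights, inhibited_points=(), k_inhibit=0.0, gain=0.5):
    """One closed-loop update of the camera fixation point on the image plane."""
    grad = np.zeros(2)
    for p, w in zip(salient_points, weights):
        grad += w * (gaze - p)                                   # quadratic attraction to saliency
    for q in inhibited_points:
        d = gaze - q
        grad += -k_inhibit * d / (np.linalg.norm(d)**3 + 1e-9)   # repulsion from inhibited locations
    return gaze - gain * grad                                    # gradient descent on the potential

# A large gain gives saccade-like jumps; a small gain gives smooth-pursuit-like drift.
gaze = np.array([0.0, 0.0])
for _ in range(50):
    gaze = gaze_step(gaze, [np.array([1.0, 0.5])], [1.0], gain=0.1)
```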

7.
The design of humanoid robots’ emotional behaviors has attracted many scholars’ attention. However, users’ emotional responses to humanoid robots’ emotional behaviors, which differ from robots’ traditional behaviors, are not yet well understood. This study investigates the effect of a humanoid robot’s emotional behaviors on users’ emotional responses using subjective reporting, pupillometry, and electroencephalography. Five categories of the humanoid robot’s emotional behaviors, expressing joy, fear, neutrality, sadness, or anger, were designed, selected, and presented to users. Results show that users have a significant positive emotional response to the humanoid robot’s joy behavior and a significant negative emotional response to its sadness behavior, as indicated by reported valence and arousal, pupil diameter, frontal-midline relative theta power, and the frontal alpha asymmetry score. The results suggest that a humanoid robot’s emotional behaviors can evoke significant emotional responses in users, and that this evocation may be related to the recognition of these emotional behaviors. In addition, the study provides a multimodal physiological method for evaluating users’ emotional responses to a humanoid robot’s emotional behaviors.
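For reference, the two EEG measures named above are typically derived from band-power estimates: frontal alpha asymmetry is the difference of log alpha power between right and left frontal electrodes, and frontal-midline relative theta is theta power normalized by broadband power. A minimal sketch follows; channel names, sampling rate, and band limits are assumptions, not details from the study.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Power of signal x in the [lo, hi] Hz band, estimated with Welch's method."""
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    band = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[band], freqs[band])

def frontal_metrics(f3, f4, fz, fs=256):
    """Frontal alpha asymmetry (ln F4 alpha - ln F3 alpha) and frontal-midline relative theta."""
    faa = np.log(band_power(f4, fs, 8, 13)) - np.log(band_power(f3, fs, 8, 13))
    rel_theta = band_power(fz, fs, 4, 8) / band_power(fz, fs, 1, 40)
    return faa, rel_theta
```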

8.
Robots have been envisaged as both workers and partners of humans from the earliest period of their history. Robots should therefore become artificial entities that can interact socially with human beings in social communities. Recent advances in technology have added various functions to robots. The development of actuators and grippers shows infinite possibilities for factory automation, and robots can now walk and perform very smoothly. All of these functions have been developed as solutions for improving robot movement and performance. However, many problems remain in the communication between robots and humans. Communication robots provide one approach to the realization of embodied interfaces. Furthermore, the unsolved problems of human–robot communication can be clarified by adopting the concept of subtractive methods. In this article, we consider the minimal design of robots from the viewpoint of designing communication. By minimal design, we mean eliminating the nonessential portions and keeping only the most fundamental functions. We expect that the simple and clean nature of minimally designed objects will allow humans to interact with these robots without losing interest too quickly. By exploiting the fact that humans have “a natural dislike for the absence of reasoning,” artificial entities built according to minimal design principles can draw out the human drive to relate to others. We propose a method of designing a robot that has “character” and is situated in a social context, from the viewpoint of minimal design. This work was presented in part at the 10th International Symposium on Artificial Life and Robotics, Oita, Japan, February 4–6, 2005.

9.
Investigation into robot-assisted intervention for children with autism spectrum disorder (ASD) has gained momentum in recent years. Therapists involved in interventions must overcome the communication impairments generally exhibited by children with ASD by adeptly inferring the children's affective cues and adjusting the intervention accordingly. Similarly, a robot must also be able to understand the affective needs of these children—an ability that current robot-assisted ASD intervention systems lack—to achieve effective interaction that addresses the role of affective states in human–robot interaction and intervention practice. In this paper, we present a physiology-based affect-inference mechanism for robot-assisted intervention in which the robot can detect the affective states of a child with ASD, as discerned by a therapist, and adapt its behaviors accordingly. This paper is the first step toward developing “understanding” robots for use in future ASD intervention. Experimental results from a proof-of-concept experiment (a robot-based basketball game) with six children with ASD are presented. The robot learned each child's individual liking level with regard to the game configuration and selected appropriate behaviors to present the task at his/her preferred liking level. Results show that the robot automatically predicted individual liking level in real time with 81.1% accuracy. This is the first time, to our knowledge, that the affective states of children with ASD have been detected via a physiology-based affect recognition technique in real time. This is also the first time that the impact of affect-sensitive closed-loop interaction between a robot and a child with ASD has been demonstrated experimentally.
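The abstract does not specify the affect-recognition model, so the snippet below is only a generic illustration of the pipeline it implies (physiological features in, a predicted liking level out). The feature set, classifier choice, and data are placeholders, not the study's method.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Placeholder data: one row per game segment, with features such as mean heart rate,
# skin-conductance response rate, and muscle activity; labels are liking levels 0-2.
rng = np.random.default_rng(0)
X = rng.random((120, 4))
y = rng.integers(0, 3, size=120)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)  # the study trained individual models per child
print(f"cross-validated accuracy: {100 * scores.mean():.1f}%")
```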

10.
Recent experiments have shown that brain electrical activity can be used to directly control the movement of robots. Such a brain–computer interface is a natural way to augment human capabilities by providing a new interaction link with the outside world, and it is particularly relevant as an aid for paralysed humans, although it also opens up new possibilities in human–robot interaction for able-bodied people. One of these new fields of application is the use of brain–computer interfaces in the space environment, where astronauts are subject to extreme conditions and could greatly benefit from direct mental teleoperation of external semi-automatic manipulators; for instance, mental commands could be sent without the output/latency delays that affect manual control in microgravity conditions. Previous studies show that there is considerable potential for this technology onboard spacecraft.

11.
If we are to achieve natural human–robot interaction, we may need to complement current vision and speech interfaces. Touch may provide us with an extra tool in this quest. In this paper we demonstrate the role of touch in interaction between a robot and a human. We show how infrared sensors located on robots can easily be used to detect and distinguish human interaction, in this case interaction with individual children. This application of infrared sensors potentially has many uses, for example in entertainment or service robotics. The system could also benefit therapy or rehabilitation, where the observation and recording of movement and interaction is important. In the long term, this technique might enable robots to adapt to individuals or to individual types of user.

12.
Service robots have been developed to assist nurses in routine patient services. Prior research has recognized that patients' emotional experiences with robots may be as important as robot task performance for user acceptance and assessments of effectiveness. The objective of this study was to understand the effect of different service robot interface features on elderly users' perceptions and emotional responses in a simulated medicine delivery task. Twenty-four participants sat in a simulated patient room while a service robot delivered a bag of “medicine” to them. Repeated trials were used to present variations on three robot features: facial configuration, voice messaging, and interactivity. Participant heart rate (HR) and galvanic skin response (GSR) were collected. Participant ratings of robot humanness [perceived anthropomorphism (PA)] were collected post-trial, along with subjective ratings of arousal (bored–excited) and valence (unhappy–happy) using the self-assessment manikin (SAM) questionnaire. Results indicated that the presence of all three types of robot features promoted higher PA, arousal, and valence compared to a control condition (a robot without any of the features). Participants' physiological responses varied with events in their interaction with the robot. The three types of features also differed in their utility for stimulating participant arousal and valence, as well as physiological responses. In general, results indicated that adding anthropomorphic and interactive features to service robots promoted positive emotional responses [increased excitement (GSR) and happiness (HR)] in elderly users. It is expected that the results of this study could serve as a basis for developing affective robot interface design guidelines that promote positive user emotional experiences.

13.
The present study investigates how children from two different cultural backgrounds (Pakistani, Dutch) and two different age groups (8- and 12-year-olds) experience interacting with a social robot (iCat) during collaborative game play. We propose a new method to evaluate children’s interaction with such a robot, asking whether playing a game with a state-of-the-art social robot like the iCat is more similar to playing the game alone or with a friend. A combination of self-report scores, perception test results, and behavioral analyses indicates that child–robot interaction in game-playing situations is highly appreciated by children, although more so by Pakistani and younger children than by Dutch and older children. Results also suggest that children enjoyed playing with the robot more than playing alone, but enjoyed playing with a friend even more. In a similar vein, we found that children were more expressive in their non-verbal behavior when playing with the robot than when playing alone, but less expressive than when playing with a friend. Our results not only stress the importance of using new benchmarks for evaluating child–robot interaction, but also highlight the significance of cultural differences for the design of social robots.

14.
Children with ASD are characterised by a lack of intentionality. We analysed nonverbal and verbal information, associated with heart rate and reported emotional feeling respectively, in ASD and neurotypical children. Heart-rate responses of the ASD and neurotypical children were similar when the human was the ‘passive’ actor and the robot the ‘active’ actor; they diverged when the human was the ‘active’ actor. Only the ASD children reported better emotional feeling ‘after’ than ‘before’ the interaction with the robot. These results suggest that children with ASD might be more receptive to the low-level intentionality represented by robots than to the high-level intentionality associated with humans.

15.
Due to the safety requirements for Human-Robot Interaction (HRI), industrial robots have to meet high safety standards (ISO 10218). However, even if robots are incapable of causing serious physical harm, they may still influence people's mental and emotional wellbeing, as well as their trust, behaviour, and performance in close collaboration. This work uses an HTC Vive Virtual Reality headset to study the potential of robot control strategies to positively influence human post-accident behaviour. In the designed scenario, a virtual industrial robot first makes sudden unexpected movements, after which it either does or does not attempt to apologise for them. The results show that after the robot tries to communicate with the participants, it is reported to be less scary, more predictable, and easier to work with. Furthermore, postural analysis shows that the participants who were most affected by the robot's sudden movement recover 74% of their postural displacement within 60 s after the event if the robot apologised, and only 34% if it did not. It is concluded that apologies, which are commonly used as a trust-recovery strategy in social robotics, can also positively influence people engaged with industrial robots. Relevance to industry: the findings can be used as guidelines for designing robot behaviour and trust-recovery control strategies meant to speed up human recovery after a trust-violating event in industrial Human-Robot Interaction.
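As an illustration of the recovery measure reported above (the paper's exact definition is not given in the abstract, so this formulation is an assumption), the fraction of peak postural displacement recovered within a fixed window after the event could be computed as follows:

```python
import numpy as np

def recovery_fraction(time_s, displacement, event_t, window_s=60.0):
    """Fraction of the peak postural displacement recovered within window_s after an event."""
    in_window = (time_s >= event_t) & (time_s <= event_t + window_s)
    d = np.abs(displacement[in_window])
    residual = d[-1]                   # displacement remaining at the end of the window
    return 1.0 - residual / d.max()    # e.g. 0.74 corresponds to 74% recovery
```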

16.
It is remarkable how much robotics research is promoted by appealing to the idea that the only way to deal with a looming demographic crisis is to develop robots to look after older persons. This paper surveys and assesses the claims made on behalf of robots in relation to their capacity to meet the needs of older persons. We consider each of the roles that has been suggested for robots in aged care and attempt to evaluate how successful robots might be in these roles. We do so from the perspective of writers concerned primarily with the quality of aged care, paying particular attention to the social and ethical implications of the introduction of robots, rather than from the perspective of robotics, engineering, or computer science. We emphasise the importance of the social and emotional needs of older persons—which, we argue, robots are incapable of meeting—in almost any task involved in their care. Even if robots were to become capable of filling some service roles in the aged-care sector, economic pressures on the sector would most likely ensure that the result was a decrease in the amount of human contact experienced by older persons being cared for, which itself would be detrimental to their well-being. This means that the prospects for the ethical use of robots in the aged-care sector are far more limited than they first appear. More controversially, we believe that it is not only misguided, but actually unethical, to attempt to substitute robot simulacra for genuine social interaction. A subsidiary goal of this paper is to draw attention to the discourse about aged care and robotics and to locate it in the context of broader social attitudes towards older persons. We conclude by proposing a deliberative process involving older persons as a test for the ethics of the use of robots in aged care. We dedicate this paper to the memory of Jean Woodroffe, whose strength and courage at the end of her life journey inspired the authors’ interest in aged-care issues.

17.
Advanced Robotics, 2013, 27(18): 2233–2254
Robots are increasingly being used in domestic environments and should be able to interact with inexperienced users. Findings from human–human interaction and human–computer interaction research are relevant, but often of limited use because robots are different from both humans and computers. Therefore, new human–robot interaction (HRI) research methods can inform the design of robots suitable for inexperienced users. A video-based HRI (VHRI) methodology was used here to carry out a multi-national HRI user study of the prototype domestic robot BIRON (BIelefeld RObot companioN). Previously, the VHRI methodology had been used in constrained HRI situations, whereas in this study the HRIs involved a series of events as part of a 'home tour' scenario. Thus, the present work is the first study of this methodology in extended HRI contexts with a multi-national approach. Participants watched videos of the robot interacting with a human actor and rated two robot behaviors (Extrovert and Introvert). Participants' perceptions and ratings of the robot's behaviors differed with regard to both the verbal interactions and the robot's person following. The study also confirms that the VHRI methodology provides a valuable means of obtaining early user feedback, even before fully working prototypes are available. This can usefully guide future design work on robots and their associated verbal and non-verbal behaviors.

18.
This article describes a methodology, together with an associated series of experiments employing it, for the evolution of walking behavior in a simulated humanoid robot with up to 20 degrees of freedom. The robots evolved in this study learn to walk smoothly in an upright or near-upright position and demonstrate a variety of different locomotive behaviors, including “skating,” “limping,” and walking in a manner curiously reminiscent of a mildly or heavily intoxicated person. A previous study demonstrated the potential utility of this approach while evolving controllers for simulated humanoid robots with a restricted range of movements. Although walking behaviors were developed, these were slow and relied on the robot walking in an excessively stooped position, similar to the gait of an infirm elderly person. This article extends the previous work to a robot with many degrees of freedom, up to 20 in total (arms, elbows, legs, hips, knees, etc.), and demonstrates the automatic evolution of fully upright bipedal locomotion in a humanoid robot using an accurate physics simulator. This work was presented in part at the 11th International Symposium on Artificial Life and Robotics, Oita, Japan, January 23–25, 2006.

19.
As robots move into more human-centric environments, we require methods to develop robots that can interact naturally with humans. Doing so requires testing in the real world and addressing multidisciplinary challenges. Our research is focused on child–robot interaction, which includes very young children, for example toddlers, and children diagnosed with autism. More traditional forms of human–robot communication, such as speech or gesture recognition, may not be appropriate with these users, whereas touch may provide a more natural and appropriate means of communication in such instances. In this paper, we present our findings on these topics, obtained from a project involving a spherical robot that acquires information about natural touch by analysing sensory patterns over time. More specifically, from this project we derive important factors for future consideration, describe our iterative experimental methodology of testing in and out of the ‘wild’ (lab-based and real-world), and outline the discoveries made by doing so.

20.
Robotic technology is evolving quickly, allowing robots to perform more complex tasks in less structured environments with more flexibility and autonomy. Heterogeneous multi-robot teams are becoming more common as the specialized abilities of individual robots are used in concert to achieve tasks more effectively and efficiently. An important area of research is the use of robot teams to perform modular assemblies. To this end, this paper analyzed the relative performance of two robots with different morphologies and attributes in performing an assembly task autonomously under different coordination schemes, using force sensing through a control basis approach. A rigid, point-to-point manipulator and a dual-armed, pneumatically actuated humanoid robot performed the assembly of parts under a traditional “push-hold” coordination scheme and a human-mimicking “push-push” scheme. The study revealed that the scheme with the higher level of cooperation—the “push-push” scheme—performed assemblies faster and more reliably, lowering the likelihood of stiction, jamming, and wedging. The study also revealed that in “push-hold” schemes industrial robots are better pushers and compliant robots are better holders. The results of our study affirm the use of heterogeneous robots to perform hard-to-do assemblies and also encourage humans to function as holders when working in concert with a robot assistant on insertion tasks.
