Facial expression recognition and tracking for intelligent human-robot interaction

Authors: Y. Yang, S. S. Ge, T. H. Lee, C. Wang

Affiliation: (1) Social Robotics Lab, Interactive Digital Media Institute and Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117576, Singapore

Abstract: For effective interaction between humans and socially adept, intelligent service robots, a key capability required by this class of sociable robots is the successful interpretation of visual data. Beyond crucial techniques such as human face detection and recognition, an important next step toward intelligence and empathy in social robots is emotion recognition. In this paper, an automated and interactive computer vision system is investigated for human facial expression recognition and tracking based on facial structure features and movement information. Twenty facial features are adopted because they are informative and prominent, reducing ambiguity during classification. An unsupervised learning algorithm, distributed locally linear embedding (DLLE), is introduced to recover the inherent properties of scattered data lying on a manifold embedded in the high-dimensional space of the input facial images. Selected person-dependent facial expression images in a video are classified using DLLE. In addition, facial expression motion energy is introduced to describe the tension of the facial muscles during expressions, enabling person-independent recognition. This method takes advantage of optical flow, which tracks the movement of the feature points. Finally, experimental results show that our approach is able to separate different expressions successfully.
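The DLLE variant itself is not detailed in this abstract, so the sketch below uses scikit-learn's standard locally linear embedding (LLE) as an illustrative stand-in for the person-dependent pipeline described above: flatten facial images, recover a low-dimensional manifold embedding, and classify expressions by proximity in the embedded space. The synthetic data, neighbour counts, and nearest-neighbour classifier are assumptions for illustration only, not the paper's configuration.

```python
# Illustrative sketch only: standard LLE stands in for the paper's DLLE.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Stand-in data: 120 "face images" of 32x32 pixels with 6 expression labels.
# In the paper these would be cropped facial images taken from a video.
X = rng.random((120, 32 * 32))
y = rng.integers(0, 6, size=120)

# Recover a low-dimensional embedding of the image manifold.
embedder = LocallyLinearEmbedding(n_neighbors=12, n_components=3)
Z = embedder.fit_transform(X)

# Classify expressions by proximity in the embedded space.
clf = KNeighborsClassifier(n_neighbors=5).fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```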
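For the person-independent side, the abstract mentions optical-flow tracking of feature-point movement. Below is a minimal sketch assuming OpenCV's pyramidal Lucas-Kanade tracker (cv2.calcOpticalFlowPyrLK); the paper's exact motion-energy formula is not given here, so the sum of squared point displacements is only a hypothetical proxy for the muscle-tension measure it describes.

```python
# Hedged sketch: LK optical flow follows facial feature points across frames;
# the displacement-based "energy" below is an illustrative proxy only.
import cv2
import numpy as np

def track_motion_energy(prev_gray, next_gray, points):
    """Track points from prev_gray to next_gray; return surviving points
    and a simple displacement-based energy value (proxy measure)."""
    pts = points.reshape(-1, 1, 2).astype(np.float32)
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts, None,
        winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    old, new = pts[ok].reshape(-1, 2), new_pts[ok].reshape(-1, 2)
    energy = float(np.sum((new - old) ** 2))  # proxy, not the paper's formula
    return new, energy

# Usage with two synthetic frames and 20 feature points (the paper adopts
# 20 prominent facial features).
rng = np.random.default_rng(1)
frame0 = rng.integers(0, 255, (240, 320), dtype=np.uint8)
frame1 = np.roll(frame0, 2, axis=1)  # simulate a small horizontal shift
pts0 = rng.uniform([20, 20], [300, 220], size=(20, 2)).astype(np.float32)
tracked, e = track_motion_energy(frame0, frame1, pts0)
print(f"tracked {len(tracked)} points, motion energy ~ {e:.1f}")
```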

Keywords: Human–robot interaction; Facial expression recognition; Affective computing; Distributed locally linear embedding; Facial expression motion energy