Teeth recognition based on multiple attempts in mobile device
Affiliation: 1. Department of Mechanical Engineering/Faculty of Engineering and School of Dentistry/Faculty of Medicine and Dentistry, University of Alberta, Edmonton, Alberta, Canada; 2. Department of Civil Engineering/Faculty of Engineering, University of Alberta, Edmonton, Alberta, Canada; 3. Centre for Oral, Clinical & Translational Sciences, King's College London, London, United Kingdom; 1. Department of Mathematics, National Institute of Technology Puducherry, Karaikal 609609, India; 2. Department of Mathematics, Faculty of Arts and Sciences, Ondokuz Mayis University, Atakum, 55200 Samsun, Turkey
Abstract: Most traditional biometric approaches use a single image for personal identification. In practical environments, however, such approaches sometimes fail to recognize users because the subject is falsely detected or not detected at all. This paper therefore proposes a novel recognition approach based on multiple frame images, implemented on mobile devices. The aim is to improve recognition accuracy and reduce computational complexity through multiple attempts, where "multiple attempts" means that multiple frame images are used during the recognition procedure. From the sequential frame images, an adequate subject, i.e., a teeth image, is chosen by a subject-selection module that operates on differential image entropy. The selected subject is then used as a biometric trait by traditional recognition algorithms, including PCA, LDA, and EHMM. The proposed method is evaluated on two teeth databases acquired with a mobile device. The experimental results confirm that the proposed method improves recognition accuracy by about 3.6–4.8% and offers lower computational complexity than traditional biometric approaches.
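The subject-selection step described above can be illustrated with a short sketch. The paper does not give the exact formulation of its differential image entropy criterion, so the following is only an illustrative assumption: each frame's difference against the mean frame of the sequence is histogrammed, the Shannon entropy of that difference image is computed, and the frame with the lowest differential entropy (i.e., closest to the sequence's consensus appearance) is selected. The function names and the min-entropy selection rule are hypothetical, not taken from the paper.

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of a grayscale image's intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

def select_frame(frames):
    """Pick the frame whose differential image entropy is lowest.

    Illustrative criterion only: the difference is taken against the
    mean frame, and a low-entropy difference image is read as the frame
    being close to the sequence's typical (well-detected) appearance.
    Returns the selected index and the per-frame entropies.
    """
    frames = np.asarray(frames, dtype=np.float64)
    mean_frame = frames.mean(axis=0)
    entropies = [image_entropy(np.abs(f - mean_frame)) for f in frames]
    return int(np.argmin(entropies)), entropies
```

In this sketch a noisy, false-detected frame produces a widely spread difference image and hence a high differential entropy, so it is passed over in favor of a frame that agrees with the rest of the sequence.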
This article is indexed in ScienceDirect and other databases.