A versatile interaction framework for robot programming based on hand gestures and poses
Affiliation: 1. State Key Laboratory of Mechanical Transmission, Chongqing University, Chongqing, 400044, China; 2. State Key Laboratory of Public Big Data, Guizhou University, Guiyang, 550025, China
Abstract: This paper proposes a framework for industrial and collaborative robot programming based on the integration of hand gestures and poses. The framework allows operators to control the robot via both End-Effector (EE) and joint movements and to transfer compound shapes to the robot accurately. Seventeen hand gestures, covering position and orientation control of the robotic EE as well as auxiliary operations, are designed according to cognitive psychology. Gestures are classified by a deep neural network that is pre-trained for two-hand pose estimation and fine-tuned on a custom dataset, achieving a test accuracy of 99%. The index finger's pointing direction and the hand's orientation are extracted via 3D hand pose estimation to indicate the robotic EE's moving direction and orientation, respectively. The number of stretched fingers is detected via two-hand pose estimation to represent decimal digits for selecting robot joints and inputting numbers. Finally, these three interaction modes are integrated seamlessly into a single programming framework. We conducted two interaction experiments. The reaction time of the proposed hand gestures in indicating randomly given instructions is significantly shorter than that of other gesture sets, such as American Sign Language (ASL). The accuracy of our method in compound shape reconstruction is much higher than that of hand-movement-trajectory-based methods, and the operating time is comparable to that of teach pendants.
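The two geometric cues described in the abstract can be illustrated with a short sketch. The Python example below is not the paper's implementation: it assumes MediaPipe-style 21-landmark 3D hand keypoints, and the landmark indices, the jitter margin, and the function names are illustrative assumptions. It derives the index finger's pointing direction as a unit vector and counts stretched fingers with a simple wrist-distance heuristic.

import numpy as np

# MediaPipe-style 21-landmark hand indexing (assumption; the paper's own
# pose-estimation network and landmark layout are not specified here).
WRIST = 0
FINGER_TIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"thumb": 3, "index": 6, "middle": 10, "ring": 14, "pinky": 18}
INDEX_MCP, INDEX_TIP = 5, 8

def pointing_direction(landmarks: np.ndarray) -> np.ndarray:
    """Unit vector from the index MCP joint to the index fingertip.

    landmarks: (21, 3) array of 3D hand keypoints for one hand.
    """
    v = landmarks[INDEX_TIP] - landmarks[INDEX_MCP]
    return v / np.linalg.norm(v)

def count_stretched_fingers(landmarks: np.ndarray) -> int:
    """Count extended fingers with a distance heuristic: a finger is
    'stretched' if its tip lies farther from the wrist than its PIP
    joint (IP joint, for the thumb) by a small margin.
    """
    wrist = landmarks[WRIST]
    count = 0
    for name, tip in FINGER_TIPS.items():
        tip_dist = np.linalg.norm(landmarks[tip] - wrist)
        pip_dist = np.linalg.norm(landmarks[FINGER_PIPS[name]] - wrist)
        if tip_dist > 1.1 * pip_dist:  # 10% margin suppresses jitter (assumed value)
            count += 1
    return count

# Hypothetical usage, given (21, 3) landmark arrays from any 3D estimator:
#   direction = pointing_direction(right_hand)
#   digit = count_stretched_fingers(left_hand) + count_stretched_fingers(right_hand)

Summing the count over both detected hands yields a value in 0 to 10; the paper's exact mapping from two-hand finger counts to decimal digits, and its pose-estimation backbone, are given in the full text rather than sketched here.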
Keywords: Robot programming; Human–robot interaction; Hand gesture dataset; Hand gesture recognition; Hand pose estimation
This document is indexed in ScienceDirect and other databases.