A deep learning-enabled human-cyber-physical fusion method towards human-robot collaborative assembly
Affiliation:1. School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an 710049, China;2. State Key Laboratory for Manufacturing Systems Engineering, Xi'an Jiaotong University, Xi'an 710054, China;3. School of Science, Xi'an University of Architecture and Technology, Xi'an 710055, China
Abstract:Human-robot collaborative (HRC) assembly has become popular in recent years. It combines the strength, repeatability, and accuracy of robots with the high-level cognition, flexibility, and adaptability of humans to achieve an ergonomic working environment with better overall productivity. However, HRC assembly is still in its infancy, and ensuring its safety and efficiency while reducing assembly failures caused by human error remains challenging. To address these challenges, this paper proposes a novel human-cyber-physical assembly system (HCPaS) framework, which combines the powerful perception and control capacity of digital twins with the virtual-reality interaction capacity of augmented reality (AR) to achieve a safe and efficient HRC environment. Based on this framework, a deep learning-enabled fusion method for HCPaS is proposed at two levels: robot-level fusion and part-level fusion. Robot-level fusion perceives robot pose by combining PointNet with the iterative closest point (ICP) algorithm, so that the status of robots and their surroundings can be registered into the AR environment, improving the operator's awareness of the complex assembly environment and ensuring safe HRC assembly. Part-level fusion recognizes the type and pose of the parts being assembled with a parallel network built on an extended Pixel-wise Voting Network (PVNet), through which assembly sequence and process information for each part can be registered into the AR environment, providing smart guidance for manual work and helping avoid human error. Experimental results demonstrate the effectiveness and efficiency of the approach.
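The robot-level fusion described above refines a coarse PointNet pose estimate with ICP registration. As an illustration only (not the paper's implementation), the sketch below shows a minimal point-to-point ICP in NumPy: at each iteration it matches every source point to its nearest destination point, solves the least-squares rigid transform via SVD (the Kabsch solution), and accumulates the alignment. The function names and the brute-force nearest-neighbour search are choices made here for brevity.

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B (Kabsch/SVD)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(src, dst, iters=50, tol=1e-8):
    """Point-to-point ICP: alternate nearest-neighbour matching and rigid alignment."""
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        # Brute-force nearest neighbours (fine for small clouds; use a KD-tree at scale).
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        # Compose the incremental transform into the running total.
        R_total, t_total = R @ R_total, R @ t_total + t
        err = d.min(axis=1).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total
```

In a pipeline like the one the abstract outlines, a learned network would supply the initial pose (ICP alone only converges from a nearby starting guess), and the refined transform would then register the robot's point cloud into the AR scene.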
Indexed by ScienceDirect and other databases.
