Grasping objects localized from uncertain point cloud data
Affiliation:1. Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie, ISIR - CNRS UMR 7222, Boite courrier 173, 4 Place Jussieu, 75252 Paris cedex 05, France;2. Intelligent Autonomous Systems Lab., TU Darmstadt, Germany;1. Duke University, Department of Mechanical Engineering and Materials Science, 144 Hudson Hall, Box 90300, Durham, NC 27708, United States;2. Duke University, Durham, NC, United States;3. Laboratoire d’Informatique pour la Mécanique et les Sciences de l’Ingénieur, Orsay, France;4. Sandia National Laboratories, Livermore, CA, United States
Abstract: Robotic grasping is very sensitive to the accuracy of the pose estimate of the object to grasp. Even a small error in the estimated pose may cause the planned grasp to fail. Several methods for robust grasp planning exploit the object geometry or tactile-sensor feedback. However, estimating the object pose from range data introduces specific uncertainties that can also be exploited to choose more robust grasps. We present a grasp planning method that explicitly considers the uncertainty in the visually estimated object pose. We assume a known shape (e.g., a primitive shape or a triangle mesh), observed as a possibly sparse point cloud. The measured points are usually not uniformly distributed over the surface, since the object is seen from a particular viewpoint; this non-uniformity can also result from heterogeneous textures on the object surface when stereo-vision algorithms based on robust feature-point matching are used. Consequently, the pose estimate may be more accurate in some directions than in others and contain unavoidable ambiguities. The proposed grasp planner uses a particle filter to represent the object pose probability distribution as a discrete set. We show that, for grasping, some ambiguities are less unfavorable than others, so the distribution can be used to select robust grasps. Experiments are presented with the humanoid robot iCub and its stereo cameras.
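The paper's planner is not reproduced here, but the core idea of the abstract, scoring candidate grasps against a particle approximation of the object pose distribution and keeping the most robust one, can be illustrated with a minimal Python sketch. All names (`success_probability`), tolerances, and the 3-DoF planar pose parameterization are invented for illustration and are not the authors' implementation.

```python
import numpy as np

# Hypothetical example: each particle is a deviation (dx, dy, dtheta) of the
# object pose from its nominal estimate, with a weight, approximating the
# posterior distribution obtained from uncertain point cloud data.
rng = np.random.default_rng(0)
particles = rng.normal([0.0, 0.0, 0.0], [0.01, 0.03, 0.15], size=(200, 3))
weights = np.full(len(particles), 1.0 / len(particles))

# Hypothetical candidate grasps. Each grasp is summarized by per-axis
# tolerances (tol_x, tol_y, tol_theta): how much pose error along each axis
# it can absorb before it is expected to fail.
candidate_grasps = [
    {"name": "pinch_from_top", "tol": np.array([0.01, 0.04, 0.20])},
    {"name": "wrap_from_side", "tol": np.array([0.04, 0.01, 0.20])},
]

def success_probability(grasp, particles, weights):
    """Expected success over the particle set: a particle counts as a
    success if its pose deviation stays within the grasp's tolerances."""
    ok = np.all(np.abs(particles) < grasp["tol"], axis=1)
    return float(np.sum(weights * ok))

# Choose the grasp that is most robust to the estimated pose uncertainty.
best = max(candidate_grasps,
           key=lambda g: success_probability(g, particles, weights))
print(best["name"], success_probability(best, particles, weights))
```

In this toy setup the pose uncertainty is larger along y than along x, so the grasp whose tolerance is wide along y is selected: this is the sense in which an anisotropic or ambiguous pose distribution can be exploited rather than merely tolerated.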
Keywords: Robotic grasping; Multi-fingered hand; Inverse kinematics
This article is indexed in ScienceDirect and other databases.