
Article: Failure Handling of Robotic Pick and Place Tasks With Multimodal Cues Under Partial Object Occlusion

Title: Failure Handling of Robotic Pick and Place Tasks With Multimodal Cues Under Partial Object Occlusion
Authors: ZHU, F; WANG, L; WEN, Y; Yang, L; Pan, J; Wang, Z; Wang, W
Keywords: soft robot applications; pick and place; failure handling; visual tracking; proprioception
Issue Date: 2021
Publisher: Frontiers Research Foundation. The Journal's web site is located at https://www.frontiersin.org/journals/neurorobotics
Citation: Frontiers in Neurorobotics, 2021, v. 15, p. article no. 570507
Abstract: The success of a robotic pick and place task depends on the success of the entire procedure: from the grasp planning phase, to the grasp establishment phase, then the lifting and moving phase, and finally the releasing and placing phase. Being able to detect and recover from grasping failures throughout the entire process is therefore a critical requirement for both the robotic manipulator and the gripper, especially considering the almost inevitable occlusion of the object by the gripper itself during a pick and place task. With the rapid rise of soft grippers, which rely heavily on their under-actuated bodies and compliant, open-loop control, less information is available from the gripper for effective overall system control. To improve the effectiveness of robotic grasping, this work proposes a hybrid policy that combines visual cues with the proprioception of our gripper for effective failure detection and recovery in grasping, in particular using a self-developed proprioceptive soft robotic gripper capable of contact sensing. We addressed failure handling in robotic pick and place tasks and propose (1) more accurate pose estimation of a known object by considering an edge-based cost in addition to the image-based cost; (2) robust object tracking techniques that work even when the object is partially occluded, achieving mean overlap precision of up to 80%; (3) contact and contact-loss detection between the object and the gripper by analyzing the internal pressure signals of our gripper; (4) robust failure handling that combines visual cues under partial occlusion with proprioceptive cues from our soft gripper to effectively detect and recover from different accidental grasping failures.
The proposed system was experimentally validated with the proprioceptive soft robotic gripper mounted on a collaborative robotic manipulator, together with a consumer-grade RGB camera, showing that combining visual cues and proprioception from our soft robotic gripper is effective in improving detection of and recovery from the major grasping failures at different stages, enabling compliant and robust grasping.
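As a minimal illustration of contribution (3) above, contact and contact-loss detection from the gripper's internal pressure signal, the following Python sketch assumes that contact pushes the pressure above a calibrated baseline by more than a fixed threshold. The function name, baseline, and threshold are hypothetical placeholders, not the authors' implementation.

```python
def detect_contact_events(pressures, baseline, threshold):
    """Scan a sequence of internal pressure readings and return
    (index, event) tuples: 'contact' when the pressure first rises
    above baseline + threshold, 'contact_loss' when it falls back below.
    """
    events = []
    in_contact = False
    for i, p in enumerate(pressures):
        if not in_contact and p > baseline + threshold:
            events.append((i, "contact"))
            in_contact = True
        elif in_contact and p < baseline + threshold:
            events.append((i, "contact_loss"))
            in_contact = False
    return events
```

In practice the baseline and threshold would be calibrated per actuator chamber, and the raw signal filtered before thresholding; this sketch only shows the event logic.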
Persistent Identifier: http://hdl.handle.net/10722/300603
ISSN: 1662-5218
2023 Impact Factor: 2.6
2023 SCImago Journal Rankings: 0.676
PubMed Central ID: PMC7982538
ISI Accession Number ID: WOS:000631072000001


DC Field: Value
dc.contributor.author: ZHU, F
dc.contributor.author: WANG, L
dc.contributor.author: WEN, Y
dc.contributor.author: Yang, L
dc.contributor.author: Pan, J
dc.contributor.author: Wang, Z
dc.contributor.author: Wang, W
dc.date.accessioned: 2021-06-18T14:54:21Z
dc.date.available: 2021-06-18T14:54:21Z
dc.date.issued: 2021
dc.identifier.citation: Frontiers in Neurorobotics, 2021, v. 15, p. article no. 570507
dc.identifier.issn: 1662-5218
dc.identifier.uri: http://hdl.handle.net/10722/300603
dc.description.abstract: The success of a robotic pick and place task depends on the success of the entire procedure: from the grasp planning phase, to the grasp establishment phase, then the lifting and moving phase, and finally the releasing and placing phase. Being able to detect and recover from grasping failures throughout the entire process is therefore a critical requirement for both the robotic manipulator and the gripper, especially considering the almost inevitable occlusion of the object by the gripper itself during a pick and place task. With the rapid rise of soft grippers, which rely heavily on their under-actuated bodies and compliant, open-loop control, less information is available from the gripper for effective overall system control. To improve the effectiveness of robotic grasping, this work proposes a hybrid policy that combines visual cues with the proprioception of our gripper for effective failure detection and recovery in grasping, in particular using a self-developed proprioceptive soft robotic gripper capable of contact sensing. We addressed failure handling in robotic pick and place tasks and propose (1) more accurate pose estimation of a known object by considering an edge-based cost in addition to the image-based cost; (2) robust object tracking techniques that work even when the object is partially occluded, achieving mean overlap precision of up to 80%; (3) contact and contact-loss detection between the object and the gripper by analyzing the internal pressure signals of our gripper; (4) robust failure handling that combines visual cues under partial occlusion with proprioceptive cues from our soft gripper to effectively detect and recover from different accidental grasping failures. The proposed system was experimentally validated with the proprioceptive soft robotic gripper mounted on a collaborative robotic manipulator, together with a consumer-grade RGB camera, showing that combining visual cues and proprioception from our soft robotic gripper is effective in improving detection of and recovery from the major grasping failures at different stages, enabling compliant and robust grasping.
dc.language: eng
dc.publisher: Frontiers Research Foundation. The Journal's web site is located at https://www.frontiersin.org/journals/neurorobotics
dc.relation.ispartof: Frontiers in Neurorobotics
dc.rights: This Document is Protected by copyright and was first published by Frontiers. All rights reserved. It is reproduced with permission.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: soft robot applications
dc.subject: pick and place
dc.subject: failure handling
dc.subject: visual tracking
dc.subject: proprioception
dc.title: Failure Handling of Robotic Pick and Place Tasks With Multimodal Cues Under Partial Object Occlusion
dc.type: Article
dc.identifier.email: Yang, L: lyang125@HKUCC-COM.hku.hk
dc.identifier.email: Pan, J: jpan@cs.hku.hk
dc.identifier.email: Wang, Z: zwangski@hku.hk
dc.identifier.email: Wang, W: wenping@cs.hku.hk
dc.identifier.authority: Pan, J=rp01984
dc.identifier.authority: Wang, Z=rp01915
dc.identifier.authority: Wang, W=rp00186
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.3389/fnbot.2021.570507
dc.identifier.pmid: 33762921
dc.identifier.pmcid: PMC7982538
dc.identifier.scopus: eid_2-s2.0-85102900725
dc.identifier.hkuros: 323035
dc.identifier.volume: 15
dc.identifier.spage: article no. 570507
dc.identifier.epage: article no. 570507
dc.identifier.isi: WOS:000631072000001
dc.publisher.place: Switzerland
