Article: Rigid-Soft Interactive Learning for Robust Grasping

Title: Rigid-Soft Interactive Learning for Robust Grasping
Authors: YANG, L; WAN, F; WANG, H; LIU, X; LIU, Y; Pan, J; SONG, C
Keywords: Robots; Grasping; Grippers; Benchmark testing; Learning systems
Issue Date: 2020
Publisher: Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE
Citation: IEEE Robotics and Automation Letters, 2020, v. 5 n. 2, p. 1720-1727
Abstract: Robot learning is widely accepted by academia and industry for its potential to transform autonomous robot control through machine learning. Inspired by the wide use of soft fingers in grasping, we propose a method of rigid-soft interactive learning aimed at reducing data-collection time. In this letter, we classify the interaction categories into Rigid-Rigid, Rigid-Soft, and Soft-Rigid according to the interaction surface between grippers and target objects. We find experimental evidence that the interaction type between grippers and target objects plays an essential role in the learning method. We use soft, stuffed toys for training, instead of everyday objects, to reduce the integration complexity and computational burden. Although stuffed toys are limited in reflecting the physics of finger-object interaction in real-life scenarios, we exploit such rigid-soft interaction by switching to soft gripper fingers when dealing with rigid, daily-life items such as the Yale-CMU-Berkeley (YCB) objects. With a small data collection of 5 K picking attempts in total, our results suggest that such Rigid-Soft and Soft-Rigid interactions are transferable. Moreover, combining these interactions yields better performance in the grasping test. We also explore the effect of grasp type on the learning method by changing the gripper configuration. We achieve the best grasping performance, 97.5% for easy YCB objects and 81.3% for difficult YCB objects, while using a precision grasp with a two-soft-finger gripper to collect training data and a power grasp with a four-soft-finger gripper to test the grasp policy.
Persistent Identifier: http://hdl.handle.net/10722/285106
ISSN: 2377-3766
2021 Impact Factor: 4.321
2020 SCImago Journal Rankings: 1.123
ISI Accession Number ID: WOS:000526520500007

 

DC Field: Value
dc.contributor.author: YANG, L
dc.contributor.author: WAN, F
dc.contributor.author: WANG, H
dc.contributor.author: LIU, X
dc.contributor.author: LIU, Y
dc.contributor.author: Pan, J
dc.contributor.author: SONG, C
dc.date.accessioned: 2020-08-07T09:06:50Z
dc.date.available: 2020-08-07T09:06:50Z
dc.date.issued: 2020
dc.identifier.citation: IEEE Robotics and Automation Letters, 2020, v. 5 n. 2, p. 1720-1727
dc.identifier.issn: 2377-3766
dc.identifier.uri: http://hdl.handle.net/10722/285106
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE
dc.relation.ispartof: IEEE Robotics and Automation Letters
dc.rights: IEEE Robotics and Automation Letters. Copyright © Institute of Electrical and Electronics Engineers.
dc.rights: ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.subject: Robots
dc.subject: Grasping
dc.subject: Grippers
dc.subject: Benchmark testing
dc.subject: Learning systems
dc.title: Rigid-Soft Interactive Learning for Robust Grasping
dc.type: Article
dc.identifier.email: Pan, J: jpan@cs.hku.hk
dc.identifier.authority: Pan, J=rp01984
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/LRA.2020.2969932
dc.identifier.scopus: eid_2-s2.0-85079797690
dc.identifier.hkuros: 312150
dc.identifier.volume: 5
dc.identifier.issue: 2
dc.identifier.spage: 1720
dc.identifier.epage: 1727
dc.identifier.isi: WOS:000526520500007
dc.publisher.place: United States
dc.identifier.issnl: 2377-3766
