Conference Paper: Leveraging appearance priors in non-rigid registration, with application to manipulation of deformable objects

Title: Leveraging appearance priors in non-rigid registration, with application to manipulation of deformable objects
Authors: Huang, SH; Pan, J; Mulcaire, G; Abbeel, P
Issue Date: 2015
Publisher: IEEE
Citation: The 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015), Hamburg, Germany, 28 September - 2 October 2015. In Conference Proceedings, 2015, p. 878-885
Abstract: Manipulation of deformable objects is a widely applicable but challenging task in robotics. One promising nonparametric approach for this problem is trajectory transfer, in which a non-rigid registration is computed between the starting scene of the demonstration and the scene at test time. This registration is extrapolated to find a function from ℝ³ to ℝ³, which is then used to warp the demonstrated robot trajectory to generate a proposed trajectory to execute in the test scene. In prior work [1], [2], only depth information from the scenes has been used to compute this warp function. This approach ignores appearance information, but there are situations in which using both shape and appearance information is necessary for finding high-quality non-rigid warp functions. In this paper, we describe an approach to learn relevant appearance information about deformable objects using deep learning, and use this additional information to improve the quality of non-rigid registration between demonstration and test scenes. Our method better registers areas of interest on deformable objects that are crucial for manipulation, such as rope crossings and towel corners and edges. We experimentally validate our approach both in simulation and in the real world, and show that the use of appearance information leads to a significant improvement both in selecting the best matching demonstration scene for a given test scene and in finding a high-quality non-rigid registration between those two scenes.
Persistent Identifier: http://hdl.handle.net/10722/211510
ISBN: 9781479999941
ISI Accession Number ID: WOS:000371885401006
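The trajectory-transfer approach summarized in the abstract warps a demonstrated trajectory through a non-rigid registration from the demonstration scene to the test scene. The snippet below is a minimal sketch of that warp-and-transfer step only, not the paper's method: it assumes point correspondences between the two scenes are already given (the paper's contribution is learning appearance features to make such correspondences reliable), and it uses SciPy's thin-plate-spline RBFInterpolator as a stand-in registration. All names and data are hypothetical.

```python
# Minimal sketch of trajectory transfer via a non-rigid warp f: R^3 -> R^3.
# Assumes demo/test point correspondences are given; the paper instead learns
# appearance priors with deep learning to establish them (not shown here).
import numpy as np
from scipy.interpolate import RBFInterpolator  # thin-plate-spline fit


def fit_warp(demo_pts, test_pts, smoothing=1e-3):
    """Fit a warp mapping demo-scene points onto test-scene points.

    demo_pts, test_pts: (N, 3) arrays of corresponding 3-D points.
    Returns a callable that warps arbitrary (M, 3) point arrays.
    """
    return RBFInterpolator(demo_pts, test_pts,
                           kernel="thin_plate_spline",
                           smoothing=smoothing)


def transfer_trajectory(warp, demo_traj):
    """Warp a demonstrated end-effector path (T, 3) into the test scene."""
    return warp(demo_traj)


# Hypothetical usage with random stand-in data:
rng = np.random.default_rng(0)
demo_scene = rng.uniform(size=(50, 3))
test_scene = demo_scene + 0.05 * rng.standard_normal((50, 3))  # deformed copy
warp = fit_warp(demo_scene, test_scene)
new_traj = transfer_trajectory(warp, rng.uniform(size=(20, 3)))
```

In this framing, the quality of the transferred trajectory depends entirely on the correspondences fed into the warp; the paper's point is that adding learned appearance information (e.g. at rope crossings and towel corners) yields better correspondences, and hence better warps, than depth-only registration.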

 

DC Field: Value
dc.contributor.author: Huang, SH
dc.contributor.author: Pan, J
dc.contributor.author: Mulcaire, G
dc.contributor.author: Abbeel, P
dc.date.accessioned: 2015-07-16T02:21:27Z
dc.date.available: 2015-07-16T02:21:27Z
dc.date.issued: 2015
dc.identifier.citation: The 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015), Hamburg, Germany, 28 September - 2 October 2015. In Conference Proceedings, 2015, p. 878-885
dc.identifier.isbn: 9781479999941
dc.identifier.uri: http://hdl.handle.net/10722/211510
dc.description.abstract: Manipulation of deformable objects is a widely applicable but challenging task in robotics. One promising nonparametric approach for this problem is trajectory transfer, in which a non-rigid registration is computed between the starting scene of the demonstration and the scene at test time. This registration is extrapolated to find a function from ℝ³ to ℝ³, which is then used to warp the demonstrated robot trajectory to generate a proposed trajectory to execute in the test scene. In prior work [1], [2], only depth information from the scenes has been used to compute this warp function. This approach ignores appearance information, but there are situations in which using both shape and appearance information is necessary for finding high-quality non-rigid warp functions. In this paper, we describe an approach to learn relevant appearance information about deformable objects using deep learning, and use this additional information to improve the quality of non-rigid registration between demonstration and test scenes. Our method better registers areas of interest on deformable objects that are crucial for manipulation, such as rope crossings and towel corners and edges. We experimentally validate our approach both in simulation and in the real world, and show that the use of appearance information leads to a significant improvement both in selecting the best matching demonstration scene for a given test scene and in finding a high-quality non-rigid registration between those two scenes.
dc.language: eng
dc.publisher: IEEE
dc.relation.ispartof: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
dc.title: Leveraging appearance priors in non-rigid registration, with application to manipulation of deformable objects
dc.type: Conference_Paper
dc.identifier.email: Pan, J: jpan@cs.hku.hk
dc.identifier.authority: Pan, J=rp01984
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/IROS.2015.7353475
dc.identifier.scopus: eid_2-s2.0-84958161193
dc.identifier.hkuros: 244992
dc.identifier.spage: 878
dc.identifier.epage: 885
dc.identifier.isi: WOS:000371885401006
