Conference Paper: Hyperspectrum image fusion for sensor guided mobile manipulations

Title: Hyperspectrum image fusion for sensor guided mobile manipulations
Authors: Li, Weixian; Zhang, Chi; Xi, Ning; Jia, Yunyi
Issue Date: 2012
Citation: 2012 IEEE International Conference on Robotics and Biomimetics, ROBIO 2012 - Conference Digest, 2012, p. 2288-2293
Abstract: Hyperspectrum images collected from across the spectrum capture object characteristics and enable autonomous target recognition and target localization in the environment. In this paper, hyperspectrum image fusion for mobile manipulation is proposed. The fusion sensor, consisting of a single camera and an IR projector, operates in two spectral bands (visible and IR) alternately. After color images of the environment are captured under visible light, the region of interest (ROI) of the target is recognized. The target is then localized within the ROI using one-shot images of the pattern projected in IR light. A stereo fusion vision model of the fusion sensor and a high-precision calibration algorithm for it are also presented. Localization is performed by 3D reconstruction based on the calibrated system. The calibration precision of the fusion sensor is [0.40, 0.24, 1.39] mm in the x, y, and z directions, respectively. A manipulation experiment of grasping a hand cream tub is carried out on the mobile platform, and the experimental results verify the effectiveness of the hyperspectrum image fusion. © 2012 IEEE.
Persistent Identifier: http://hdl.handle.net/10722/213305
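
The abstract above outlines a concrete pipeline: a single camera and an IR projector alternate between the visible band (to recognize the target's region of interest) and the IR band (to localize the target from a one-shot projected pattern), with the calibrated camera-projector pair treated as a stereo system so that 3D points can be recovered by triangulation. The Python sketch below is a minimal illustration of that flow under stated assumptions; the projection matrices, the find_roi threshold, and the synthetic correspondence are hypothetical stand-ins for illustration, not the authors' implementation.

import numpy as np

def triangulate(P_cam, P_proj, x_cam, x_proj):
    # Linear (DLT) triangulation of one camera/projector correspondence.
    # P_cam and P_proj are the 3x4 projection matrices of the calibrated pair;
    # x_cam and x_proj are the matching pixel coordinates in each "view".
    A = np.vstack([
        x_cam[0] * P_cam[2] - P_cam[0],
        x_cam[1] * P_cam[2] - P_cam[1],
        x_proj[0] * P_proj[2] - P_proj[0],
        x_proj[1] * P_proj[2] - P_proj[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                      # Euclidean 3D point

def find_roi(color_img):
    # Placeholder target recognition on the visible-band frame:
    # a simple brightness threshold followed by a bounding box.
    mask = color_img.mean(axis=2) > 200
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.min(), ys.min(), xs.max(), ys.max()

# Toy data standing in for one visible frame (IR pattern decoding is skipped).
rng = np.random.default_rng(0)
color_frame = rng.integers(0, 180, (480, 640, 3), dtype=np.uint8)
color_frame[200:240, 300:360] = 255          # bright "target" region

# Hypothetical calibration: camera at the origin, projector shifted 50 mm
# along x, both sharing the same intrinsic matrix K.
K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
P_cam = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_proj = K @ np.hstack([np.eye(3), np.array([[-50.0], [0.0], [0.0]])])

roi = find_roi(color_frame)
if roi is not None:
    # One camera/projector correspondence of an IR pattern dot inside the ROI;
    # in a real system this would come from decoding the one-shot IR pattern.
    X_true = np.array([10.0, -5.0, 800.0])   # ground-truth point, mm
    x_cam = P_cam @ np.append(X_true, 1.0); x_cam = x_cam[:2] / x_cam[2]
    x_proj = P_proj @ np.append(X_true, 1.0); x_proj = x_proj[:2] / x_proj[2]
    print("ROI (xmin, ymin, xmax, ymax):", roi)
    print("Reconstructed 3D point (mm):", triangulate(P_cam, P_proj, x_cam, x_proj))

The calibration precision of [0.40, 0.24, 1.39] mm reported in the abstract refers to errors in exactly this kind of reconstructed (x, y, z) coordinate.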

 

DC Field: Value
dc.contributor.author: Li, Weixian
dc.contributor.author: Zhang, Chi
dc.contributor.author: Xi, Ning
dc.contributor.author: Jia, Yunyi
dc.date.accessioned: 2015-07-28T04:06:50Z
dc.date.available: 2015-07-28T04:06:50Z
dc.date.issued: 2012
dc.identifier.citation: 2012 IEEE International Conference on Robotics and Biomimetics, ROBIO 2012 - Conference Digest, 2012, p. 2288-2293
dc.identifier.uri: http://hdl.handle.net/10722/213305
dc.description.abstract: Hyperspectrum images collected from across the spectrum capture object characteristics and enable autonomous target recognition and target localization in the environment. In this paper, hyperspectrum image fusion for mobile manipulation is proposed. The fusion sensor, consisting of a single camera and an IR projector, operates in two spectral bands (visible and IR) alternately. After color images of the environment are captured under visible light, the region of interest (ROI) of the target is recognized. The target is then localized within the ROI using one-shot images of the pattern projected in IR light. A stereo fusion vision model of the fusion sensor and a high-precision calibration algorithm for it are also presented. Localization is performed by 3D reconstruction based on the calibrated system. The calibration precision of the fusion sensor is [0.40, 0.24, 1.39] mm in the x, y, and z directions, respectively. A manipulation experiment of grasping a hand cream tub is carried out on the mobile platform, and the experimental results verify the effectiveness of the hyperspectrum image fusion. © 2012 IEEE.
dc.language: eng
dc.relation.ispartof: 2012 IEEE International Conference on Robotics and Biomimetics, ROBIO 2012 - Conference Digest
dc.title: Hyperspectrum image fusion for sensor guided mobile manipulations
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/ROBIO.2012.6491310
dc.identifier.scopus: eid_2-s2.0-84876485012
dc.identifier.spage: 2288
dc.identifier.epage: 2293
