Conference Paper: Temporal and spatial sensor fusion in a robotic manufacturing workcell

Title: Temporal and spatial sensor fusion in a robotic manufacturing workcell
Authors: Yu, Zhenyu; Ghosh, Bijoy K.; Xi, Ning; Tarn, Tzyh Jong
Issue Date: 1995
Citation: Proceedings - IEEE International Conference on Robotics and Automation, 1995, v. 1, p. 160-165
Abstract: In this paper, we discuss the problem of using visual and other sensors in the manipulation of a part by a robotic manipulator in a manufacturing workcell. Our emphasis is on the part localization problem involved. We introduce a new sensor-fusion approach that fuses sensory information from different sensors at various spatial and temporal scales. Relative spatial information obtained from processing of visual information is mapped to the absolute task space of the robot by fusing information from an encoder. Data obtained this way can be superimposed on data obtained from displacement-based vision algorithms at coarser time scales to improve overall reliability. Tracking plans reflecting sensor fusion are proposed. The localization of a part by spatial sensor fusion is experimentally demonstrated to provide the required fast and accurate part localization.
Persistent Identifier: http://hdl.handle.net/10722/212663
ISSN: 1050-4729
DOI: 10.1109/ROBOT.1995.525279
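The abstract describes mapping relative, camera-derived part positions into the robot's absolute task space via encoder data, then blending in a coarser-time-scale, displacement-based vision estimate. The paper's actual equations are not given here; the following is a minimal illustrative sketch of that fusion idea, with all function and parameter names (`fuse_part_position`, `alpha`, etc.) being hypothetical:

```python
import numpy as np

def fuse_part_position(encoder_pose, vision_rel, coarse_abs=None, alpha=0.7):
    """Sketch of spatial/temporal sensor fusion for part localization.

    encoder_pose: (R, t) rotation matrix and translation of the camera /
                  end-effector in task space, derived from joint encoders.
    vision_rel:   part position measured relative to the camera frame.
    coarse_abs:   optional absolute estimate from a slower,
                  displacement-based vision algorithm (coarser time scale).
    alpha:        blending weight on the fast fused estimate (hypothetical).
    """
    R, t = encoder_pose
    # Spatial fusion: map the relative vision measurement into
    # absolute task space using the encoder-derived pose.
    fast_abs = R @ vision_rel + t
    if coarse_abs is None:
        return fast_abs
    # Temporal fusion: superimpose the coarser-scale absolute estimate
    # to improve overall reliability.
    return alpha * fast_abs + (1 - alpha) * coarse_abs

# Example: end-effector at (0.5, 0, 0.2) with identity orientation,
# part seen 0.1 m ahead of the camera along x.
R = np.eye(3)
t = np.array([0.5, 0.0, 0.2])
part_abs = fuse_part_position((R, t), np.array([0.1, 0.0, 0.0]))
```

A weighted average stands in here for whatever fusion rule the paper actually uses; the point is only the two stages, relative-to-absolute mapping followed by cross-time-scale blending.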


DC Field: Value
dc.contributor.author: Yu, Zhenyu
dc.contributor.author: Ghosh, Bijoy K.
dc.contributor.author: Xi, Ning
dc.contributor.author: Tarn, Tzyh Jong
dc.date.accessioned: 2015-07-28T04:04:36Z
dc.date.available: 2015-07-28T04:04:36Z
dc.date.issued: 1995
dc.identifier.citation: Proceedings - IEEE International Conference on Robotics and Automation, 1995, v. 1, p. 160-165
dc.identifier.issn: 1050-4729
dc.identifier.uri: http://hdl.handle.net/10722/212663
dc.description.abstract: In this paper, we discuss the problem of using visual and other sensors in the manipulation of a part by a robotic manipulator in a manufacturing workcell. Our emphasis is on the part localization problem involved. We introduce a new sensor-fusion approach that fuses sensory information from different sensors at various spatial and temporal scales. Relative spatial information obtained from processing of visual information is mapped to the absolute task space of the robot by fusing information from an encoder. Data obtained this way can be superimposed on data obtained from displacement-based vision algorithms at coarser time scales to improve overall reliability. Tracking plans reflecting sensor fusion are proposed. The localization of a part by spatial sensor fusion is experimentally demonstrated to provide the required fast and accurate part localization.
dc.language: eng
dc.relation.ispartof: Proceedings - IEEE International Conference on Robotics and Automation
dc.title: Temporal and spatial sensor fusion in a robotic manufacturing workcell
dc.type: Conference_Paper
dc.description.nature: Link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/ROBOT.1995.525279
dc.identifier.scopus: eid_2-s2.0-0029190578
dc.identifier.volume: 1
dc.identifier.spage: 160
dc.identifier.epage: 165
