Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1109/ROBIO.2015.7418881
- Scopus: eid_2-s2.0-84964434674
Citations:
- Scopus: 0

Appears in Collections:
- Conference Paper: Feedback of robot states for object detection in natural language controlled robotic systems
Field | Value
---|---
Title | Feedback of robot states for object detection in natural language controlled robotic systems
Authors | Bao, J; Jia, Y; Cheng, Y; Tang, H; Xi, N
Issue Date | 2015 |
Publisher | IEEE. The conference proceedings website is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000856
Citation | The 2015 IEEE International Conference on Robotics and Biomimetics (IEEE-ROBIO 2015), Zhuhai, China, 6-9 December 2015. In Conference Proceedings, 2015, p. 875-880
Abstract | Controlling robots with natural language enables untrained users to interact with them more easily. A significant challenge for such systems is the mismatched visual perceptual capabilities between humans and robots. Most existing methods try to improve the perceptual ability of robots by either developing robust vision algorithms to describe and identify objects more accurately, or refining the object segmentation through human collaboration. In this paper, we present a novel method to detect and track objects, and even discover previously undetected objects (e.g. objects occluded by or stacked on other objects) by incorporating feedback of robot states into the vision module. By reasoning about the object states according to the trajectories of robot states and then re-detecting the point clouds of the objects, the representation of the environment can be efficiently and accurately updated. Experimental results demonstrate the effectiveness and advantages of the proposed method. © 2015 IEEE. |
Persistent Identifier | http://hdl.handle.net/10722/235015 |
ISBN | 978-146739674-5
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Bao, J | - |
dc.contributor.author | Jia, Y | - |
dc.contributor.author | Cheng, Y | - |
dc.contributor.author | Tang, H | - |
dc.contributor.author | Xi, N | - |
dc.date.accessioned | 2016-10-14T13:50:44Z | - |
dc.date.available | 2016-10-14T13:50:44Z | - |
dc.date.issued | 2015 | - |
dc.identifier.citation | The 2015 IEEE International Conference on Robotics and Biomimetics (IEEE-ROBIO 2015), Zhuhai, China, 6-9 December 2015. In Conference Proceedings, 2015, p. 875-880 | - |
dc.identifier.isbn | 978-146739674-5 | - |
dc.identifier.uri | http://hdl.handle.net/10722/235015 | - |
dc.description.abstract | Controlling robots with natural language enables untrained users to interact with them more easily. A significant challenge for such systems is the mismatched visual perceptual capabilities between humans and robots. Most existing methods try to improve the perceptual ability of robots by either developing robust vision algorithms to describe and identify objects more accurately, or refining the object segmentation through human collaboration. In this paper, we present a novel method to detect and track objects, and even discover previously undetected objects (e.g. objects occluded by or stacked on other objects) by incorporating feedback of robot states into the vision module. By reasoning about the object states according to the trajectories of robot states and then re-detecting the point clouds of the objects, the representation of the environment can be efficiently and accurately updated. Experimental results demonstrate the effectiveness and advantages of the proposed method. © 2015 IEEE. | - |
dc.language | eng | - |
dc.publisher | IEEE. The conference proceedings website is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000856 | - |
dc.relation.ispartof | IEEE International Conference on Robotics and Biomimetics Proceedings | - |
dc.rights | IEEE International Conference on Robotics and Biomimetics Proceedings. Copyright © IEEE. | - |
dc.rights | ©2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.title | Feedback of robot states for object detection in natural language controlled robotic systems | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Xi, N: xining@hku.hk | - |
dc.identifier.authority | Xi, N=rp02044 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/ROBIO.2015.7418881 | - |
dc.identifier.scopus | eid_2-s2.0-84964434674 | - |
dc.identifier.hkuros | 269347 | - |
dc.identifier.spage | 875 | - |
dc.identifier.epage | 880 | - |
dc.publisher.place | United States | - |
dc.customcontrol.immutable | sml 161019 | - |