File Download: There are no files associated with this item.
Links for fulltext (May Require Subscription):
- Publisher Website: https://doi.org/10.1109/LRA.2019.2893432
- Scopus: eid_2-s2.0-85063310810
- WOS: WOS:000458182000007
Article: Safe Navigation With Human Instructions in Complex Scenes
Title | Safe Navigation With Human Instructions in Complex Scenes |
---|---|
Authors | Hu, Z; Pan, J; Fan, T; Yang, R; Manocha, D |
Keywords | Navigation; Grounding; Semantics; Task analysis; Robot kinematics |
Issue Date | 2019 |
Publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE |
Citation | IEEE Robotics and Automation Letters, 2019, v. 4 n. 2, p. 753-760 |
Abstract | In this letter, we present a robotic navigation algorithm with natural language interfaces that enables a robot to safely walk through a changing environment with moving persons by following human instructions such as “go to the restaurant and keep away from people.” We first classify human instructions into three types: goal, constraints, and uninformative phrases. Next, we provide grounding in a dynamic manner for the extracted goal and constraint items along with the navigation process to deal with target objects that are too far away for sensor observation and the appearance of moving obstacles such as humans. In particular, for a goal phrase (e.g., “go to the restaurant”), we ground it to a location in a predefined semantic map and treat it as a goal for a global motion planner, which plans a collision-free path in the workspace for the robot to follow. For a constraint phrase (e.g., “keep away from people”), we dynamically add the corresponding constraint into a local planner by adjusting the values of a local costmap according to the results returned by the object detection module. The updated costmap is then used to compute a local collision avoidance control for the safe navigation of the robot. By combining natural language processing, motion planning, and computer vision, our developed system can successfully follow natural language navigation instructions to achieve navigation tasks in both simulated and real-world scenarios. Videos are available at https://sites.google.com/view/snhi. |
Persistent Identifier | http://hdl.handle.net/10722/273148 |
ISSN | 2377-3766 (2023 Impact Factor: 4.6; 2023 SCImago Journal Rankings: 2.119) |
ISI Accession Number ID | WOS:000458182000007 |
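The abstract describes grounding a constraint phrase such as "keep away from people" by raising the values of a local costmap around detected obstacles, so that the local planner steers clear of them. A minimal sketch of that idea, assuming a NumPy grid costmap and a linear cost decay; the function name, decay model, and parameters are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def inflate_costmap(costmap, detections, radius, penalty):
    """Raise cost in cells within `radius` of each detected obstacle.

    costmap   : 2D array of cell costs (higher = less desirable)
    detections: iterable of (row, col) cell indices of detected objects
    radius    : inflation radius, in cells
    penalty   : cost added at the detection center, decaying linearly to 0
    """
    rows, cols = np.indices(costmap.shape)
    updated = costmap.astype(float).copy()
    for (r, c) in detections:
        # Distance of every cell from this detection, in cells.
        dist = np.hypot(rows - r, cols - c)
        # Linear falloff: full penalty at the center, zero beyond `radius`.
        extra = np.clip(penalty * (1.0 - dist / radius), 0.0, penalty)
        updated = np.maximum(updated, extra)
    return updated

# A person detected at cell (5, 5) makes nearby cells costly,
# so a local planner scoring paths against this grid avoids them.
grid = inflate_costmap(np.zeros((10, 10)), [(5, 5)], radius=3, penalty=100.0)
```

Because the inflation is recomputed from the detector's latest output each cycle, the constraint follows moving people rather than a fixed map location, which is the dynamic-grounding behavior the abstract contrasts with the statically grounded goal phrase.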
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hu, Z | - |
dc.contributor.author | Pan, J | - |
dc.contributor.author | Fan, T | - |
dc.contributor.author | Yang, R | - |
dc.contributor.author | Manocha, D | - |
dc.date.accessioned | 2019-08-06T09:23:25Z | - |
dc.date.available | 2019-08-06T09:23:25Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | IEEE Robotics and Automation Letters, 2019, v. 4 n. 2, p. 753-760 | - |
dc.identifier.issn | 2377-3766 | - |
dc.identifier.uri | http://hdl.handle.net/10722/273148 | - |
dc.description.abstract | In this letter, we present a robotic navigation algorithm with natural language interfaces that enables a robot to safely walk through a changing environment with moving persons by following human instructions such as “go to the restaurant and keep away from people.” We first classify human instructions into three types: goal, constraints, and uninformative phrases. Next, we provide grounding in a dynamic manner for the extracted goal and constraint items along with the navigation process to deal with target objects that are too far away for sensor observation and the appearance of moving obstacles such as humans. In particular, for a goal phrase (e.g., “go to the restaurant”), we ground it to a location in a predefined semantic map and treat it as a goal for a global motion planner, which plans a collision-free path in the workspace for the robot to follow. For a constraint phrase (e.g., “keep away from people”), we dynamically add the corresponding constraint into a local planner by adjusting the values of a local costmap according to the results returned by the object detection module. The updated costmap is then used to compute a local collision avoidance control for the safe navigation of the robot. By combining natural language processing, motion planning, and computer vision, our developed system can successfully follow natural language navigation instructions to achieve navigation tasks in both simulated and real-world scenarios. Videos are available at https://sites.google.com/view/snhi. | - |
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at https://www.ieee.org/membership-catalog/productdetail/showProductDetailPage.html?product=PER481-ELE | - |
dc.relation.ispartof | IEEE Robotics and Automation Letters | - |
dc.rights | IEEE Robotics and Automation Letters. Copyright © Institute of Electrical and Electronics Engineers. | - |
dc.rights | ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | Navigation | - |
dc.subject | Grounding | - |
dc.subject | Semantics | - |
dc.subject | Task analysis | - |
dc.subject | Robot kinematics | - |
dc.title | Safe Navigation With Human Instructions in Complex Scenes | - |
dc.type | Article | - |
dc.identifier.email | Pan, J: jpan@cs.hku.hk | - |
dc.identifier.authority | Pan, J=rp01984 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/LRA.2019.2893432 | - |
dc.identifier.scopus | eid_2-s2.0-85063310810 | - |
dc.identifier.hkuros | 300342 | - |
dc.identifier.volume | 4 | - |
dc.identifier.issue | 2 | - |
dc.identifier.spage | 753 | - |
dc.identifier.epage | 760 | - |
dc.identifier.isi | WOS:000458182000007 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 2377-3766 | - |