Title | FAST-LIVO: Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry |
---|---|
Authors | Zheng, Chunran; Zhu, Qingyan; Xu, Wei; Liu, Xiyuan; Guo, Qizhi; Zhang, Fu |
Issue Date | 26-Dec-2022 |
Abstract | To achieve accurate and robust pose estimation in Simultaneous Localization and Mapping (SLAM) tasks, multi-sensor fusion has proven to be an effective solution and thus holds great potential for robotic applications. This paper proposes FAST-LIVO, a fast LiDAR-Inertial-Visual Odometry system, which builds on two tightly-coupled and direct odometry subsystems: a VIO subsystem and a LIO subsystem. The LIO subsystem registers raw points (instead of feature points on, e.g., edges or planes) of a new scan to an incrementally-built point cloud map. The map points are additionally attached with image patches, which are then used in the VIO subsystem to align a new image by minimizing the direct photometric errors without extracting any visual features (e.g., ORB or FAST corner features). To further improve the VIO robustness and accuracy, a novel outlier rejection method is proposed to reject unstable map points that lie on edges or are occluded in the image view. Experiments on both open data sequences and our customized device data are conducted. The results show that our proposed system outperforms its counterparts and can handle challenging environments at a reduced computation cost. The system supports both multi-line spinning LiDARs and emerging solid-state LiDARs with completely different scanning patterns, and can run in real time on both Intel and ARM processors. We open source the code and dataset of this work on GitHub (https://github.com/hku-mars/FAST-LIVO) to benefit the robotics community. |
Persistent Identifier | http://hdl.handle.net/10722/333731 |
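The abstract describes the VIO subsystem aligning each new image by minimizing direct photometric errors between image patches attached to map points and the corresponding pixels in the new image, without feature extraction. A minimal sketch of that idea is below; it assumes a simple pinhole camera model and a squared-intensity-difference residual, and the function names are illustrative, not taken from the FAST-LIVO codebase.

```python
import numpy as np

def project_pinhole(p_cam, fx, fy, cx, cy):
    """Project a 3D point in the camera frame to pixel coordinates
    using a pinhole model (no distortion)."""
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return np.array([u, v])

def photometric_residual(image, patch_ref, center_uv, half=1):
    """Sum of squared intensity differences between a stored reference
    patch (attached to a map point) and the patch sampled around the
    point's projection in the new image. A direct method minimizes the
    sum of such residuals over the camera pose."""
    u0, v0 = int(round(center_uv[0])), int(round(center_uv[1]))
    patch_cur = image[v0 - half:v0 + half + 1,
                      u0 - half:u0 + half + 1].astype(np.float64)
    return float(np.sum((patch_cur - patch_ref) ** 2))
```

In the actual system such residuals would be evaluated at the projections of many map points and minimized jointly with the IMU-propagated state; this sketch only shows the residual for a single point, e.g. `photometric_residual(img, ref_patch, project_pinhole(p, fx, fy, cx, cy))`.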
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zheng, Chunran | - |
dc.contributor.author | Zhu, Qingyan | - |
dc.contributor.author | Xu, Wei | - |
dc.contributor.author | Liu, Xiyuan | - |
dc.contributor.author | Guo, Qizhi | - |
dc.contributor.author | Zhang, Fu | - |
dc.date.accessioned | 2023-10-06T08:38:37Z | - |
dc.date.available | 2023-10-06T08:38:37Z | - |
dc.date.issued | 2022-12-26 | - |
dc.identifier.uri | http://hdl.handle.net/10722/333731 | - |
dc.description.abstract | <p>To achieve accurate and robust pose estimation in Simultaneous Localization and Mapping (SLAM) tasks, multi-sensor fusion has proven to be an effective solution and thus holds great potential for robotic applications. This paper proposes FAST-LIVO, a fast LiDAR-Inertial-Visual Odometry system, which builds on two tightly-coupled and direct odometry subsystems: a VIO subsystem and a LIO subsystem. The LIO subsystem registers raw points (instead of feature points on, e.g., edges or planes) of a new scan to an incrementally-built point cloud map. The map points are additionally attached with image patches, which are then used in the VIO subsystem to align a new image by minimizing the direct photometric errors without extracting any visual features (e.g., ORB or FAST corner features). To further improve the VIO robustness and accuracy, a novel outlier rejection method is proposed to reject unstable map points that lie on edges or are occluded in the image view. Experiments on both open data sequences and our customized device data are conducted. The results show that our proposed system outperforms its counterparts and can handle challenging environments at a reduced computation cost. The system supports both multi-line spinning LiDARs and emerging solid-state LiDARs with completely different scanning patterns, and can run in real time on both Intel and ARM processors. We open source the code and dataset of this work on GitHub (https://github.com/hku-mars/FAST-LIVO) to benefit the robotics community.<br></p> | - |
dc.language | eng | - |
dc.relation.ispartof | 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2022) (23/10/2022-27/10/2022, Kyoto) | - |
dc.title | FAST-LIVO: Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.1109/IROS47612.2022.9981107 | - |
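The abstract also mentions rejecting unstable map points that are occluded in the current image view. One common way to detect occlusion in direct methods is a depth check: if two map points project into the same image region, the farther one is likely behind the nearer one and its patch should not contribute to the photometric cost. The sketch below illustrates that idea with a pixel grid; the grid size, depth-ratio threshold, and function name are assumptions for illustration, not FAST-LIVO's actual procedure.

```python
import numpy as np

def reject_occluded(points_cam, pixels, grid_size=16, depth_ratio=1.2):
    """Return indices of map points that pass a simple occlusion check.
    points_cam: (N, 3) points in the camera frame (z = depth).
    pixels: their projected (u, v) pixel coordinates.
    A point is rejected if another point projecting into the same
    grid cell is significantly closer to the camera."""
    # Pass 1: record the minimum depth seen in each grid cell.
    min_depth = {}
    for p, uv in zip(points_cam, pixels):
        key = (int(uv[0]) // grid_size, int(uv[1]) // grid_size)
        if key not in min_depth or p[2] < min_depth[key]:
            min_depth[key] = p[2]
    # Pass 2: keep points whose depth is close to the cell minimum.
    keep = []
    for i, (p, uv) in enumerate(zip(points_cam, pixels)):
        key = (int(uv[0]) // grid_size, int(uv[1]) // grid_size)
        if p[2] <= min_depth[key] * depth_ratio:
            keep.append(i)
    return keep
```

For instance, a point at depth 5 m projecting into the same cell as one at depth 2 m would be dropped, while points in other cells are kept regardless of their absolute depth.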