Links for fulltext (May Require Subscription):
- Publisher Website: 10.1109/TASE.2023.3324365
- Scopus: eid_2-s2.0-85174800512
- WOS: WOS:001122429300001
Article: PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features
Title | PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features |
---|---|
Authors | Guan, Weipeng; Chen, Peiyu; Xie, Yuhan; Lu, Peng |
Keywords | aggressive quadrotor; Cameras; Event cameras; event-based VIO; Feature extraction; Real-time systems; robotics; sensor fusion; SLAM; Standards; State estimation; Streams; Tracking |
Issue Date | 19-Oct-2023 |
Publisher | Institute of Electrical and Electronics Engineers |
Citation | IEEE Transactions on Automation Science and Engineering, 2023 |
Abstract | Robust state estimation in challenging situations is still an unsolved problem, especially achieving onboard pose feedback control under aggressive motion. In this paper, we propose a robust, real-time event-based visual-inertial odometry (VIO) that incorporates event, image, and inertial measurements. Our approach utilizes line-based event features to provide additional structural constraints in human-made scenes, while point-based event and image features complement each other through well-designed feature management. To achieve reliable state estimation, we tightly couple the point-based and line-based visual residuals from the event camera, the point-based visual residual from the standard camera, and the residual from IMU pre-integration in a keyframe-based graph optimization framework (a sketch of such a joint objective appears after this table). Experiments on public benchmark datasets show that our method achieves superior performance compared with state-of-the-art image-based and event-based VIO. Furthermore, we demonstrate the effectiveness of our pipeline through onboard closed-loop aggressive quadrotor flight and large-scale outdoor experiments. Videos of the evaluations can be found on our website: https://youtu.be/KnWZ4anBMK4. Note to Practitioners — Real-time closed-loop control of drones under aggressive motion and in broad illumination conditions demands robust state estimation, yet many existing VIO systems fail to meet these requirements due to the inherent limitations of standard cameras. Event cameras are bio-inspired sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate, and they can provide reliable visual perception during high-speed motion and in high-dynamic-range scenarios. Developing state estimation algorithms based on event cameras therefore offers exciting opportunities for robotics. However, adopting event cameras is challenging because event streams are composed of asynchronous events, which are fundamentally different from synchronous intensity images. Moreover, event cameras output minimal information, or even only noise, when the relative motion between the camera and the scene is limited, such as when the camera is stationary, while standard cameras provide rich perception information in most scenarios. In this paper, we propose a robust, highly accurate, and real-time optimization-based monocular event-based VIO framework that tightly fuses event, image, and IMU measurements. Owing to the well-designed framework and feature management, our system provides robust and reliable state estimation in challenging environments. Its efficiency is sufficient for real-time operation on resource-limited platforms, such as providing onboard pose feedback for quadrotor flights. |
Persistent Identifier | http://hdl.handle.net/10722/339579 |
ISSN | 1545-5955 (2023 Impact Factor: 5.9; 2023 SCImago Journal Rankings: 2.144) |
ISI Accession Number ID | WOS:001122429300001 |
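The abstract describes a tightly coupled, keyframe-based graph optimization that fuses point- and line-based residuals from the event camera, point-based residuals from the standard camera, and an IMU pre-integration residual. The sketch below shows what such a joint objective typically looks like; the residual names, covariances Σ, and robust kernel ρ are assumptions drawn from standard sliding-window VIO formulations, not the paper's exact equations.

```latex
% Hedged sketch of a tightly coupled VIO objective; NOT the paper's exact
% formulation. \mathcal{X} collects keyframe poses, velocities, IMU biases,
% and feature parameters; \rho is a robust kernel (e.g., Huber).
\min_{\mathcal{X}} \;
    \left\| r_{\mathrm{prior}} \right\|^{2}
  + \sum_{k} \left\| r_{\mathrm{IMU}}\!\left(z_{k,k+1}, \mathcal{X}\right) \right\|^{2}_{P_{k,k+1}}
  + \sum_{(i,j)} \rho\!\left( \left\| r^{\mathrm{ev}}_{\mathrm{pt}}\!\left(z_{ij}, \mathcal{X}\right) \right\|^{2}_{\Sigma} \right)
  + \sum_{(i,l)} \rho\!\left( \left\| r^{\mathrm{ev}}_{\mathrm{ln}}\!\left(z_{il}, \mathcal{X}\right) \right\|^{2}_{\Sigma} \right)
  + \sum_{(i,j)} \rho\!\left( \left\| r^{\mathrm{im}}_{\mathrm{pt}}\!\left(z_{ij}, \mathcal{X}\right) \right\|^{2}_{\Sigma} \right)
```

Here r_pt would be a standard reprojection residual and r_ln a point-to-line distance between a projected 3D line and its detected 2D segment, each weighted by its measurement covariance.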
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Guan, Weipeng | - |
dc.contributor.author | Chen, Peiyu | - |
dc.contributor.author | Xie, Yuhan | - |
dc.contributor.author | Lu, Peng | - |
dc.date.accessioned | 2024-03-11T10:37:46Z | - |
dc.date.available | 2024-03-11T10:37:46Z | - |
dc.date.issued | 2023-10-19 | - |
dc.identifier.citation | IEEE Transactions on Automation Science and Engineering, 2023 | - |
dc.identifier.issn | 1545-5955 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339579 | - |
dc.description.abstract | Robust state estimation in challenging situations is still an unsolved problem, especially achieving onboard pose feedback control under aggressive motion. In this paper, we propose a robust, real-time event-based visual-inertial odometry (VIO) that incorporates event, image, and inertial measurements. Our approach utilizes line-based event features to provide additional structural constraints in human-made scenes, while point-based event and image features complement each other through well-designed feature management. To achieve reliable state estimation, we tightly couple the point-based and line-based visual residuals from the event camera, the point-based visual residual from the standard camera, and the residual from IMU pre-integration in a keyframe-based graph optimization framework. Experiments on public benchmark datasets show that our method achieves superior performance compared with state-of-the-art image-based and event-based VIO. Furthermore, we demonstrate the effectiveness of our pipeline through onboard closed-loop aggressive quadrotor flight and large-scale outdoor experiments. Videos of the evaluations can be found on our website: https://youtu.be/KnWZ4anBMK4. Note to Practitioners — Real-time closed-loop control of drones under aggressive motion and in broad illumination conditions demands robust state estimation, yet many existing VIO systems fail to meet these requirements due to the inherent limitations of standard cameras. Event cameras are bio-inspired sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate, and they can provide reliable visual perception during high-speed motion and in high-dynamic-range scenarios. Developing state estimation algorithms based on event cameras therefore offers exciting opportunities for robotics. However, adopting event cameras is challenging because event streams are composed of asynchronous events, which are fundamentally different from synchronous intensity images. Moreover, event cameras output minimal information, or even only noise, when the relative motion between the camera and the scene is limited, such as when the camera is stationary, while standard cameras provide rich perception information in most scenarios. In this paper, we propose a robust, highly accurate, and real-time optimization-based monocular event-based VIO framework that tightly fuses event, image, and IMU measurements. Owing to the well-designed framework and feature management, our system provides robust and reliable state estimation in challenging environments. Its efficiency is sufficient for real-time operation on resource-limited platforms, such as providing onboard pose feedback for quadrotor flights. | - |
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers | - |
dc.relation.ispartof | IEEE Transactions on Automation Science and Engineering | - |
dc.subject | aggressive quadrotor | - |
dc.subject | Cameras | - |
dc.subject | Event cameras | - |
dc.subject | event-based VIO | - |
dc.subject | Feature extraction | - |
dc.subject | Real-time systems | - |
dc.subject | robotics | - |
dc.subject | sensor fusion | - |
dc.subject | SLAM | - |
dc.subject | Standards | - |
dc.subject | State estimation | - |
dc.subject | Streams | - |
dc.subject | Tracking | - |
dc.title | PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/TASE.2023.3324365 | - |
dc.identifier.scopus | eid_2-s2.0-85174800512 | - |
dc.identifier.eissn | 1558-3783 | - |
dc.identifier.isi | WOS:001122429300001 | - |
dc.identifier.issnl | 1545-5955 | - |
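The abstract also names IMU pre-integration as one of the fused residuals. For reference, below is a sketch of the standard pre-integrated IMU residual between consecutive keyframes i and j, in the style of Forster et al.; the paper may parameterize states and biases differently.

```latex
% Standard pre-integrated IMU residual between keyframes i and j; a reference
% sketch under common conventions, not necessarily the paper's parameterization.
% \Delta\tilde{R}_{ij}, \Delta\tilde{v}_{ij}, \Delta\tilde{p}_{ij} are the
% pre-integrated gyro/accelerometer measurements; b denotes the IMU biases.
r_{\mathrm{IMU}} =
\begin{bmatrix}
  \mathrm{Log}\!\left( \Delta\tilde{R}_{ij}^{\top} R_{i}^{\top} R_{j} \right) \\
  R_{i}^{\top}\!\left( v_{j} - v_{i} - g\,\Delta t_{ij} \right) - \Delta\tilde{v}_{ij} \\
  R_{i}^{\top}\!\left( p_{j} - p_{i} - v_{i}\,\Delta t_{ij} - \tfrac{1}{2}\, g\,\Delta t_{ij}^{2} \right) - \Delta\tilde{p}_{ij} \\
  b_{j} - b_{i}
\end{bmatrix}
```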