Article: PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features

Title: PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features
Authors: Guan, Weipeng; Chen, Peiyu; Xie, Yuhan; Lu, Peng
Keywords: aggressive quadrotor; Cameras; Event cameras; event-based VIO; Feature extraction; Real-time systems; robotics; sensor fusion; SLAM; Standards; State estimation; Streams; Tracking
Issue Date: 19-Oct-2023
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Automation Science and Engineering, 2023
Abstract

Robust state estimation in challenging situations remains an unsolved problem, especially when onboard pose feedback control is required for aggressive motion. In this paper, we propose a robust, real-time event-based visual-inertial odometry (VIO) that incorporates event, image, and inertial measurements. Our approach uses line-based event features to provide additional structure and constraint information in human-made scenes, while point-based event and image features complement each other through well-designed feature management. To achieve reliable state estimation, we tightly couple the point-based and line-based visual residuals from the event camera, the point-based visual residual from the standard camera, and the residual from IMU pre-integration in a keyframe-based graph optimization framework. Experiments on public benchmark datasets show that our method achieves superior performance compared with state-of-the-art image-based and event-based VIO. Furthermore, we demonstrate the effectiveness of our pipeline through onboard closed-loop aggressive quadrotor flight and large-scale outdoor experiments. Videos of the evaluations can be found on our website: https://youtu.be/KnWZ4anBMK4.

Note to Practitioners: This work is driven by the need for real-time closed-loop control of drones under aggressive motion and across broad illumination conditions, requirements that many existing VIO systems fail to meet due to the inherent limitations of standard cameras. Event cameras are bio-inspired sensors that capture pixel-level illumination changes rather than intensity images at a fixed frame rate, and can therefore provide reliable visual perception during high-speed motion and in high-dynamic-range scenarios. Developing state estimation algorithms based on event cameras thus offers exciting opportunities for robotics. Adopting event cameras is challenging, however, because event streams are composed of asynchronous events, which are fundamentally different from synchronous intensity images. Moreover, event cameras output little information, or even pure noise, when the relative motion between camera and scene is small, such as when the camera is still, whereas standard cameras provide rich perception information in most scenarios. In this paper, we propose a robust, highly accurate, real-time, optimization-based monocular event-based VIO framework that tightly fuses event, image, and IMU measurements. Owing to the well-designed framework and careful feature management, our system provides robust and reliable state estimation in challenging environments, and it is efficient enough to run in real time on resource-limited platforms, for example providing onboard pose feedback for quadrotor flight.
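As a reading aid for the tightly coupled optimization the abstract describes, the joint objective of such a keyframe-based system can be sketched as one nonlinear least-squares problem over the sliding-window state. This is a generic VIO-style sketch under assumed notation; all symbols below are illustrative, not taken from the paper:

    \min_{\mathcal{X}} \Bigg\{ \left\| r_{\mathrm{prior}}(\mathcal{X}) \right\|^{2}
      + \sum_{k \in \mathcal{B}} \left\| r_{\mathrm{imu}}(\hat{z}_{k,k+1}, \mathcal{X}) \right\|^{2}_{\Sigma_{k}}
      + \sum_{(i,j) \in \mathcal{P}_{\mathrm{ev}}} \rho\!\left( \left\| r^{\mathrm{ev}}_{\mathrm{pt}}(\hat{z}_{ij}, \mathcal{X}) \right\|^{2} \right)
      + \sum_{(i,l) \in \mathcal{L}_{\mathrm{ev}}} \rho\!\left( \left\| r^{\mathrm{ev}}_{\mathrm{ln}}(\hat{z}_{il}, \mathcal{X}) \right\|^{2} \right)
      + \sum_{(i,j) \in \mathcal{P}_{\mathrm{im}}} \rho\!\left( \left\| r^{\mathrm{im}}_{\mathrm{pt}}(\hat{z}_{ij}, \mathcal{X}) \right\|^{2} \right) \Bigg\}

Here \mathcal{X} would stack the keyframe poses, velocities, IMU biases, and point/line landmark parameters; r_{\mathrm{prior}} is a marginalization prior, r_{\mathrm{imu}} the IMU pre-integration residual between consecutive keyframes in the window \mathcal{B}, and the last three sums the event-point, event-line, and image-point reprojection residuals, each wrapped in a robust loss \rho.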


Persistent Identifier: http://hdl.handle.net/10722/339579
ISSN: 1545-5955
2023 Impact Factor: 5.9
2023 SCImago Journal Rankings: 2.144
ISI Accession Number ID: WOS:001122429300001

dc.contributor.author: Guan, Weipeng
dc.contributor.author: Chen, Peiyu
dc.contributor.author: Xie, Yuhan
dc.contributor.author: Lu, Peng
dc.date.accessioned: 2024-03-11T10:37:46Z
dc.date.available: 2024-03-11T10:37:46Z
dc.date.issued: 2023-10-19
dc.identifier.citation: IEEE Transactions on Automation Science and Engineering, 2023
dc.identifier.issn: 1545-5955
dc.identifier.uri: http://hdl.handle.net/10722/339579
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Automation Science and Engineering
dc.subject: aggressive quadrotor
dc.subject: Cameras
dc.subject: Event cameras
dc.subject: event-based VIO
dc.subject: Feature extraction
dc.subject: Real-time systems
dc.subject: robotics
dc.subject: sensor fusion
dc.subject: SLAM
dc.subject: Standards
dc.subject: State estimation
dc.subject: Streams
dc.subject: Tracking
dc.title: PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features
dc.type: Article
dc.identifier.doi: 10.1109/TASE.2023.3324365
dc.identifier.scopus: eid_2-s2.0-85174800512
dc.identifier.eissn: 1558-3783
dc.identifier.isi: WOS:001122429300001
dc.identifier.issnl: 1545-5955
