Article: EVI-SAM: Robust, Real-Time, Tightly-Coupled Event–Visual–Inertial State Estimation and 3D Dense Mapping

Title: EVI-SAM: Robust, Real-Time, Tightly-Coupled Event–Visual–Inertial State Estimation and 3D Dense Mapping
Authors: Guan, Weipeng; Chen, Peiyu; Zhao, Huibin; Wang, Yu; Lu, Peng
Keywords: 6-DoF Pose Tracking; event cameras; event-based vision; robotics; simultaneous localization and mapping
Issue Date: 1-Jan-2024
Publisher: Wiley Open Access
Citation: Advanced Intelligent Systems, 2024
Abstract: Event cameras demonstrate substantial potential in handling challenging situations, such as motion blur and high dynamic range. Herein, event–visual–inertial state estimation and 3D dense mapping (EVI-SAM) are introduced to tackle the problem of pose tracking and 3D dense reconstruction using the monocular event camera. A novel event-based hybrid tracking framework is designed to estimate the pose, leveraging the robustness of feature matching and the precision of direct alignment. Specifically, an event-based 2D–2D alignment is developed to construct the photometric constraint and is tightly integrated with the event-based reprojection constraint. The mapping module recovers the dense and colorful depth of the scene through the image-guided event-based mapping method. Subsequently, the appearance, texture, and surface mesh of the 3D scene can be reconstructed by fusing the dense depth maps from multiple viewpoints using truncated signed distance function (TSDF) fusion. To the best of our knowledge, this is the first nonlearning work to realize event-based dense mapping. Numerical evaluations are performed on publicly available datasets, which qualitatively and quantitatively demonstrate the superior performance of our method. EVI-SAM effectively balances accuracy and robustness while maintaining computational efficiency, showcasing superior pose tracking and dense mapping performance in challenging scenarios.
Persistent Identifier: http://hdl.handle.net/10722/348558
ISSN: 2640-4567
2023 Impact Factor: 6.8
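The abstract's final mapping step, TSDF fusion, merges per-view dense depth maps into a single volumetric model. The sketch below is a minimal NumPy illustration of standard TSDF integration (a weighted running average per voxel); the function name, grid layout, and parameters are illustrative assumptions, not taken from the paper's implementation.

```python
import numpy as np

def tsdf_update(tsdf, weight, depth, K, T_cw, voxel_origin, voxel_size, trunc):
    """Integrate one depth map into a TSDF volume via a weighted running average.

    Illustrative sketch only -- not the EVI-SAM implementation. Assumes dense
    voxel grids `tsdf` and `weight` of shape (X, Y, Z), a pinhole intrinsic
    matrix K (3x3), and a world-to-camera pose T_cw (4x4).
    """
    X, Y, Z = tsdf.shape
    # World coordinates of every voxel centre
    ix, iy, iz = np.meshgrid(np.arange(X), np.arange(Y), np.arange(Z), indexing="ij")
    pts_w = voxel_origin + voxel_size * np.stack([ix, iy, iz], axis=-1)  # (X,Y,Z,3)
    # Transform voxel centres into the camera frame
    pts_c = pts_w @ T_cw[:3, :3].T + T_cw[:3, 3]
    z = pts_c[..., 2]
    # Project into the depth image with the pinhole model
    u = K[0, 0] * pts_c[..., 0] / z + K[0, 2]
    v = K[1, 1] * pts_c[..., 1] / z + K[1, 2]
    h, w = depth.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    ui = np.clip(u.astype(int), 0, w - 1)
    vi = np.clip(v.astype(int), 0, h - 1)
    d = depth[vi, ui]
    sdf = d - z                              # signed distance along the viewing ray
    valid &= (d > 0) & (sdf > -trunc)        # discard far-behind-surface voxels
    tsdf_obs = np.clip(sdf / trunc, -1.0, 1.0)
    # Weighted running average per voxel (each valid observation has weight 1)
    w_new = weight + valid
    upd = valid & (w_new > 0)
    tsdf[upd] = (tsdf[upd] * weight[upd] + tsdf_obs[upd]) / w_new[upd]
    weight[:] = w_new
    return tsdf, weight
```

After integrating depth maps from multiple viewpoints this way, a surface mesh can be extracted from the zero crossing of the fused volume (e.g. with marching cubes), which corresponds to the appearance/texture/mesh reconstruction step the abstract describes.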


DC Field: Value
dc.contributor.author: Guan, Weipeng
dc.contributor.author: Chen, Peiyu
dc.contributor.author: Zhao, Huibin
dc.contributor.author: Wang, Yu
dc.contributor.author: Lu, Peng
dc.date.accessioned: 2024-10-10T00:31:34Z
dc.date.available: 2024-10-10T00:31:34Z
dc.date.issued: 2024-01-01
dc.identifier.citation: Advanced Intelligent Systems, 2024
dc.identifier.issn: 2640-4567
dc.identifier.uri: http://hdl.handle.net/10722/348558
dc.description.abstract: Event cameras demonstrate substantial potential in handling challenging situations, such as motion blur and high dynamic range. Herein, event–visual–inertial state estimation and 3D dense mapping (EVI-SAM) are introduced to tackle the problem of pose tracking and 3D dense reconstruction using the monocular event camera. A novel event-based hybrid tracking framework is designed to estimate the pose, leveraging the robustness of feature matching and the precision of direct alignment. Specifically, an event-based 2D–2D alignment is developed to construct the photometric constraint and is tightly integrated with the event-based reprojection constraint. The mapping module recovers the dense and colorful depth of the scene through the image-guided event-based mapping method. Subsequently, the appearance, texture, and surface mesh of the 3D scene can be reconstructed by fusing the dense depth maps from multiple viewpoints using truncated signed distance function (TSDF) fusion. To the best of our knowledge, this is the first nonlearning work to realize event-based dense mapping. Numerical evaluations are performed on publicly available datasets, which qualitatively and quantitatively demonstrate the superior performance of our method. EVI-SAM effectively balances accuracy and robustness while maintaining computational efficiency, showcasing superior pose tracking and dense mapping performance in challenging scenarios.
dc.language: eng
dc.publisher: Wiley Open Access
dc.relation.ispartof: Advanced Intelligent Systems
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: 6-DoF Pose Tracking
dc.subject: event cameras
dc.subject: event-based vision
dc.subject: robotics
dc.subject: simultaneous localization and mapping
dc.title: EVI-SAM: Robust, Real-Time, Tightly-Coupled Event–Visual–Inertial State Estimation and 3D Dense Mapping
dc.type: Article
dc.identifier.doi: 10.1002/aisy.202400243
dc.identifier.scopus: eid_2-s2.0-85197905014
dc.identifier.eissn: 2640-4567
dc.identifier.issnl: 2640-4567

Export via OAI-PMH Interface in XML Formats


OR


Export to Other Non-XML Formats