
Postgraduate thesis: Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments

Title: Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments
Authors: Li, Liang (李梁)
Advisors: Zhang, F; Lam, J
Issue Date: 2024
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Li, L. [李梁]. (2024). Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
Abstract: In the realm of robotic perception, LiDAR-camera fusion has gained increasing research attention in recent years. Essential to this fusion process is the precise calibration of the intrinsic and extrinsic parameters of the sensor system. Although traditional target-based calibration methods are known for their accuracy, the dynamic nature of real-world applications often necessitates on-site, target-free calibration solutions. Despite its critical importance, targetless calibration for LiDAR-camera systems remains a formidable challenge, typically resulting in larger errors than target-based methods. This underscores an ongoing need within the research community for innovative and more efficient calibration techniques to surmount these obstacles. In this thesis, a novel targetless method is proposed for joint intrinsic and extrinsic calibration of LiDAR-camera systems. The method leverages LiDAR point cloud measurements from planes in the scene, alongside visual points derived from those planes. The core novelty of the method lies in the integration of visual bundle adjustment with the registration between visual points and LiDAR point cloud planes, formulated as a plane-constrained bundle adjustment problem. This unified formulation achieves concurrent intrinsic and extrinsic calibration, while also imposing depth constraints on the visual points to enhance the accuracy of intrinsic calibration. Experiments are conducted on both a self-collected dataset and public data sequences. The results show that the approach not only surpasses other state-of-the-art methods, but also maintains remarkable calibration accuracy even in challenging environments. For the benefit of the robotics community, the calibration software is open-sourced.
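The plane-constrained bundle adjustment described in the abstract couples standard visual reprojection residuals with point-to-plane distances between the visual points and the LiDAR-measured planes. The following is a minimal sketch of that coupling, not the thesis implementation: the function names, the scalar weight `w`, and the single-plane toy setup are all assumptions made for illustration.

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of 3-D points X (N, 3) into pixel coordinates."""
    Xc = (R @ X.T).T + t            # points expressed in the camera frame
    uv = (K @ Xc.T).T               # homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]   # perspective division

def joint_residuals(K, R, t, X, uv_obs, n, d, w=1.0):
    """Illustrative plane-constrained BA residual: reprojection error of
    the visual points stacked with their signed distance to the LiDAR
    plane n . x + d = 0 (n a unit normal); w weights the plane term."""
    r_proj = (project(K, R, t, X) - uv_obs).ravel()  # reprojection error
    r_plane = w * (X @ n + d)                        # point-to-plane distance
    return np.concatenate([r_proj, r_plane])
```

Stacking both residual types into a single vector is what lets one nonlinear least-squares solve (e.g. with `scipy.optimize.least_squares`) refine the intrinsics `K`, the extrinsics `(R, t)`, and the visual points `X` simultaneously, which is the sense in which the calibration is "joint": the LiDAR planes anchor the otherwise scale- and depth-ambiguous visual points.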
Degree: Master of Philosophy
Subject: Optical radar
Dept/Program: Mechanical Engineering
Persistent Identifier: http://hdl.handle.net/10722/342887

 

DC Field: Value
dc.contributor.advisor: Zhang, F
dc.contributor.advisor: Lam, J
dc.contributor.author: Li, Liang
dc.contributor.author: 李梁
dc.date.accessioned: 2024-05-07T01:22:10Z
dc.date.available: 2024-05-07T01:22:10Z
dc.date.issued: 2024
dc.identifier.citation: Li, L. [李梁]. (2024). Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
dc.identifier.uri: http://hdl.handle.net/10722/342887
dc.language: eng
dc.publisher: The University of Hong Kong (Pokfulam, Hong Kong)
dc.relation.ispartof: HKU Theses Online (HKUTO)
dc.rights: The author retains all proprietary rights (such as patent rights) and the right to use in future works.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject.lcsh: Optical radar
dc.title: Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments
dc.type: PG_Thesis
dc.description.thesisname: Master of Philosophy
dc.description.thesislevel: Master
dc.description.thesisdiscipline: Mechanical Engineering
dc.description.nature: published_or_final_version
dc.date.hkucongregation: 2024
dc.identifier.mmsid: 991044791816403414
