Appears in Collections: postgraduate thesis: Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments
Title | Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments |
---|---|
Authors | Li, Liang (李梁) |
Advisors | Zhang, F; Lam, J |
Issue Date | 2024 |
Publisher | The University of Hong Kong (Pokfulam, Hong Kong) |
Citation | Li, L. [李梁]. (2024). Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. |
Abstract | In the realm of robotic perception, LiDAR-camera fusion has gained increasing research attention in recent years. Essential to this fusion process is the precise calibration of the intrinsic and extrinsic parameters of the sensor system. Although traditional target-based calibration methods are known for their accuracy, the dynamic nature of real-world applications often necessitates on-site, target-free calibration solutions. Despite its critical importance, targetless calibration for LiDAR-camera systems remains a formidable challenge, typically resulting in larger errors than target-based methods. This underscores an ongoing need within the research community for the development of innovative and more efficient calibration techniques to surmount these obstacles.
In this thesis, a novel targetless method is proposed for joint intrinsic and extrinsic calibration of LiDAR-camera systems. The method leverages LiDAR point cloud measurements from planes in the scene, alongside visual points derived from those planes. The core novelty of the method lies in the integration of visual bundle adjustment with the registration between visual points and LiDAR point cloud planes, which is formulated as a plane-constrained bundle adjustment problem. This unified formulation achieves concurrent intrinsic and extrinsic calibration, while also imparting depth constraints to the visual points to enhance the accuracy of intrinsic calibration. Experiments are conducted on both a self-collected dataset and public data sequences. The results show that the approach not only surpasses other state-of-the-art methods, but also maintains remarkable calibration accuracy even in challenging environments. For the benefit of the robotics community, the calibration software is open-sourced. |
Degree | Master of Philosophy |
Subject | Optical radar |
Dept/Program | Mechanical Engineering |
Persistent Identifier | http://hdl.handle.net/10722/342887 |
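The plane-constrained bundle adjustment described in the abstract combines two residual types: the standard visual reprojection error, and a point-to-plane distance that ties each visual point to a LiDAR-measured plane. The sketch below illustrates those two residuals under simple assumptions (pinhole camera, unit plane normals); it is an illustrative reconstruction from the abstract, not the author's released implementation, and all function names are hypothetical.

```python
import numpy as np

def point_to_plane_residual(p_world, plane_n, plane_d):
    """Signed distance of a 3D point to the plane n·p + d = 0.

    plane_n is assumed to be a unit normal; this is the depth
    constraint a LiDAR-measured plane imparts to a visual point.
    """
    return float(plane_n @ p_world + plane_d)

def reprojection_residual(p_world, R_cw, t_cw, K, uv_obs):
    """Pixel error of a world point seen through a pinhole camera.

    R_cw, t_cw : world-to-camera rotation and translation (extrinsics).
    K          : 3x3 intrinsic matrix (intrinsics).
    uv_obs     : observed pixel coordinates of the feature.
    """
    p_cam = R_cw @ p_world + t_cw          # transform into camera frame
    uvw = K @ p_cam                         # project with intrinsics
    uv_proj = uvw[:2] / uvw[2]              # perspective division
    return uv_proj - uv_obs

# In the joint formulation summarized by the abstract, both residuals are
# minimized simultaneously over the intrinsics, the LiDAR-camera extrinsics,
# and the visual points, with each visual point constrained to its plane.
```

A nonlinear least-squares solver would stack both residual types into one cost, which is what couples the intrinsic and extrinsic estimates instead of calibrating them in separate stages.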
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Zhang, F | - |
dc.contributor.advisor | Lam, J | - |
dc.contributor.author | Li, Liang | - |
dc.contributor.author | 李梁 | - |
dc.date.accessioned | 2024-05-07T01:22:10Z | - |
dc.date.available | 2024-05-07T01:22:10Z | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | Li, L. [李梁]. (2024). Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. | - |
dc.identifier.uri | http://hdl.handle.net/10722/342887 | - |
dc.description.abstract | In the realm of robotic perception, LiDAR-camera fusion has gained increasing research attention in recent years. Essential to this fusion process is the precise calibration of the intrinsic and extrinsic parameters of the sensor system. Although traditional target-based calibration methods are known for their accuracy, the dynamic nature of real-world applications often necessitates on-site, target-free calibration solutions. Despite its critical importance, targetless calibration for LiDAR-camera systems remains a formidable challenge, typically resulting in larger errors than target-based methods. This underscores an ongoing need within the research community for the development of innovative and more efficient calibration techniques to surmount these obstacles. In this thesis, a novel targetless method is proposed for joint intrinsic and extrinsic calibration of LiDAR-camera systems. The method leverages LiDAR point cloud measurements from planes in the scene, alongside visual points derived from those planes. The core novelty of the method lies in the integration of visual bundle adjustment with the registration between visual points and LiDAR point cloud planes, which is formulated as a plane-constrained bundle adjustment problem. This unified formulation achieves concurrent intrinsic and extrinsic calibration, while also imparting depth constraints to the visual points to enhance the accuracy of intrinsic calibration. Experiments are conducted on both a self-collected dataset and public data sequences. The results show that the approach not only surpasses other state-of-the-art methods, but also maintains remarkable calibration accuracy even in challenging environments. For the benefit of the robotics community, the calibration software is open-sourced. | - |
dc.language | eng | - |
dc.publisher | The University of Hong Kong (Pokfulam, Hong Kong) | - |
dc.relation.ispartof | HKU Theses Online (HKUTO) | - |
dc.rights | The author retains all proprietary rights (such as patent rights) and the right to use in future works. | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject.lcsh | Optical radar | - |
dc.title | Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera system in targetless environments | - |
dc.type | PG_Thesis | - |
dc.description.thesisname | Master of Philosophy | - |
dc.description.thesislevel | Master | - |
dc.description.thesisdiscipline | Mechanical Engineering | - |
dc.description.nature | published_or_final_version | - |
dc.date.hkucongregation | 2024 | - |
dc.identifier.mmsid | 991044791816403414 | - |