Article: MonoEF: Extrinsic Parameter Free Monocular 3D Object Detection

Title: MonoEF: Extrinsic Parameter Free Monocular 3D Object Detection
Authors: Zhou, Yunsong; He, Yuan; Zhu, Hongzi; Wang, Cheng; Li, Hongyang; Jiang, Qinhong
Keywords: autonomous driving; camera extrinsic parameter; Monocular 3D object detection
Issue Date: 2022
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, v. 44, n. 12, p. 10114-10128
Abstract: Monocular 3D object detection is an important task in autonomous driving. It becomes intractable when the ego-car pose changes with respect to the ground plane, which is common due to slight fluctuations in road smoothness and slope. Lacking insight from industrial applications, existing methods on open datasets neglect camera pose information, which inevitably makes the detector susceptible to camera extrinsic parameters. Such perturbation is prevalent in most autonomous driving cases for industrial products. To this end, we propose a novel method that captures the camera pose to make the detector free from extrinsic perturbation. Specifically, the proposed framework predicts camera extrinsic parameters by detecting vanishing point and horizon changes, and a converter is designed to rectify the perturbed features in the latent space. As a result, our 3D detector works independently of extrinsic parameter variations and produces accurate results in realistic cases, e.g., potholed and uneven roads, which almost all existing monocular detectors fail to handle. Experiments demonstrate that our method outperforms other state-of-the-art methods by a large margin on both the KITTI 3D and nuScenes datasets.
Persistent Identifier: http://hdl.handle.net/10722/351437
ISSN: 0162-8828
2023 Impact Factor: 20.8
2023 SCImago Journal Rankings: 6.158
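
The abstract above rests on a simple geometric fact: a change in the camera's extrinsic pose relative to the ground plane (e.g., pitch from a pothole or braking) shifts the vanishing point and horizon in the image, and that shift can be inverted to recover the pose change. The following is a minimal sketch of that geometry only, not the paper's MonoEF implementation; the pinhole intrinsics are assumed KITTI-like values and the rotation conventions are illustrative, and the paper's learned converter that rectifies features in the latent space is not covered here.

# Illustrative sketch: recover a small camera pitch perturbation from the
# vertical shift of the forward (ground-plane) vanishing point.
# Intrinsics and sign conventions below are assumptions, not from the paper.
import numpy as np

fx, fy, cx, cy = 721.5, 721.5, 609.6, 172.9   # assumed KITTI-like pinhole intrinsics
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def vanishing_point(pitch, roll):
    """Pixel location of the ground-plane forward vanishing point when the
    camera is perturbed by `pitch` and `roll` (radians); roll also tilts the
    horizon line, which is not demonstrated here."""
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(pitch), -np.sin(pitch)],
                   [0.0, np.sin(pitch),  np.cos(pitch)]])
    Rz = np.array([[np.cos(roll), -np.sin(roll), 0.0],
                   [np.sin(roll),  np.cos(roll), 0.0],
                   [0.0, 0.0, 1.0]])
    d = Rz @ Rx @ np.array([0.0, 0.0, 1.0])   # forward axis in the perturbed camera frame
    p = K @ d                                  # project the direction with the intrinsics
    return p[:2] / p[2]                        # pixel coordinates (u, v)

u0, v0 = vanishing_point(0.0, 0.0)              # unperturbed: vanishing point at (cx, cy)
u1, v1 = vanishing_point(np.deg2rad(2.0), 0.0)  # 2-degree pitch perturbation

# Invert the shift: a pitch of theta moves the vanishing point by fy * tan(theta) pixels.
pitch_est = np.arctan2(v0 - v1, fy)
print(f"vanishing point shift: {v1 - v0:+.1f} px vertically")
print(f"recovered pitch: {np.rad2deg(pitch_est):.2f} deg")

Under this convention a 2-degree pitch moves the vanishing point by roughly fy * tan(2 deg), about 25 pixels, which is the shift the sketch reads back as the extrinsic perturbation.
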

 

DC Field | Value | Language
dc.contributor.author | Zhou, Yunsong | -
dc.contributor.author | He, Yuan | -
dc.contributor.author | Zhu, Hongzi | -
dc.contributor.author | Wang, Cheng | -
dc.contributor.author | Li, Hongyang | -
dc.contributor.author | Jiang, Qinhong | -
dc.date.accessioned | 2024-11-20T03:56:16Z | -
dc.date.available | 2024-11-20T03:56:16Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, v. 44, n. 12, p. 10114-10128 | -
dc.identifier.issn | 0162-8828 | -
dc.identifier.uri | http://hdl.handle.net/10722/351437 | -
dc.description.abstract | Monocular 3D object detection is an important task in autonomous driving. It becomes intractable when the ego-car pose changes with respect to the ground plane, which is common due to slight fluctuations in road smoothness and slope. Lacking insight from industrial applications, existing methods on open datasets neglect camera pose information, which inevitably makes the detector susceptible to camera extrinsic parameters. Such perturbation is prevalent in most autonomous driving cases for industrial products. To this end, we propose a novel method that captures the camera pose to make the detector free from extrinsic perturbation. Specifically, the proposed framework predicts camera extrinsic parameters by detecting vanishing point and horizon changes, and a converter is designed to rectify the perturbed features in the latent space. As a result, our 3D detector works independently of extrinsic parameter variations and produces accurate results in realistic cases, e.g., potholed and uneven roads, which almost all existing monocular detectors fail to handle. Experiments demonstrate that our method outperforms other state-of-the-art methods by a large margin on both the KITTI 3D and nuScenes datasets. | -
dc.language | eng | -
dc.relation.ispartof | IEEE Transactions on Pattern Analysis and Machine Intelligence | -
dc.subject | autonomous driving | -
dc.subject | camera extrinsic parameter | -
dc.subject | Monocular 3D object detection | -
dc.title | MonoEF: Extrinsic Parameter Free Monocular 3D Object Detection | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/TPAMI.2021.3136899 | -
dc.identifier.pmid | 34932471 | -
dc.identifier.scopus | eid_2-s2.0-85122059240 | -
dc.identifier.volume | 44 | -
dc.identifier.issue | 12 | -
dc.identifier.spage | 10114 | -
dc.identifier.epage | 10128 | -
dc.identifier.eissn | 1939-3539 | -
