Article: Proximity based automatic data annotation for autonomous driving

Title: Proximity based automatic data annotation for autonomous driving
Authors: Sun, Chen; Vianney, Jean M. Uwabeza; Li, Ying; Chen, Long; Li, Li; Wang, Fei Yue; Khajepour, Amir; Cao, Dongpu
Issue Date: 2020
Citation: IEEE/CAA Journal of Automatica Sinica, 2020, v. 7, n. 2, p. 395-404
Abstract: Recent developments in autonomous driving involve high-level computer vision and detailed road scene understanding. Today, most autonomous vehicles employ expensive, high-quality sensor sets such as light detection and ranging (LiDAR) and HD maps with high-level annotations. In this paper, we propose a scalable and affordable data collection and annotation framework, image-to-map annotation proximity (I2MAP), for affordance learning in autonomous driving applications. We provide a new driving dataset built with the proposed framework for driving scene affordance learning by calibrating the data samples with available tags from online databases such as OpenStreetMap (OSM). Our benchmark consists of 40,000 images with more than 40 affordance labels captured under various times of day and weather conditions, including very challenging heavy snow. We implemented sample advanced driver-assistance system (ADAS) functions by training neural networks (NN) on our data and cross-validated the results on benchmarks such as KITTI and BDD100K, which indicates the effectiveness of our framework and trained models.
Persistent Identifier: http://hdl.handle.net/10722/352989
ISSN: 2329-9266
2023 Impact Factor: 15.3
2023 SCImago Journal Rankings: 4.696
ISI Accession Number: WOS:000519596200007
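
Note: the abstract above describes annotating GPS-tagged driving frames by their proximity to map features that carry OSM tags. The sketch below only illustrates that general idea as a minimal, hypothetical example; the data layout, the 30 m radius, and all class, field, and function names are assumptions for illustration, not the authors' I2MAP implementation.

```python
# Hypothetical sketch of proximity-based annotation: each GPS-tagged image frame
# is matched against nearby OSM-style map features, and the tags of features
# within a distance threshold become candidate labels for that frame.

import math
from dataclasses import dataclass, field


@dataclass
class MapFeature:
    lat: float
    lon: float
    tags: dict            # e.g. {"highway": "traffic_signals"}, as exported from OSM


@dataclass
class Frame:
    image_path: str
    lat: float
    lon: float
    labels: dict = field(default_factory=dict)


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def annotate_by_proximity(frames, features, radius_m=30.0):
    """Attach the tags of every map feature within radius_m to each frame."""
    for frame in frames:
        for feat in features:
            if haversine_m(frame.lat, frame.lon, feat.lat, feat.lon) <= radius_m:
                frame.labels.update(feat.tags)
    return frames


if __name__ == "__main__":
    features = [MapFeature(43.4723, -80.5449, {"highway": "traffic_signals"}),
                MapFeature(43.4730, -80.5460, {"maxspeed": "50"})]
    frames = [Frame("frame_000001.png", 43.4724, -80.5450)]
    for f in annotate_by_proximity(frames, features):
        # Only the feature ~14 m away falls inside the 30 m radius:
        print(f.image_path, f.labels)   # frame_000001.png {'highway': 'traffic_signals'}
```

In practice such a lookup would use a spatial index rather than a linear scan, but the example keeps the core proximity-matching step explicit.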

 

DC Field / Value
dc.contributor.author: Sun, Chen
dc.contributor.author: Vianney, Jean M. Uwabeza
dc.contributor.author: Li, Ying
dc.contributor.author: Chen, Long
dc.contributor.author: Li, Li
dc.contributor.author: Wang, Fei Yue
dc.contributor.author: Khajepour, Amir
dc.contributor.author: Cao, Dongpu
dc.date.accessioned: 2025-01-13T03:01:29Z
dc.date.available: 2025-01-13T03:01:29Z
dc.date.issued: 2020
dc.identifier.citation: IEEE/CAA Journal of Automatica Sinica, 2020, v. 7, n. 2, p. 395-404
dc.identifier.issn: 2329-9266
dc.identifier.uri: http://hdl.handle.net/10722/352989
dc.description.abstract: Recent developments in autonomous driving involve high-level computer vision and detailed road scene understanding. Today, most autonomous vehicles employ expensive, high-quality sensor sets such as light detection and ranging (LiDAR) and HD maps with high-level annotations. In this paper, we propose a scalable and affordable data collection and annotation framework, image-to-map annotation proximity (I2MAP), for affordance learning in autonomous driving applications. We provide a new driving dataset built with the proposed framework for driving scene affordance learning by calibrating the data samples with available tags from online databases such as OpenStreetMap (OSM). Our benchmark consists of 40,000 images with more than 40 affordance labels captured under various times of day and weather conditions, including very challenging heavy snow. We implemented sample advanced driver-assistance system (ADAS) functions by training neural networks (NN) on our data and cross-validated the results on benchmarks such as KITTI and BDD100K, which indicates the effectiveness of our framework and trained models.
dc.language: eng
dc.relation.ispartof: IEEE/CAA Journal of Automatica Sinica
dc.title: Proximity based automatic data annotation for autonomous driving
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/JAS.2020.1003033
dc.identifier.scopus: eid_2-s2.0-85081545707
dc.identifier.volume: 7
dc.identifier.issue: 2
dc.identifier.spage: 395
dc.identifier.epage: 404
dc.identifier.eissn: 2329-9274
dc.identifier.isi: WOS:000519596200007

Export via OAI-PMH Interface in XML Formats


OR


Export to Other Non-XML Formats