Article: Reconstructing unseen spaces in collapsed structures for search and rescue via deep learning based radargram inversion

Title: Reconstructing unseen spaces in collapsed structures for search and rescue via deep learning based radargram inversion
Authors: Hu, D; Chen, J; Li, S
Keywords: Collapsed structures; Deep learning; Disasters; Radar; Reconstruction; Search and rescue
Issue Date: 2022
Citation: Automation in Construction, 2022, v. 140, article no. 104380
Abstract: This paper developed a novel deep learning-based approach to processing ground-penetrating radar (GPR) radargrams to reconstruct the occluded interior spaces of collapsed structures and extract essential information such as survivable void spaces to assist search-and-rescue operations. The proposed method innovatively exploits a generative adversarial network (GAN) to augment synthetic GPR training data and an end-to-end deep learning model to invert a GPR radargram to a permittivity map of the cross-section that can be further interpreted to reconstruct the interior scenarios of collapsed structures. First, to address the lack of training data with correct labels, synthetic GPR radargrams were generated from simulated scenarios of collapsed structures. The GAN was applied to augment the realism of synthetic GPR radargrams for training, providing a new mechanism for preparing and augmenting data that is difficult to collect from a real disaster site. Second, instead of detecting and segmenting nonintuitive features in GPR radargrams, a new encoder-decoder structure was trained using the augmented GPR radargrams to directly reconstruct permittivity maps corresponding to the cross-sections of collapsed structures. The visual Turing test indicated that the GAN substantially improved the realism of synthetic GPR radargrams. The proposed GPR inversion method achieved an R2 value of 0.93, a Mean Absolute Error (MAE) of 0.73, and a Structural Similarity Index Measure (SSIM) of 0.95 in inferring the material permittivity from synthetic radargrams. The predicted permittivity map was further used to identify void spaces, achieving an F1 score of 64.34%, a precision of 63.06%, and a recall of 71.84% at the pixel level. On the augmented radargrams, the network achieved an R2 value of 0.76, an MAE of 1.49, and an SSIM of 0.89 in predicting the permittivity map. The void detection on augmented radargrams achieved an F1 score of 45.20%, a precision of 45.35%, and a recall of 53.99% at the pixel level. The feasibility of using the proposed inversion network to reconstruct permittivity maps from radargrams is also experimentally tested and demonstrated in simulations of two multistory building collapses.
Persistent Identifier: http://hdl.handle.net/10722/320858
ISSN: 0926-5805
2021 Impact Factor: 10.517
2020 SCImago Journal Rankings: 1.837
ISI Accession Number: WOS:000833416200001
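The abstract above describes two learned components but gives no architectural details. The first is a GAN used to improve the realism of synthetic, simulation-generated radargrams before they are used for training. The PyTorch sketch below is only one plausible realization of that idea, a SimGAN-style refiner paired with a patch discriminator; the class names, layer sizes, and the L1 self-regularization weight are illustrative assumptions, not the authors' design.

```python
# Illustrative sketch only: the paper's GAN architecture is not given in this
# record. Assumed setup: a refiner translates simulated radargrams toward the
# distribution of field radargrams, and a patch discriminator supplies the
# adversarial signal (SimGAN/pix2pix-style); all sizes are placeholders.
import torch
import torch.nn as nn

class Refiner(nn.Module):
    """Maps a 1-channel synthetic radargram to a more realistic radargram."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        # Residual refinement preserves the reflection geometry of the input.
        return torch.tanh(x + self.net(x))

class PatchDiscriminator(nn.Module):
    """Scores overlapping patches as real (field data) vs. refined (synthetic)."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ch, ch * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ch * 2, 1, 4, padding=1),  # per-patch real/fake logits
        )

    def forward(self, x):
        return self.net(x)

def gan_losses(refiner, disc, synthetic, real, lam=10.0):
    """Adversarial losses plus an L1 self-regularization that keeps the refined
    radargram close to the simulation it came from (assumed weighting)."""
    bce = nn.BCEWithLogitsLoss()
    refined = refiner(synthetic)

    # Discriminator: distinguish real field radargrams from refined synthetic ones.
    d_real, d_fake = disc(real), disc(refined.detach())
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))

    # Refiner: fool the discriminator while staying close to the simulated input.
    g_fake = disc(refined)
    g_loss = bce(g_fake, torch.ones_like(g_fake)) + lam * nn.functional.l1_loss(refined, synthetic)
    return d_loss, g_loss
```

The second component is an end-to-end encoder-decoder that inverts a radargram directly to a permittivity map of the structure's cross-section. Only this input/output behavior is stated in the record, so the sketch below is a minimal U-Net-style stand-in; channel counts, depth, and the positive-permittivity output head are assumptions for illustration, not the network reported in the paper.

```python
# Illustrative sketch only: the paper's inversion network is described here
# simply as an end-to-end encoder-decoder mapping a GPR radargram to a
# permittivity map. This minimal U-Net-style stand-in uses assumed sizes.
import torch
import torch.nn as nn

def conv_block(cin, cout):
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=1), nn.BatchNorm2d(cout), nn.ReLU(inplace=True),
        nn.Conv2d(cout, cout, 3, padding=1), nn.BatchNorm2d(cout), nn.ReLU(inplace=True),
    )

class RadargramToPermittivity(nn.Module):
    """Encoder-decoder that regresses a relative-permittivity map from a radargram."""
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2, self.enc3 = conv_block(1, 32), conv_block(32, 64), conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec2, self.dec1 = conv_block(128 + 64, 64), conv_block(64 + 32, 32)
        self.head = nn.Conv2d(32, 1, 1)

    def forward(self, radargram):
        e1 = self.enc1(radargram)        # full resolution
        e2 = self.enc2(self.pool(e1))    # 1/2 resolution
        e3 = self.enc3(self.pool(e2))    # 1/4 resolution (bottleneck)
        d2 = self.dec2(torch.cat([self.up(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up(d2), e1], dim=1))
        # Softplus + 1 keeps the predicted relative permittivity above 1 (air).
        return nn.functional.softplus(self.head(d1)) + 1.0

if __name__ == "__main__":
    # One 256x256 radargram in, one 256x256 permittivity map out.
    model = RadargramToPermittivity()
    print(model(torch.randn(1, 1, 256, 256)).shape)  # torch.Size([1, 1, 256, 256])
```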

 

DC Field: Value
dc.contributor.author: Hu, D
dc.contributor.author: Chen, J
dc.contributor.author: Li, S
dc.date.accessioned: 2022-11-01T04:42:35Z
dc.date.available: 2022-11-01T04:42:35Z
dc.date.issued: 2022
dc.identifier.citation: Automation in Construction, 2022, v. 140, article no. 104380
dc.identifier.issn: 0926-5805
dc.identifier.uri: http://hdl.handle.net/10722/320858
dc.description.abstract: This paper developed a novel deep learning-based approach to processing ground-penetrating radar (GPR) radargrams to reconstruct the occluded interior spaces of collapsed structures and extract essential information such as survivable void spaces to assist search-and-rescue operations. The proposed method innovatively exploits a generative adversarial network (GAN) to augment synthetic GPR training data and an end-to-end deep learning model to invert a GPR radargram to a permittivity map of the cross-section that can be further interpreted to reconstruct the interior scenarios of collapsed structures. First, to address the lack of training data with correct labels, synthetic GPR radargrams were generated from simulated scenarios of collapsed structures. The GAN was applied to augment the realism of synthetic GPR radargrams for training, providing a new mechanism for preparing and augmenting data that is difficult to collect from a real disaster site. Second, instead of detecting and segmenting nonintuitive features in GPR radargrams, a new encoder-decoder structure was trained using the augmented GPR radargrams to directly reconstruct permittivity maps corresponding to the cross-sections of collapsed structures. The visual Turing test indicated that the GAN substantially improved the realism of synthetic GPR radargrams. The proposed GPR inversion method achieved an R2 value of 0.93, a Mean Absolute Error (MAE) of 0.73, and a Structural Similarity Index Measure (SSIM) of 0.95 in inferring the material permittivity from synthetic radargrams. The predicted permittivity map was further used to identify void spaces, achieving an F1 score of 64.34%, a precision of 63.06%, and a recall of 71.84% at the pixel level. On the augmented radargrams, the network achieved an R2 value of 0.76, an MAE of 1.49, and an SSIM of 0.89 in predicting the permittivity map. The void detection on augmented radargrams achieved an F1 score of 45.20%, a precision of 45.35%, and a recall of 53.99% at the pixel level. The feasibility of using the proposed inversion network to reconstruct permittivity maps from radargrams is also experimentally tested and demonstrated in simulations of two multistory building collapses.
dc.language: eng
dc.relation.ispartof: Automation in Construction
dc.subject: Collapsed structures
dc.subject: Deep learning
dc.subject: Disasters
dc.subject: Radar
dc.subject: Reconstruction
dc.subject: Search and rescue
dc.title: Reconstructing unseen spaces in collapsed structures for search and rescue via deep learning based radargram inversion
dc.type: Article
dc.identifier.email: Chen, J: chenjj10@hku.hk
dc.identifier.authority: Chen, J=rp03048
dc.identifier.doi: 10.1016/j.autcon.2022.104380
dc.identifier.scopus: eid_2-s2.0-85131224953
dc.identifier.hkuros: 340910
dc.identifier.volume: 140
dc.identifier.spage: article no. 104380
dc.identifier.epage: article no. 104380
dc.identifier.isi: WOS:000833416200001
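For the quantitative results quoted in the abstract (R2, MAE, and SSIM on the predicted permittivity maps, plus pixel-level precision, recall, and F1 for void detection), the record does not spell out the computations or how void pixels are delineated. The sketch below shows one way such metrics could be computed with NumPy and scikit-image; the rule of calling a pixel "void" when its permittivity falls below a threshold near that of air is an assumption.

```python
# Illustrative evaluation sketch matching the metric names quoted in the
# abstract (R^2, MAE, SSIM for permittivity regression; pixel-level precision,
# recall, F1 for void detection). The void threshold is an assumption.
import numpy as np
from skimage.metrics import structural_similarity as ssim

def regression_metrics(pred, true):
    """R^2, MAE, and SSIM between predicted and ground-truth permittivity maps."""
    pred, true = np.asarray(pred, float), np.asarray(true, float)
    mae = float(np.mean(np.abs(pred - true)))
    r2 = 1.0 - np.sum((true - pred) ** 2) / np.sum((true - true.mean()) ** 2)
    s = ssim(true, pred, data_range=float(true.max() - true.min()))
    return r2, mae, s

def void_detection_metrics(pred, true, void_eps=1.5):
    """Pixel-level precision, recall, and F1 for void (air-filled) regions,
    treating permittivity below `void_eps` as void (assumed labeling rule)."""
    pred_void = np.asarray(pred) < void_eps
    true_void = np.asarray(true) < void_eps
    tp = float(np.sum(pred_void & true_void))
    fp = float(np.sum(pred_void & ~true_void))
    fn = float(np.sum(~pred_void & true_void))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```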
