Links for fulltext (may require subscription):
- Publisher Website: 10.1016/j.autcon.2022.104380
- Scopus: eid_2-s2.0-85131224953
- WOS: WOS:000833416200001
Article: Reconstructing unseen spaces in collapsed structures for search and rescue via deep learning based radargram inversion
Title | Reconstructing unseen spaces in collapsed structures for search and rescue via deep learning based radargram inversion |
---|---|
Authors | Hu, D; Chen, J; Li, S |
Keywords | Collapsed structures; Deep learning; Disasters; Radar; Reconstruction; Search and rescue |
Issue Date | 2022 |
Citation | Automation in Construction, 2022, v. 140, article no. 104380 |
Abstract | This paper developed a novel deep learning-based approach to processing ground-penetrating radar (GPR) radargrams to reconstruct the occluded interior spaces of collapsed structures and extract essential information such as survivable void spaces to assist search-and-rescue operations. The proposed method innovatively exploits a generative adversarial network (GAN) to augment synthetic GPR training data and an end-to-end deep learning model to invert a GPR radargram to a permittivity map of the cross-section that can be further interpreted to reconstruct the interior scenarios of collapsed structures. First, to address the lack of training data with correct labels, synthetic GPR radargrams were generated from simulated scenarios of collapsed structures. The GAN was applied to augment the realism of synthetic GPR radargrams for training, providing a new mechanism for preparing and augmenting data that is difficult to collect from a real disaster site. Second, instead of detecting and segmenting nonintuitive features in GPR radargrams, a new encoder-decoder structure was trained using the augmented GPR radargrams to directly reconstruct permittivity maps corresponding to the cross-sections of collapsed structures. The visual Turing test indicated that the GAN substantially improved the realism of synthetic GPR radargrams. The proposed GPR inversion method achieved an R2 value of 0.93, a Mean Absolute Error (MAE) of 0.73, and a Structural Similarity Index Measure (SSIM) of 0.95 in inferring the material permittivity from synthetic radargrams. The predicted permittivity map was further used to identify void spaces, achieving an F1 score of 64.34%, a precision of 63.06%, and a recall of 71.84% at the pixel level. On the augmented radargrams, the network achieved an R2 value of 0.76, an MAE of 1.49, and an SSIM of 0.89 in predicting the permittivity map. The void detection on augmented radargrams achieved an F1 score of 45.20%, a precision of 45.35%, and a recall of 53.99% at the pixel level. The feasibility of using the proposed inversion network to reconstruct permittivity maps from radargrams is also experimentally tested and demonstrated in simulations of two multistory building collapses. |
Persistent Identifier | http://hdl.handle.net/10722/320858 |
ISSN | 0926-5805 (2023 Impact Factor: 9.6; 2023 SCImago Journal Rankings: 2.626) |
ISI Accession Number ID | WOS:000833416200001 |
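The abstract above reports map-level regression metrics (R2, MAE, SSIM) for the predicted permittivity maps and pixel-level precision/recall/F1 for void detection. As a point of reference only, the sketch below shows one conventional way such metrics are computed; it is not the authors' code, and the array names and the air-like permittivity threshold used to mark "void" pixels are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): map-level regression metrics
# and pixel-level void-detection metrics for a predicted permittivity map.
# Assumptions: `true_map` and `pred_map` are 2-D permittivity arrays of equal
# shape; a pixel counts as "void" when its permittivity is near that of air
# (the threshold below is an assumed illustrative value).
import numpy as np
from sklearn.metrics import r2_score, mean_absolute_error, precision_recall_fscore_support
from skimage.metrics import structural_similarity

def map_metrics(true_map: np.ndarray, pred_map: np.ndarray) -> dict:
    """R2, MAE, and SSIM between ground-truth and predicted permittivity maps."""
    r2 = r2_score(true_map.ravel(), pred_map.ravel())
    mae = mean_absolute_error(true_map.ravel(), pred_map.ravel())
    ssim = structural_similarity(true_map, pred_map,
                                 data_range=true_map.max() - true_map.min())
    return {"R2": r2, "MAE": mae, "SSIM": ssim}

def void_metrics(true_map: np.ndarray, pred_map: np.ndarray, void_eps: float = 1.5) -> dict:
    """Pixel-level precision, recall, and F1 for void (air-like) regions."""
    true_void = (true_map <= void_eps).ravel()
    pred_void = (pred_map <= void_eps).ravel()
    p, r, f1, _ = precision_recall_fscore_support(true_void, pred_void,
                                                  average="binary", zero_division=0)
    return {"precision": p, "recall": r, "F1": f1}

if __name__ == "__main__":
    # Purely synthetic arrays, just to exercise the two functions.
    rng = np.random.default_rng(0)
    truth = rng.uniform(1.0, 9.0, size=(128, 128))
    pred = truth + rng.normal(0.0, 0.5, size=truth.shape)
    print(map_metrics(truth, pred))
    print(void_metrics(truth, pred))
```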
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hu, D | - |
dc.contributor.author | Chen, J | - |
dc.contributor.author | Li, S | - |
dc.date.accessioned | 2022-11-01T04:42:35Z | - |
dc.date.available | 2022-11-01T04:42:35Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | Automation in Construction, 2022, v. 140, article no. 104380 | - |
dc.identifier.issn | 0926-5805 | - |
dc.identifier.uri | http://hdl.handle.net/10722/320858 | - |
dc.description.abstract | This paper developed a novel deep learning-based approach to processing ground-penetrating radar (GPR) radargrams to reconstruct the occluded interior spaces of collapsed structures and extract essential information such as survivable void spaces to assist search-and-rescue operations. The proposed method innovatively exploits a generative adversarial network (GAN) to augment synthetic GPR training data and an end-to-end deep learning model to invert a GPR radargram to a permittivity map of the cross-section that can be further interpreted to reconstruct the interior scenarios of collapsed structures. First, to address the lack of training data with correct labels, synthetic GPR radargrams were generated from simulated scenarios of collapsed structures. The GAN was applied to augment the realism of synthetic GPR radargrams for training, providing a new mechanism for preparing and augmenting data that is difficult to collect from a real disaster site. Second, instead of detecting and segmenting nonintuitive features in GPR radargrams, a new encoder-decoder structure was trained using the augmented GPR radargrams to directly reconstruct permittivity maps corresponding to the cross-sections of collapsed structures. The visual Turing test indicated that the GAN substantially improved the realism of synthetic GPR radargrams. The proposed GPR inversion method achieved an R2 value of 0.93, a Mean Absolute Error (MAE) of 0.73, and a Structural Similarity Index Measure (SSIM) of 0.95 in inferring the material permittivity from synthetic radargrams. The predicted permittivity map was further used to identify void spaces, achieving an F1 score of 64.34%, a precision of 63.06%, and a recall of 71.84% at the pixel level. On the augmented radargrams, the network achieved an R2 value of 0.76, an MAE of 1.49, and an SSIM of 0.89 in predicting the permittivity map. The void detection on augmented radargrams achieved an F1 score of 45.20%, a precision of 45.35%, and a recall of 53.99% at the pixel level. The feasibility of using the proposed inversion network to reconstruct permittivity maps from radargrams is also experimentally tested and demonstrated in simulations of two multistory building collapses. | - |
dc.language | eng | - |
dc.relation.ispartof | Automation in Construction | - |
dc.subject | Collapsed structures | - |
dc.subject | Deep learning | - |
dc.subject | Disasters | - |
dc.subject | Radar | - |
dc.subject | Reconstruction | - |
dc.subject | Search and rescue | - |
dc.title | Reconstructing unseen spaces in collapsed structures for search and rescue via deep learning based radargram inversion | - |
dc.type | Article | - |
dc.identifier.email | Chen, J: chenjj10@hku.hk | - |
dc.identifier.authority | Chen, J=rp03048 | - |
dc.identifier.doi | 10.1016/j.autcon.2022.104380 | - |
dc.identifier.scopus | eid_2-s2.0-85131224953 | - |
dc.identifier.hkuros | 340910 | - |
dc.identifier.volume | 140 | - |
dc.identifier.spage | article no. 104380 | - |
dc.identifier.epage | article no. 104380 | - |
dc.identifier.isi | WOS:000833416200001 | - |
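The record describes an end-to-end encoder-decoder that inverts a GPR radargram (B-scan) directly into a permittivity map of the structural cross-section. The paper's actual architecture is not reproduced here; the following is a minimal, hedged sketch of such an image-to-image inversion network in PyTorch, with assumed input size, layer widths, and loss choice.

```python
# Minimal sketch (assumed architecture, not the paper's): a convolutional
# encoder-decoder mapping a single-channel GPR radargram to a single-channel
# relative-permittivity map of the same spatial size.
import torch
import torch.nn as nn

class RadargramInversionNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: progressively downsample the radargram.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to the cross-section resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, radargram: torch.Tensor) -> torch.Tensor:
        # radargram: (batch, 1, H, W); returns a permittivity map of the same size.
        return self.decoder(self.encoder(radargram))

if __name__ == "__main__":
    # Illustrative forward pass with fake data and an MAE-style regression loss.
    model = RadargramInversionNet()
    radargrams = torch.randn(2, 1, 256, 256)        # fake B-scans
    targets = torch.rand(2, 1, 256, 256) * 9.0      # fake permittivity maps
    loss = nn.L1Loss()(model(radargrams), targets)
    print(loss.item())
```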