Links for fulltext (may require subscription):
- Publisher Website (DOI): https://doi.org/10.1007/978-3-030-00919-9_17
- Scopus: eid_2-s2.0-85054536895
- Web of Science: WOS:000477767800017
Conference Paper: Semantic-aware generative adversarial nets for unsupervised domain adaptation in chest X-ray segmentation
| Title | Semantic-aware generative adversarial nets for unsupervised domain adaptation in chest X-ray segmentation |
|---|---|
| Authors | Chen, Cheng; Dou, Qi; Chen, Hao; Heng, Pheng Ann |
| Issue Date | 2018 |
| Citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2018, v. 11046 LNCS, p. 143-151 |
| Abstract | In spite of the compelling achievements that deep neural networks (DNNs) have made in medical image computing, these deep models often suffer from degraded performance when being applied to new test datasets with domain shift. In this paper, we present a novel unsupervised domain adaptation approach for segmentation tasks by designing semantic-aware generative adversarial networks (GANs). Specifically, we transform the test image into the appearance of source domain, with the semantic structural information being well preserved, which is achieved by imposing a nested adversarial learning in semantic label space. In this way, the segmentation DNN learned from the source domain is able to be directly generalized to the transformed test image, eliminating the need of training a new model for every new target dataset. Our domain adaptation procedure is unsupervised, without using any target domain labels. The adversarial learning of our network is guided by a GAN loss for mapping data distributions, a cycle-consistency loss for retaining pixel-level content, and a semantic-aware loss for enhancing structural information. We validated our method on two different chest X-ray public datasets for left/right lung segmentation. Experimental results show that the segmentation performance of our unsupervised approach is highly competitive with the upper bound of supervised transfer learning. |
| Persistent Identifier | http://hdl.handle.net/10722/349279 |
| ISSN | 0302-9743 (print); 1611-3349 (electronic) |
| 2023 SCImago Journal Rankings | 0.606 |
| ISI Accession Number ID | WOS:000477767800017 |
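For orientation, the abstract describes adversarial training guided by three terms: a GAN loss for mapping data distributions, a cycle-consistency loss for retaining pixel-level content, and a semantic-aware loss for enhancing structural information. A minimal sketch of how such a combined objective is typically written is given below; the weighting coefficients are illustrative assumptions and are not specified in the abstract.

```latex
% Hedged sketch of the combined objective described in the abstract.
% L_GAN : adversarial loss mapping target-domain images to source appearance
% L_cyc : cycle-consistency loss retaining pixel-level content
% L_sem : semantic-aware (nested adversarial) loss in the semantic label space
% lambda_cyc, lambda_sem : trade-off weights (assumed; not stated in the abstract)
\mathcal{L}_{\mathrm{total}}
  = \mathcal{L}_{\mathrm{GAN}}
  + \lambda_{\mathrm{cyc}} \, \mathcal{L}_{\mathrm{cyc}}
  + \lambda_{\mathrm{sem}} \, \mathcal{L}_{\mathrm{sem}}
```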
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Chen, Cheng | - |
| dc.contributor.author | Dou, Qi | - |
| dc.contributor.author | Chen, Hao | - |
| dc.contributor.author | Heng, Pheng Ann | - |
| dc.date.accessioned | 2024-10-17T06:57:29Z | - |
| dc.date.available | 2024-10-17T06:57:29Z | - |
| dc.date.issued | 2018 | - |
| dc.identifier.citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2018, v. 11046 LNCS, p. 143-151 | - |
| dc.identifier.issn | 0302-9743 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/349279 | - |
| dc.description.abstract | In spite of the compelling achievements that deep neural networks (DNNs) have made in medical image computing, these deep models often suffer from degraded performance when being applied to new test datasets with domain shift. In this paper, we present a novel unsupervised domain adaptation approach for segmentation tasks by designing semantic-aware generative adversarial networks (GANs). Specifically, we transform the test image into the appearance of source domain, with the semantic structural information being well preserved, which is achieved by imposing a nested adversarial learning in semantic label space. In this way, the segmentation DNN learned from the source domain is able to be directly generalized to the transformed test image, eliminating the need of training a new model for every new target dataset. Our domain adaptation procedure is unsupervised, without using any target domain labels. The adversarial learning of our network is guided by a GAN loss for mapping data distributions, a cycle-consistency loss for retaining pixel-level content, and a semantic-aware loss for enhancing structural information. We validated our method on two different chest X-ray public datasets for left/right lung segmentation. Experimental results show that the segmentation performance of our unsupervised approach is highly competitive with the upper bound of supervised transfer learning. | - |
| dc.language | eng | - |
| dc.relation.ispartof | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | - |
| dc.title | Semantic-aware generative adversarial nets for unsupervised domain adaptation in chest X-ray segmentation | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1007/978-3-030-00919-9_17 | - |
| dc.identifier.scopus | eid_2-s2.0-85054536895 | - |
| dc.identifier.volume | 11046 LNCS | - |
| dc.identifier.spage | 143 | - |
| dc.identifier.epage | 151 | - |
| dc.identifier.eissn | 1611-3349 | - |
| dc.identifier.isi | WOS:000477767800017 | - |
