Article: ARR-GCN: Anatomy-Relation Reasoning Graph Convolutional Network for Automatic Fine-Grained Segmentation of Organ's Surgical Anatomy

Title: ARR-GCN: Anatomy-Relation Reasoning Graph Convolutional Network for Automatic Fine-Grained Segmentation of Organ's Surgical Anatomy
Authors: Tian, Y; Qin, W; Xue, F; Lambo, R; Yue, M; Diao, S; Yu, L; Xie, Y; Cao, H; Li, S
Keywords: anatomy-relation reasoning graph convolutional network; computer-aided; fine-grained segmentation of an organ's surgical anatomy; prior anatomic relation
Issue Date: 26-Apr-2023
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Journal of Biomedical and Health Informatics, 2023, v. 27, n. 7, p. 3258-3269
Abstract: Anatomical resection (AR) based on anatomical sub-regions is a promising approach to precise surgical resection and has been shown to improve long-term survival by reducing local recurrence. Fine-grained segmentation of an organ's surgical anatomy (FGS-OSA), i.e., segmenting an organ into multiple anatomic sub-regions, is critical for localizing tumors in AR surgical planning. However, obtaining FGS-OSA results automatically with computer-aided methods is challenging: different sub-regions of an organ's surgical anatomy have similar HU distributions, which causes inter-sub-region appearance ambiguities; the boundaries between sub-regions are often invisible; and anatomical landmarks resemble other anatomical information. In this paper, we propose a novel fine-grained segmentation framework termed the "anatomic relation reasoning graph convolutional network" (ARR-GCN), which incorporates prior anatomic relations into framework learning. In ARR-GCN, a graph is constructed over the sub-regions to model the classes and their relations. Further, to obtain discriminative initial node representations in the graph space, a sub-region center module is designed. Most importantly, to explicitly learn the anatomic relations, the prior anatomic relations among the sub-regions are encoded as an adjacency matrix and embedded into the intermediate node representations to guide framework learning. ARR-GCN was validated on two FGS-OSA tasks: i) liver segment segmentation and ii) lung lobe segmentation. On both tasks, ARR-GCN outperformed other state-of-the-art segmentation methods and showed promising ability to suppress ambiguities among sub-regions.
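The abstract describes three ingredients: a graph whose nodes correspond to the organ's sub-regions, a sub-region center module that produces initial node features, and a prior anatomic-relation adjacency matrix that guides message passing. No reference code accompanies this record, so the following is only a minimal PyTorch sketch of the adjacency-guided graph-reasoning step; the class name, tensor shapes, symmetric normalization, and toy adjacency are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption): one graph-reasoning step in which a fixed,
# prior anatomic-relation adjacency matrix drives message passing between
# per-sub-region node features. Names and shapes are illustrative only.
import torch
import torch.nn as nn


class PriorRelationGCNLayer(nn.Module):
    """One GCN layer whose propagation follows a prior adjacency matrix.

    adj_prior: (K, K) binary matrix, adj_prior[i, j] = 1 if sub-regions i and j
    are anatomically adjacent (e.g., two liver segments sharing a boundary).
    """

    def __init__(self, in_dim: int, out_dim: int, adj_prior: torch.Tensor):
        super().__init__()
        # Symmetrically normalize A + I, as in a standard GCN (Kipf & Welling).
        a_hat = adj_prior.float() + torch.eye(adj_prior.size(0))
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        norm_adj = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        self.register_buffer("norm_adj", norm_adj)
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = nn.ReLU(inplace=True)

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: (B, K, in_dim) -- one feature vector per sub-region class,
        # e.g., produced by a sub-region center / class-center pooling module.
        propagated = torch.einsum("kj,bjd->bkd", self.norm_adj, node_feats)
        return self.act(self.linear(propagated))


if __name__ == "__main__":
    # Toy example: 5 nodes (e.g., lung lobes) with a hand-written, not
    # anatomically exact, adjacency of touching sub-regions.
    adj = torch.tensor([
        [0, 1, 0, 0, 0],
        [1, 0, 1, 0, 0],
        [0, 1, 0, 1, 0],
        [0, 0, 1, 0, 1],
        [0, 0, 0, 1, 0],
    ], dtype=torch.float32)
    layer = PriorRelationGCNLayer(in_dim=64, out_dim=64, adj_prior=adj)
    nodes = torch.randn(2, 5, 64)   # batch of 2, 5 sub-region nodes, 64-dim features
    print(layer(nodes).shape)       # torch.Size([2, 5, 64])
```

In the paper's setting, such a reasoning step would sit on top of a segmentation backbone and its relation-guided node features would be fused back into the dense prediction; the snippet only illustrates how a fixed anatomic adjacency can constrain message passing.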
Persistent Identifier: http://hdl.handle.net/10722/331042
ISSN: 2168-2194
2021 Impact Factor: 7.021
2020 SCImago Journal Rankings: 1.293

 

DC Field | Value | Language
dc.contributor.author | Tian, Y | -
dc.contributor.author | Qin, W | -
dc.contributor.author | Xue, F | -
dc.contributor.author | Lambo, R | -
dc.contributor.author | Yue, M | -
dc.contributor.author | Diao, S | -
dc.contributor.author | Yu, L | -
dc.contributor.author | Xie, Y | -
dc.contributor.author | Cao, H | -
dc.contributor.author | Li, S | -
dc.date.accessioned | 2023-09-21T06:52:17Z | -
dc.date.available | 2023-09-21T06:52:17Z | -
dc.date.issued | 2023-04-26 | -
dc.identifier.citation | IEEE Journal of Biomedical and Health Informatics, 2023, v. 27, n. 7, p. 3258-3269 | -
dc.identifier.issn | 2168-2194 | -
dc.identifier.uri | http://hdl.handle.net/10722/331042 | -
dc.description.abstract | Anatomical resection (AR) based on anatomical sub-regions is a promising approach to precise surgical resection and has been shown to improve long-term survival by reducing local recurrence. Fine-grained segmentation of an organ's surgical anatomy (FGS-OSA), i.e., segmenting an organ into multiple anatomic sub-regions, is critical for localizing tumors in AR surgical planning. However, obtaining FGS-OSA results automatically with computer-aided methods is challenging: different sub-regions of an organ's surgical anatomy have similar HU distributions, which causes inter-sub-region appearance ambiguities; the boundaries between sub-regions are often invisible; and anatomical landmarks resemble other anatomical information. In this paper, we propose a novel fine-grained segmentation framework termed the "anatomic relation reasoning graph convolutional network" (ARR-GCN), which incorporates prior anatomic relations into framework learning. In ARR-GCN, a graph is constructed over the sub-regions to model the classes and their relations. Further, to obtain discriminative initial node representations in the graph space, a sub-region center module is designed. Most importantly, to explicitly learn the anatomic relations, the prior anatomic relations among the sub-regions are encoded as an adjacency matrix and embedded into the intermediate node representations to guide framework learning. ARR-GCN was validated on two FGS-OSA tasks: i) liver segment segmentation and ii) lung lobe segmentation. On both tasks, ARR-GCN outperformed other state-of-the-art segmentation methods and showed promising ability to suppress ambiguities among sub-regions. | -
dc.language | eng | -
dc.publisher | Institute of Electrical and Electronics Engineers | -
dc.relation.ispartof | IEEE Journal of Biomedical and Health Informatics | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | -
dc.subject | anatomy-relation reasoning graph convolutional network | -
dc.subject | computer-aided | -
dc.subject | Fine-grained segmentation of an organ's surgical anatomy | -
dc.subject | prior anatomic relation | -
dc.title | ARR-GCN: Anatomy-Relation Reasoning Graph Convolutional Network for Automatic Fine-Grained Segmentation of Organ's Surgical Anatomy | -
dc.type | Article | -
dc.identifier.doi | 10.1109/JBHI.2023.3270664 | -
dc.identifier.scopus | eid_2-s2.0-85159656587 | -
dc.identifier.volume | 27 | -
dc.identifier.issue | 7 | -
dc.identifier.spage | 3258 | -
dc.identifier.epage | 3269 | -
dc.identifier.eissn | 2168-2208 | -
dc.identifier.issnl | 2168-2194 | -
