Article: Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning

Title: Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning
Authors: Wu, Huisi; Wang, Zhaoze; Zhao, Zebin; Chen, Cheng; Qin, Jing
Keywords: continual learning; contrastive learning; knowledge distillation; nuclei segmentation
Issue Date: 2023
Citation: IEEE Transactions on Medical Imaging, 2023, v. 42, n. 12, p. 3794-3804
Abstract: Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once, with full annotations for all nuclei types available, but lack the ability to continually learn new classes due to catastrophic forgetting. In this paper, we study the practical and important class-incremental continual learning problem, where the model is incrementally updated with new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by performing feature-level knowledge distillation with prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on the inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, which outperforms many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg.
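The two losses named in the abstract lend themselves to a compact sketch. The PyTorch code below is a minimal, hypothetical illustration, not the authors' released implementation: the function names, the masked-average-pooling prototype computation, the MSE relation-matching objective, and the temperature tau are all assumptions made for illustration.

import torch
import torch.nn.functional as F

def class_prototypes(features, labels, num_classes):
    # Masked average pooling (assumed formulation): one prototype per class.
    # features: (B, C, H, W) encoder output resized to the label resolution.
    # labels:   (B, H, W) integer class map.
    B, C, H, W = features.shape
    feats = features.permute(0, 2, 3, 1).reshape(-1, C)  # (B*H*W, C)
    labs = labels.reshape(-1)                            # (B*H*W,)
    protos = []
    for k in range(num_classes):
        mask = labs == k
        protos.append(feats[mask].mean(dim=0) if mask.any() else feats.new_zeros(C))
    return torch.stack(protos)                           # (K, C)

def relation_distillation_loss(protos_old, protos_new):
    # Match the inter-class cosine-similarity (relation) matrix of the
    # current model to that of the frozen old model.
    rel_old = F.cosine_similarity(protos_old.unsqueeze(1), protos_old.unsqueeze(0), dim=-1)
    rel_new = F.cosine_similarity(protos_new.unsqueeze(1), protos_new.unsqueeze(0), dim=-1)
    return F.mse_loss(rel_new, rel_old.detach())

def prototype_contrastive_loss(pixel_feats, pixel_labels, protos, tau=0.1):
    # InfoNCE-style loss (assumed form): pull each pixel embedding toward its
    # own class prototype and push it away from the other prototypes.
    # pixel_feats: (N, C) sampled pixel embeddings; pixel_labels: (N,) long.
    feats = F.normalize(pixel_feats, dim=-1)
    protos = F.normalize(protos, dim=-1)
    logits = feats @ protos.t() / tau                    # (N, K) similarity logits
    return F.cross_entropy(logits, pixel_labels)

The paper pairs the contrastive term with a hard sampling strategy; one plausible (assumed) instantiation is to restrict pixel_feats to misclassified or near-boundary pixels before applying prototype_contrastive_loss.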
Persistent Identifier: http://hdl.handle.net/10722/349956
ISSN: 0278-0062
2023 Impact Factor: 8.9
2023 SCImago Journal Rankings: 3.703
ISI Accession Number ID: WOS:001122030500010

DC Field: Value
dc.contributor.author: Wu, Huisi
dc.contributor.author: Wang, Zhaoze
dc.contributor.author: Zhao, Zebin
dc.contributor.author: Chen, Cheng
dc.contributor.author: Qin, Jing
dc.date.accessioned: 2024-10-17T07:02:07Z
dc.date.available: 2024-10-17T07:02:07Z
dc.date.issued: 2023
dc.identifier.citation: IEEE Transactions on Medical Imaging, 2023, v. 42, n. 12, p. 3794-3804
dc.identifier.issn: 0278-0062
dc.identifier.uri: http://hdl.handle.net/10722/349956
dc.description.abstract: Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once, with full annotations for all nuclei types available, but lack the ability to continually learn new classes due to catastrophic forgetting. In this paper, we study the practical and important class-incremental continual learning problem, where the model is incrementally updated with new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by performing feature-level knowledge distillation with prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on the inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, which outperforms many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg.
dc.language: eng
dc.relation.ispartof: IEEE Transactions on Medical Imaging
dc.subject: continual learning
dc.subject: contrastive learning
dc.subject: knowledge distillation
dc.subject: Nuclei segmentation
dc.title: Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TMI.2023.3307892
dc.identifier.pmid: 37610902
dc.identifier.scopus: eid_2-s2.0-85168740673
dc.identifier.volume: 42
dc.identifier.issue: 12
dc.identifier.spage: 3794
dc.identifier.epage: 3804
dc.identifier.eissn: 1558-254X
dc.identifier.isi: WOS:001122030500010
