File Download
There are no files associated with this item.
Links for fulltext (May Require Subscription)
- Publisher Website: 10.1109/TMI.2023.3307892
- Scopus: eid_2-s2.0-85168740673
- PMID: 37610902
- WOS: WOS:001122030500010

Article: Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning
| Title | Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning |
|---|---|
| Authors | Wu, Huisi; Wang, Zhaoze; Zhao, Zebin; Chen, Cheng; Qin, Jing |
| Keywords | continual learning; contrastive learning; knowledge distillation; nuclei segmentation |
| Issue Date | 2023 |
| Citation | IEEE Transactions on Medical Imaging, 2023, v. 42, n. 12, p. 3794-3804 |
| Abstract | Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once, with full annotations for all nuclei types available, and lack the ability to continually learn new classes due to the problem of catastrophic forgetting. In this paper, we study the practical and important class-incremental continual learning problem, where the model is incrementally updated to new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by performing feature-level knowledge distillation with prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, i.e., MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, with superior performance over many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg. |
| Persistent Identifier | http://hdl.handle.net/10722/349956 |
| ISSN | 0278-0062 (2023 Impact Factor: 8.9; 2023 SCImago Journal Rankings: 3.703) |
| ISI Accession Number ID | WOS:001122030500010 |
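
The abstract above names two components: prototype-wise relation distillation and prototype-wise contrastive learning. Below is a minimal PyTorch sketch of both, reconstructed only from the abstract's description; all function names, tensor shapes, and loss forms are assumptions, and the paper's hard sampling strategy is omitted. The authors' actual implementation is at https://github.com/zzw-szu/CoNuSeg.

```python
# Hedged sketch: prototype-wise relation distillation and a prototype-wise
# contrastive loss, as described in the abstract. Names and shapes are
# illustrative assumptions, not the authors' code.
import torch
import torch.nn.functional as F

def class_prototypes(features: torch.Tensor, labels: torch.Tensor,
                     num_classes: int) -> torch.Tensor:
    """Masked average pooling: one prototype vector per class.

    features: (B, C, H, W) encoder feature maps
    labels:   (B, H, W) integer class map, assumed resized to H x W
    returns:  (num_classes, C)
    """
    protos = []
    for k in range(num_classes):
        mask = (labels == k).unsqueeze(1).float()       # (B, 1, H, W)
        denom = mask.sum(dim=(0, 2, 3)).clamp(min=1.0)  # guard against empty classes
        protos.append((features * mask).sum(dim=(0, 2, 3)) / denom)
    return torch.stack(protos)

def relation_distillation_loss(protos_old: torch.Tensor,
                               protos_new: torch.Tensor) -> torch.Tensor:
    """Constrain inter-class relation similarity: the current encoder's
    prototype-to-prototype cosine similarities should match those produced
    by the frozen previous-step model."""
    sim_old = F.cosine_similarity(protos_old.unsqueeze(1), protos_old.unsqueeze(0), dim=-1)
    sim_new = F.cosine_similarity(protos_new.unsqueeze(1), protos_new.unsqueeze(0), dim=-1)
    return F.mse_loss(sim_new, sim_old.detach())

def prototype_contrastive_loss(features: torch.Tensor, labels: torch.Tensor,
                               protos: torch.Tensor,
                               temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss pulling each pixel feature toward its own class
    prototype and away from the others (hard sampling omitted here)."""
    feats = features.permute(0, 2, 3, 1).reshape(-1, features.size(1))   # (N, C)
    logits = F.normalize(feats, dim=1) @ F.normalize(protos, dim=1).t()  # (N, K)
    return F.cross_entropy(logits / temperature, labels.reshape(-1))
```

In a class-incremental step, one would presumably compute prototypes for the old classes from both the frozen previous-step encoder and the current encoder on the same batch, then add these two losses to the segmentation objective.
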
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Wu, Huisi | - |
| dc.contributor.author | Wang, Zhaoze | - |
| dc.contributor.author | Zhao, Zebin | - |
| dc.contributor.author | Chen, Cheng | - |
| dc.contributor.author | Qin, Jing | - |
| dc.date.accessioned | 2024-10-17T07:02:07Z | - |
| dc.date.available | 2024-10-17T07:02:07Z | - |
| dc.date.issued | 2023 | - |
| dc.identifier.citation | IEEE Transactions on Medical Imaging, 2023, v. 42, n. 12, p. 3794-3804 | - |
| dc.identifier.issn | 0278-0062 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/349956 | - |
| dc.description.abstract | Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once, with full annotations for all nuclei types available, and lack the ability to continually learn new classes due to the problem of catastrophic forgetting. In this paper, we study the practical and important class-incremental continual learning problem, where the model is incrementally updated to new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by performing feature-level knowledge distillation with prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, i.e., MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, with superior performance over many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg. | - |
| dc.language | eng | - |
| dc.relation.ispartof | IEEE Transactions on Medical Imaging | - |
| dc.subject | continual learning | - |
| dc.subject | contrastive learning | - |
| dc.subject | knowledge distillation | - |
| dc.subject | Nuclei segmentation | - |
| dc.title | Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1109/TMI.2023.3307892 | - |
| dc.identifier.pmid | 37610902 | - |
| dc.identifier.scopus | eid_2-s2.0-85168740673 | - |
| dc.identifier.volume | 42 | - |
| dc.identifier.issue | 12 | - |
| dc.identifier.spage | 3794 | - |
| dc.identifier.epage | 3804 | - |
| dc.identifier.eissn | 1558-254X | - |
| dc.identifier.isi | WOS:001122030500010 | - |
