Article: LLM-guided Decoupled Probabilistic Prompt for Continual Learning in Medical Image Diagnosis

Title: LLM-guided Decoupled Probabilistic Prompt for Continual Learning in Medical Image Diagnosis
Authors: Luo, Yiwen; Li, Wuyang; Chen, Cheng; Li, Xiang; Liu, Tianming; Niu, Tianye; Yuan, Yixuan
Keywords: Continual learning; expert knowledge; prompt tuning
Issue Date: 1-May-2025
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Medical Imaging, 2025, v. 44, n. 8, p. 3439-3450
Abstract: Traditional deep learning-based diagnostic models typically exhibit limitations when applied to dynamic clinical environments that must handle newly emerging diseases. Continual learning (CL) offers a promising solution, aiming to learn new knowledge while preserving previously learned knowledge. Though recent rehearsal-free CL methods employing prompt tuning (PT) have shown promise, they rely on deterministic prompts that struggle to capture diverse fine-grained knowledge. Moreover, existing PT methods use randomly initialized prompts trained under standard classification constraints, which hinders the integration of expert knowledge and limits attainable performance. In this paper, we propose an LLM-guided Decoupled Probabilistic Prompt (LDPP) method for continual learning in medical image diagnosis. Specifically, we develop an Expert Knowledge Generation (EKG) module that leverages an LLM to acquire decoupled expert knowledge and comprehensive category descriptions. Then, we introduce a Decoupled Probabilistic Prompt pool (DePP), which constructs a shared prompt pool of probabilistic prompts derived from the expert knowledge set; these prompts dynamically provide diverse and flexible descriptions for input images. Finally, we design a Steering Prompt Pool (SPP) that learns non-shared prompts to enhance intra-class compactness and promote model performance. With extensive experimental validation, LDPP consistently achieves state-of-the-art performance under the challenging class-incremental CL setting.
Persistent Identifier: http://hdl.handle.net/10722/360807
ISSN: 0278-0062
2023 Impact Factor: 8.9
2023 SCImago Journal Rankings: 3.703
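
For readers of the abstract above, the following is a minimal conceptual sketch, in PyTorch, of what a "probabilistic prompt pool" could look like: each prompt is modeled as a Gaussian over the embedding space rather than a fixed vector, initialized from text embeddings of LLM-generated attribute descriptions. The class name ProbabilisticPromptPool, the tensor shapes, and the initialization scheme are illustrative assumptions based only on the abstract, not the authors' implementation.

# Minimal conceptual sketch (assumptions noted above), not the paper's code.
import torch
import torch.nn as nn


class ProbabilisticPromptPool(nn.Module):
    """Shared pool of probabilistic prompts, one Gaussian per expert-knowledge entry."""

    def __init__(self, knowledge_embeddings: torch.Tensor):
        # knowledge_embeddings: (num_entries, embed_dim) text embeddings of
        # LLM-generated, decoupled attribute descriptions (assumed precomputed).
        super().__init__()
        self.mu = nn.Parameter(knowledge_embeddings.clone())                # prompt means
        self.log_var = nn.Parameter(torch.zeros_like(knowledge_embeddings)) # prompt log-variances

    def forward(self, n_samples: int = 1) -> torch.Tensor:
        # Reparameterization trick: sample diverse prompt realizations so a single
        # pool entry can describe fine-grained variations of one attribute.
        std = torch.exp(0.5 * self.log_var)            # (num_entries, embed_dim)
        eps = torch.randn(n_samples, *self.mu.shape)   # (n_samples, num_entries, embed_dim)
        return self.mu.unsqueeze(0) + eps * std.unsqueeze(0)


# Usage with hypothetical sizes: 32 knowledge entries embedded in a 512-d space.
pool = ProbabilisticPromptPool(torch.randn(32, 512))
prompts = pool(n_samples=4)  # (4, 32, 512) sampled prompts to condition the classifier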

 

DC Field | Value | Language
dc.contributor.author | Luo, Yiwen | -
dc.contributor.author | Li, Wuyang | -
dc.contributor.author | Chen, Cheng | -
dc.contributor.author | Li, Xiang | -
dc.contributor.author | Liu, Tianming | -
dc.contributor.author | Niu, Tianye | -
dc.contributor.author | Yuan, Yixuan | -
dc.date.accessioned | 2025-09-16T00:30:37Z | -
dc.date.available | 2025-09-16T00:30:37Z | -
dc.date.issued | 2025-05-01 | -
dc.identifier.citation | IEEE Transactions on Medical Imaging, 2025, v. 44, n. 8, p. 3439-3450 | -
dc.identifier.issn | 0278-0062 | -
dc.identifier.uri | http://hdl.handle.net/10722/360807 | -
dc.description.abstract | Traditional deep learning-based diagnostic models typically exhibit limitations when applied to dynamic clinical environments that must handle newly emerging diseases. Continual learning (CL) offers a promising solution, aiming to learn new knowledge while preserving previously learned knowledge. Though recent rehearsal-free CL methods employing prompt tuning (PT) have shown promise, they rely on deterministic prompts that struggle to capture diverse fine-grained knowledge. Moreover, existing PT methods use randomly initialized prompts trained under standard classification constraints, which hinders the integration of expert knowledge and limits attainable performance. In this paper, we propose an LLM-guided Decoupled Probabilistic Prompt (LDPP) method for continual learning in medical image diagnosis. Specifically, we develop an Expert Knowledge Generation (EKG) module that leverages an LLM to acquire decoupled expert knowledge and comprehensive category descriptions. Then, we introduce a Decoupled Probabilistic Prompt pool (DePP), which constructs a shared prompt pool of probabilistic prompts derived from the expert knowledge set; these prompts dynamically provide diverse and flexible descriptions for input images. Finally, we design a Steering Prompt Pool (SPP) that learns non-shared prompts to enhance intra-class compactness and promote model performance. With extensive experimental validation, LDPP consistently achieves state-of-the-art performance under the challenging class-incremental CL setting. | -
dc.language | eng | -
dc.publisher | Institute of Electrical and Electronics Engineers | -
dc.relation.ispartof | IEEE Transactions on Medical Imaging | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | -
dc.subject | Continual learning | -
dc.subject | expert knowledge | -
dc.subject | prompt tuning | -
dc.title | LLM-guided Decoupled Probabilistic Prompt for Continual Learning in Medical Image Diagnosis | -
dc.type | Article | -
dc.identifier.doi | 10.1109/TMI.2025.3566105 | -
dc.identifier.scopus | eid_2-s2.0-105004215584 | -
dc.identifier.volume | 44 | -
dc.identifier.issue | 8 | -
dc.identifier.spage | 3439 | -
dc.identifier.epage | 3450 | -
dc.identifier.eissn | 1558-254X | -
dc.identifier.issnl | 0278-0062 | -
