Article: LLM-guided Decoupled Probabilistic Prompt for Continual Learning in Medical Image Diagnosis
| Title | LLM-guided Decoupled Probabilistic Prompt for Continual Learning in Medical Image Diagnosis |
|---|---|
| Authors | Luo, Yiwen; Li, Wuyang; Chen, Cheng; Li, Xiang; Liu, Tianming; Niu, Tianye; Yuan, Yixuan |
| Keywords | Continual learning; expert knowledge; prompt tuning |
| Issue Date | 1-May-2025 |
| Publisher | Institute of Electrical and Electronics Engineers |
| Citation | IEEE Transactions on Medical Imaging, 2025, v. 44, n. 8, p. 3439-3450 |
| Abstract | Traditional deep learning-based diagnostic models typically exhibit limitations when applied to dynamic clinical environments that require handling the emergence of new diseases. Continual learning (CL) offers a promising solution, aiming to learn new knowledge while preserving previously learned knowledge. Though recent rehearsal-free CL methods employing prompt tuning (PT) have shown promise, they rely on deterministic prompts that struggle to handle diverse fine-grained knowledge. Moreover, existing PT methods utilize randomly initialized prompts trained under standard classification constraints, impeding the integration of expert knowledge and the acquisition of optimal performance. In this paper, we propose an LLM-guided Decoupled Probabilistic Prompt (LDPP) framework for continual learning in medical image diagnosis. Specifically, we develop an Expert Knowledge Generation (EKG) module that leverages an LLM to acquire decoupled expert knowledge and comprehensive category descriptions. Then, we introduce a Decoupled Probabilistic Prompt pool (DePP), which constructs a shared prompt pool with probabilistic prompts derived from the expert knowledge set. These prompts dynamically provide diverse and flexible descriptions for input images. Finally, we design a Steering Prompt Pool (SPP) to enhance intra-class compactness and promote model performance by learning non-shared prompts. Extensive experiments show that LDPP consistently achieves state-of-the-art performance under the challenging class-incremental setting in CL. |
| Persistent Identifier | http://hdl.handle.net/10722/360807 |
| ISSN | 0278-0062 (2023 Impact Factor: 8.9; 2023 SCImago Journal Rankings: 3.703) |
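
The abstract above describes the method only at a high level. As a rough illustration of the "shared probabilistic prompt pool" idea, the sketch below shows one plausible way such a pool could be built, assuming each LLM-generated expert-knowledge phrase has already been embedded by a text encoder. The class name `ProbabilisticPromptPool`, the projection layer, the top-k selection, and all dimensions are illustrative assumptions and are not taken from the paper's implementation.

```python
# Minimal sketch (not the authors' code): a pool of probabilistic prompts, one
# Gaussian per LLM-generated expert-knowledge phrase. All names and sizes here
# are assumptions made for illustration.
import torch
import torch.nn as nn


class ProbabilisticPromptPool(nn.Module):
    """Shared pool of probabilistic prompts; one Gaussian per knowledge phrase."""

    def __init__(self, knowledge_embeddings: torch.Tensor, prompt_dim: int):
        super().__init__()
        pool_size, embed_dim = knowledge_embeddings.shape
        # Means are initialised from the (hypothetical) text embeddings of the
        # expert-knowledge phrases, projected to the prompt dimension.
        proj = nn.Linear(embed_dim, prompt_dim, bias=False)
        with torch.no_grad():
            init_mu = proj(knowledge_embeddings)
        self.mu = nn.Parameter(init_mu)                        # (pool, prompt_dim)
        self.log_var = nn.Parameter(torch.full_like(init_mu, -4.0))

    def forward(self, image_feat: torch.Tensor, top_k: int = 4) -> torch.Tensor:
        # Pick the top-k prompts whose means best match the image feature, then
        # sample each selected prompt with the reparameterisation trick.
        scores = image_feat @ self.mu.t()                      # (batch, pool)
        idx = scores.topk(top_k, dim=-1).indices               # (batch, k)
        mu = self.mu[idx]                                      # (batch, k, dim)
        std = torch.exp(0.5 * self.log_var[idx])
        return mu + std * torch.randn_like(std)                # sampled prompts


if __name__ == "__main__":
    # Stand-in for text-encoder embeddings of LLM-generated expert knowledge.
    knowledge = torch.randn(32, 512)     # 32 phrases, 512-d embeddings (assumed)
    pool = ProbabilisticPromptPool(knowledge, prompt_dim=768)
    image_feat = torch.randn(8, 768)     # 8 images, 768-d backbone features (assumed)
    prompts = pool(image_feat)
    print(prompts.shape)                 # torch.Size([8, 4, 768])
```

The sampling step is what makes the prompts probabilistic rather than deterministic, which is the property the abstract contrasts with earlier prompt-tuning methods.
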
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Luo, Yiwen | - |
| dc.contributor.author | Li, Wuyang | - |
| dc.contributor.author | Chen, Cheng | - |
| dc.contributor.author | Li, Xiang | - |
| dc.contributor.author | Liu, Tianming | - |
| dc.contributor.author | Niu, Tianye | - |
| dc.contributor.author | Yuan, Yixuan | - |
| dc.date.accessioned | 2025-09-16T00:30:37Z | - |
| dc.date.available | 2025-09-16T00:30:37Z | - |
| dc.date.issued | 2025-05-01 | - |
| dc.identifier.citation | IEEE Transactions on Medical Imaging, 2025, v. 44, n. 8, p. 3439-3450 | - |
| dc.identifier.issn | 0278-0062 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/360807 | - |
| dc.description.abstract | Traditional deep learning-based diagnostic models typically exhibit limitations when applied to dynamic clinical environments that require handling the emergence of new diseases. Continual learning (CL) offers a promising solution, aiming to learn new knowledge while preserving previously learned knowledge. Though recent rehearsal-free CL methods employing prompt tuning (PT) have shown promise, they rely on deterministic prompts that struggle to handle diverse fine-grained knowledge. Moreover, existing PT methods utilize randomly initialized prompts trained under standard classification constraints, impeding the integration of expert knowledge and the acquisition of optimal performance. In this paper, we propose an LLM-guided Decoupled Probabilistic Prompt (LDPP) framework for continual learning in medical image diagnosis. Specifically, we develop an Expert Knowledge Generation (EKG) module that leverages an LLM to acquire decoupled expert knowledge and comprehensive category descriptions. Then, we introduce a Decoupled Probabilistic Prompt pool (DePP), which constructs a shared prompt pool with probabilistic prompts derived from the expert knowledge set. These prompts dynamically provide diverse and flexible descriptions for input images. Finally, we design a Steering Prompt Pool (SPP) to enhance intra-class compactness and promote model performance by learning non-shared prompts. Extensive experiments show that LDPP consistently achieves state-of-the-art performance under the challenging class-incremental setting in CL. | - |
| dc.language | eng | - |
| dc.publisher | Institute of Electrical and Electronics Engineers | - |
| dc.relation.ispartof | IEEE Transactions on Medical Imaging | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.subject | Continual learning | - |
| dc.subject | expert knowledge | - |
| dc.subject | prompt tuning | - |
| dc.title | LLM-guided Decoupled Probabilistic Prompt for Continual Learning in Medical Image Diagnosis | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.1109/TMI.2025.3566105 | - |
| dc.identifier.scopus | eid_2-s2.0-105004215584 | - |
| dc.identifier.volume | 44 | - |
| dc.identifier.issue | 8 | - |
| dc.identifier.spage | 3439 | - |
| dc.identifier.epage | 3450 | - |
| dc.identifier.eissn | 1558-254X | - |
| dc.identifier.issnl | 0278-0062 | - |
