
Article: Prior Guided Feature Enrichment Network for Few-Shot Segmentation

Title: Prior Guided Feature Enrichment Network for Few-Shot Segmentation
Authors: Tian, Zhuotao; Zhao, Hengshuang; Shu, Michelle; Yang, Zhicheng; Li, Ruiyu; Jia, Jiaya
Keywords: few-shot learning; few-shot segmentation; scene understanding; semantic segmentation
Issue Date: 2022
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, v. 44, n. 2, p. 1050-1065
Abstract: State-of-the-art semantic segmentation methods require sufficient labeled data to achieve good results and hardly work on unseen classes without fine-tuning. Few-shot segmentation is thus proposed to tackle this problem by learning a model that quickly adapts to new classes with a few labeled support samples. These frameworks still face the challenge of reduced generalization ability on unseen classes due to inappropriate use of high-level semantic information of training classes and spatial inconsistency between query and support targets. To alleviate these issues, we propose the Prior Guided Feature Enrichment Network (PFENet). It consists of novel designs of (1) a training-free prior mask generation method that not only retains generalization power but also improves model performance, and (2) a Feature Enrichment Module (FEM) that overcomes spatial inconsistency by adaptively enriching query features with support features and prior masks. Extensive experiments on PASCAL-5$^i$ and COCO prove that the proposed prior generation method and FEM both improve the baseline method significantly. Our PFENet also outperforms state-of-the-art methods by a large margin without efficiency loss. Surprisingly, our model even generalizes to cases without labeled support samples.
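The training-free prior mask described in the abstract can be sketched as pixel-wise cosine similarity between high-level query and support features, taking the best support match per query pixel and min-max normalizing the result. The function name, tensor shapes, and normalization details below are illustrative assumptions, not the authors' exact implementation:

```python
import torch

def prior_mask(query_feat, supp_feat, supp_mask, eps=1e-7):
    """Hypothetical sketch of a training-free prior mask.

    query_feat, supp_feat: (b, c, h, w) high-level backbone features.
    supp_mask: (b, 1, h, w) binary support annotation in {0, 1}.
    Returns a (b, 1, h, w) prior map in [0, 1].
    """
    b, c, h, w = query_feat.shape
    # Restrict support features to the annotated foreground region.
    supp_feat = supp_feat * supp_mask
    q = query_feat.view(b, c, -1)                  # (b, c, hw)
    s = supp_feat.view(b, c, -1)                   # (b, c, hw)
    # L2-normalize along the channel dimension for cosine similarity.
    q = q / (q.norm(dim=1, keepdim=True) + eps)
    s = s / (s.norm(dim=1, keepdim=True) + eps)
    sim = torch.bmm(q.transpose(1, 2), s)          # (b, hw_query, hw_support)
    # Best-matching support pixel for each query pixel.
    prior = sim.max(dim=2)[0]                      # (b, hw_query)
    # Min-max normalize each prior map to [0, 1].
    mn = prior.min(dim=1, keepdim=True)[0]
    mx = prior.max(dim=1, keepdim=True)[0]
    prior = (prior - mn) / (mx - mn + eps)
    return prior.view(b, 1, h, w)
```

Because this step involves no learnable parameters, it does not overfit to the training classes, which is the property the abstract credits for retaining generalization power on unseen classes.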
Persistent Identifier: http://hdl.handle.net/10722/333525
ISSN: 0162-8828
2021 Impact Factor: 24.314
2020 SCImago Journal Rankings: 3.811

 

DC Field: Value
dc.contributor.author: Tian, Zhuotao
dc.contributor.author: Zhao, Hengshuang
dc.contributor.author: Shu, Michelle
dc.contributor.author: Yang, Zhicheng
dc.contributor.author: Li, Ruiyu
dc.contributor.author: Jia, Jiaya
dc.date.accessioned: 2023-10-06T05:20:10Z
dc.date.available: 2023-10-06T05:20:10Z
dc.date.issued: 2022
dc.identifier.citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, v. 44, n. 2, p. 1050-1065
dc.identifier.issn: 0162-8828
dc.identifier.uri: http://hdl.handle.net/10722/333525
dc.description.abstract: State-of-the-art semantic segmentation methods require sufficient labeled data to achieve good results and hardly work on unseen classes without fine-tuning. Few-shot segmentation is thus proposed to tackle this problem by learning a model that quickly adapts to new classes with a few labeled support samples. These frameworks still face the challenge of reduced generalization ability on unseen classes due to inappropriate use of high-level semantic information of training classes and spatial inconsistency between query and support targets. To alleviate these issues, we propose the Prior Guided Feature Enrichment Network (PFENet). It consists of novel designs of (1) a training-free prior mask generation method that not only retains generalization power but also improves model performance, and (2) a Feature Enrichment Module (FEM) that overcomes spatial inconsistency by adaptively enriching query features with support features and prior masks. Extensive experiments on PASCAL-5$^i$ and COCO prove that the proposed prior generation method and FEM both improve the baseline method significantly. Our PFENet also outperforms state-of-the-art methods by a large margin without efficiency loss. Surprisingly, our model even generalizes to cases without labeled support samples.
dc.language: eng
dc.relation.ispartof: IEEE Transactions on Pattern Analysis and Machine Intelligence
dc.subject: few-shot learning
dc.subject: few-shot segmentation
dc.subject: scene understanding
dc.subject: semantic segmentation
dc.title: Prior Guided Feature Enrichment Network for Few-Shot Segmentation
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TPAMI.2020.3013717
dc.identifier.pmid: 32750843
dc.identifier.scopus: eid_2-s2.0-85122805550
dc.identifier.volume: 44
dc.identifier.issue: 2
dc.identifier.spage: 1050
dc.identifier.epage: 1065
dc.identifier.eissn: 1939-3539
