Conference Paper: Localization guided learning for pedestrian attribute recognition

Title: Localization guided learning for pedestrian attribute recognition
Authors: Liu, Pengze; Liu, Xihui; Yan, Junjie; Shao, Jing
Issue Date: 2019
Citation: British Machine Vision Conference 2018, BMVC 2018, 2019
Abstract: Pedestrian attribute recognition has attracted much attention due to its wide applications in scene understanding and person analysis from surveillance videos. Existing methods use additional pose, part, or viewpoint information to complement the global feature representation for attribute classification. However, these methods face difficulties in localizing the areas corresponding to different attributes. To address this problem, we propose a novel Localization Guided Network which assigns attribute-specific weights to local features based on the affinity between pre-extracted proposals and attribute locations. The advantage of our model is that local features are learned automatically for each attribute and emphasized through interaction with global features. We demonstrate the effectiveness of our Localization Guided Network on two pedestrian attribute benchmarks (PA-100K and RAP). Our results surpass the previous state-of-the-art in all five metrics on both datasets.
Persistent Identifier: http://hdl.handle.net/10722/316545
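The abstract's core mechanism — weighting pre-extracted proposal (local) features per attribute by their affinity to that attribute, then fusing with a global feature — can be illustrated with a minimal sketch. The shapes, the dot-product affinity, and the additive fusion below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper).
num_proposals, num_attrs, dim = 6, 4, 8
proposal_feats = rng.standard_normal((num_proposals, dim))  # local features from proposals
attr_queries = rng.standard_normal((num_attrs, dim))        # one learned query per attribute
global_feat = rng.standard_normal(dim)                      # global image feature

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Affinity between each attribute and each proposal, normalized into
# attribute-specific weights over the proposals.
affinity = attr_queries @ proposal_feats.T        # (num_attrs, num_proposals)
weights = softmax(affinity, axis=1)               # each row sums to 1

# Attribute-specific local feature: weighted sum over proposals.
local_per_attr = weights @ proposal_feats         # (num_attrs, dim)

# Fuse each attribute's local feature with the global feature
# (simple additive fusion here, as an assumption) and score it.
fused = local_per_attr + global_feat              # broadcast over attributes
scores = (fused * attr_queries).sum(axis=1)       # one logit per attribute
```

Each attribute thus attends to the proposals most relevant to it, so the local evidence used for, say, a "backpack" attribute differs from that used for "hat".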

 

DC Field: Value
dc.contributor.author: Liu, Pengze
dc.contributor.author: Liu, Xihui
dc.contributor.author: Yan, Junjie
dc.contributor.author: Shao, Jing
dc.date.accessioned: 2022-09-14T11:40:43Z
dc.date.available: 2022-09-14T11:40:43Z
dc.date.issued: 2019
dc.identifier.citation: British Machine Vision Conference 2018, BMVC 2018, 2019
dc.identifier.uri: http://hdl.handle.net/10722/316545
dc.description.abstract: Pedestrian attribute recognition has attracted much attention due to its wide applications in scene understanding and person analysis from surveillance videos. Existing methods use additional pose, part, or viewpoint information to complement the global feature representation for attribute classification. However, these methods face difficulties in localizing the areas corresponding to different attributes. To address this problem, we propose a novel Localization Guided Network which assigns attribute-specific weights to local features based on the affinity between pre-extracted proposals and attribute locations. The advantage of our model is that local features are learned automatically for each attribute and emphasized through interaction with global features. We demonstrate the effectiveness of our Localization Guided Network on two pedestrian attribute benchmarks (PA-100K and RAP). Our results surpass the previous state-of-the-art in all five metrics on both datasets.
dc.language: eng
dc.relation.ispartof: British Machine Vision Conference 2018, BMVC 2018
dc.title: Localization guided learning for pedestrian attribute recognition
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85084017798
