Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1109/TPAMI.2018.2866563
- Scopus: eid_2-s2.0-85052679625
- PMID: 30136932
Article: Personalized Saliency and Its Prediction
Title | Personalized Saliency and Its Prediction |
---|---|
Authors | Xu, Yanyu; Gao, Shenghua; Wu, Junru; Li, Nianyi; Yu, Jingyi |
Keywords | convolutional neural network; multi-task learning; personalized saliency; universal saliency |
Issue Date | 2019 |
Citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019, v. 41, n. 12, p. 2975-2989 |
Abstract | Nearly all existing visual saliency models to date have focused on predicting a universal saliency map shared across all observers. Yet psychology studies suggest that the visual attention of different observers can vary significantly under specific circumstances, especially when a scene is composed of multiple salient objects. To study such heterogeneous visual attention patterns across observers, we first construct a personalized saliency dataset and explore correlations between visual attention, personal preferences, and image contents. Specifically, we propose to decompose a personalized saliency map (PSM) into a universal saliency map (USM), predictable by existing saliency detection models, and a new discrepancy map across users that characterizes personalized saliency. We then present two solutions for predicting such discrepancy maps: a multi-task convolutional neural network (CNN) framework and an extended CNN with Person-specific Information Encoded Filters (CNN-PIEF). Extensive experimental results demonstrate the effectiveness of our models for PSM prediction as well as their generalization capability for unseen observers. |
Persistent Identifier | http://hdl.handle.net/10722/345103 |
ISSN | 0162-8828 (2023 Impact Factor: 20.8; 2023 SCImago Journal Rankings: 6.158) |
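The abstract's core idea is a decomposition: each personalized saliency map (PSM) is the sum of a universal saliency map (USM) and a per-observer discrepancy map. The following is a minimal NumPy sketch of that decomposition on toy data; note that approximating the USM as the mean map across observers is an assumption made here for illustration, whereas the paper obtains the USM from existing saliency detection models and learns the discrepancy maps with CNNs.

```python
import numpy as np

# Toy illustration of the decomposition described in the abstract:
#   PSM = USM + discrepancy
# where the USM is the observer-independent component and the
# discrepancy map carries each observer's personalized component.

rng = np.random.default_rng(0)
H, W, num_observers = 32, 32, 5

# Simulated saliency maps, one per observer (values in [0, 1]).
psm = rng.random((num_observers, H, W))

# Assumed USM proxy: the mean map across observers.
usm = psm.mean(axis=0)

# Per-observer personalized component.
discrepancy = psm - usm

# The decomposition is exact by construction.
assert np.allclose(usm + discrepancy, psm)

# With a mean-based USM, discrepancies cancel across observers at every pixel.
assert np.allclose(discrepancy.sum(axis=0), 0.0, atol=1e-9)
```

In the paper's setting the discrepancy map is not computed this way but *predicted* from the image and person-specific information, which is what the multi-task CNN and CNN-PIEF models are for.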
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xu, Yanyu | - |
dc.contributor.author | Gao, Shenghua | - |
dc.contributor.author | Wu, Junru | - |
dc.contributor.author | Li, Nianyi | - |
dc.contributor.author | Yu, Jingyi | - |
dc.date.accessioned | 2024-08-15T09:25:16Z | - |
dc.date.available | 2024-08-15T09:25:16Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019, v. 41, n. 12, p. 2975-2989 | - |
dc.identifier.issn | 0162-8828 | - |
dc.identifier.uri | http://hdl.handle.net/10722/345103 | - |
dc.description.abstract | Nearly all existing visual saliency models to date have focused on predicting a universal saliency map shared across all observers. Yet psychology studies suggest that the visual attention of different observers can vary significantly under specific circumstances, especially when a scene is composed of multiple salient objects. To study such heterogeneous visual attention patterns across observers, we first construct a personalized saliency dataset and explore correlations between visual attention, personal preferences, and image contents. Specifically, we propose to decompose a personalized saliency map (PSM) into a universal saliency map (USM), predictable by existing saliency detection models, and a new discrepancy map across users that characterizes personalized saliency. We then present two solutions for predicting such discrepancy maps: a multi-task convolutional neural network (CNN) framework and an extended CNN with Person-specific Information Encoded Filters (CNN-PIEF). Extensive experimental results demonstrate the effectiveness of our models for PSM prediction as well as their generalization capability for unseen observers. | -
dc.language | eng | - |
dc.relation.ispartof | IEEE Transactions on Pattern Analysis and Machine Intelligence | - |
dc.subject | convolutional neural network | - |
dc.subject | multi-task learning | - |
dc.subject | personalized saliency | - |
dc.subject | Universal saliency | - |
dc.title | Personalized Saliency and Its Prediction | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TPAMI.2018.2866563 | - |
dc.identifier.pmid | 30136932 | - |
dc.identifier.scopus | eid_2-s2.0-85052679625 | - |
dc.identifier.volume | 41 | - |
dc.identifier.issue | 12 | - |
dc.identifier.spage | 2975 | - |
dc.identifier.epage | 2989 | - |
dc.identifier.eissn | 1939-3539 | - |