Links for fulltext (may require subscription):
- Publisher Website: 10.1145/1631272.1631283
- Scopus: eid_2-s2.0-72549087420

Citations:
- Scopus: 0

Appears in Collections: Conference Paper

Conference Paper: Using large-scale web data to facilitate textual query based retrieval of consumer photos
Field | Value
---|---
Title | Using large-scale web data to facilitate textual query based retrieval of consumer photos
Authors | Liu, Yiming; Xu, Dong; Tsang, Ivor W.; Luo, Jiebo
Keywords | Cross domain learning; Large-scale web data; Textual query based consumer photo retrieval
Issue Date | 2009
Citation | MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums, 2009, p. 55-64
Abstract | The rapid popularization of digital cameras and mobile phone cameras has led to an explosive growth of consumer photo collections. In this paper, we present a (quasi) real-time textual query based personal photo retrieval system by leveraging millions of web images and their associated rich textual descriptions (captions, categories, etc.). After a user provides a textual query (e.g., "pool"), our system exploits the inverted file method to automatically find the positive web images that are related to the textual query "pool" as well as the negative web images which are irrelevant to the textual query. Based on these automatically retrieved relevant and irrelevant web images, we employ two simple but effective classification methods, k Nearest Neighbor (kNN) and decision stumps, to rank personal consumer photos. To further improve the photo retrieval performance, we propose three new relevance feedback methods via cross-domain learning. These methods effectively utilize both the web images and the consumer images. In particular, our proposed cross-domain learning methods can learn robust classifiers with only a very limited amount of labeled consumer photos from the user by leveraging the pre-learned decision stumps at interactive response time. Extensive experiments on both consumer and professional stock photo datasets demonstrated the effectiveness and efficiency of our system, which is also inherently not limited by any predefined lexicon. Copyright 2009 ACM.
Persistent Identifier | http://hdl.handle.net/10722/321394
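As the abstract notes, the system uses the inverted file method over web-image descriptions to split the corpus into query-relevant (positive) and query-irrelevant (negative) sets. A minimal sketch of that lookup, using hypothetical toy captions; the names `web_images` and `split_by_query` are illustrative and not taken from the paper:

```python
from collections import defaultdict

# Toy corpus of web images with associated textual descriptions (hypothetical data).
web_images = {
    "img1": "children playing in a swimming pool",
    "img2": "mountain lake at sunrise",
    "img3": "pool party with friends",
    "img4": "city skyline at night",
}

# Build an inverted file: term -> set of image ids whose description contains it.
inverted_index = defaultdict(set)
for img_id, caption in web_images.items():
    for term in caption.lower().split():
        inverted_index[term].add(img_id)

def split_by_query(query):
    """Return (positive, negative) web-image id sets for a textual query."""
    positives = inverted_index.get(query.lower(), set())
    negatives = set(web_images) - positives
    return positives, negatives

pos, neg = split_by_query("pool")
```

The inverted file makes the positive/negative split a constant-time dictionary lookup per query term, which is what allows the described system to respond in (quasi) real time over millions of web images.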
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, Yiming | - |
dc.contributor.author | Xu, Dong | - |
dc.contributor.author | Tsang, Ivor W. | - |
dc.contributor.author | Luo, Jiebo | - |
dc.date.accessioned | 2022-11-03T02:18:37Z | - |
dc.date.available | 2022-11-03T02:18:37Z | - |
dc.date.issued | 2009 | - |
dc.identifier.citation | MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums, 2009, p. 55-64 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321394 | - |
dc.description.abstract | The rapid popularization of digital cameras and mobile phone cameras has led to an explosive growth of consumer photo collections. In this paper, we present a (quasi) real-time textual query based personal photo retrieval system by leveraging millions of web images and their associated rich textual descriptions (captions, categories, etc.). After a user provides a textual query (e.g., "pool"), our system exploits the inverted file method to automatically find the positive web images that are related to the textual query "pool" as well as the negative web images which are irrelevant to the textual query. Based on these automatically retrieved relevant and irrelevant web images, we employ two simple but effective classification methods, k Nearest Neighbor (kNN) and decision stumps, to rank personal consumer photos. To further improve the photo retrieval performance, we propose three new relevance feedback methods via cross-domain learning. These methods effectively utilize both the web images and the consumer images. In particular, our proposed cross-domain learning methods can learn robust classifiers with only a very limited amount of labeled consumer photos from the user by leveraging the pre-learned decision stumps at interactive response time. Extensive experiments on both consumer and professional stock photo datasets demonstrated the effectiveness and efficiency of our system, which is also inherently not limited by any predefined lexicon. Copyright 2009 ACM. | -
dc.language | eng | - |
dc.relation.ispartof | MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums | - |
dc.subject | Cross domain learning | - |
dc.subject | Large-scale web data | - |
dc.subject | Textual query based consumer photo retrieval | - |
dc.title | Using large-scale web data to facilitate textual query based retrieval of consumer photos | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1145/1631272.1631283 | - |
dc.identifier.scopus | eid_2-s2.0-72549087420 | - |
dc.identifier.spage | 55 | - |
dc.identifier.epage | 64 | - |
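The abstract also mentions ranking consumer photos with decision stumps trained on the automatically retrieved positive and negative web images. A rough sketch of that idea under simplified assumptions (synthetic low-dimensional features, a single stump, and hypothetical helper names `fit_stump` / `stump_score`; the paper's actual feature extraction and stump combination are not reproduced here):

```python
import numpy as np

def fit_stump(X, y):
    """Fit a one-dimensional decision stump: pick the (feature, threshold, sign)
    that best separates positives (y = +1) from negatives (y = -1)."""
    best = (0, 0.0, 1, -np.inf)  # (dim, threshold, sign, accuracy)
    for d in range(X.shape[1]):
        for thr in np.unique(X[:, d]):
            for sign in (1, -1):
                pred = sign * np.sign(X[:, d] - thr)
                acc = np.mean(pred == y)
                if acc > best[3]:
                    best = (d, thr, sign, acc)
    return best[:3]

def stump_score(stump, X):
    """Signed margin of each row from the stump's threshold, used as a ranking score."""
    d, thr, sign = stump
    return sign * (X[:, d] - thr)

# Hypothetical features: labeled web images (train) and consumer photos (to rank).
rng = np.random.default_rng(0)
X_web = np.vstack([rng.normal(1.0, 0.2, (20, 3)),    # positives near +1
                   rng.normal(-1.0, 0.2, (20, 3))])  # negatives near -1
y_web = np.array([1] * 20 + [-1] * 20)

stump = fit_stump(X_web, y_web)
X_photos = rng.normal(0.0, 1.0, (5, 3))
ranking = np.argsort(-stump_score(stump, X_photos))  # highest score first
```

Because a stump only thresholds a single feature, training and scoring are cheap, which is consistent with the abstract's claim that the pre-learned stumps can be reused at interactive response time during relevance feedback.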