Links for fulltext (May Require Subscription)
- Publisher Website: https://doi.org/10.1145/1871437.1871475
- Scopus: eid_2-s2.0-78651312555
Citations:
- Scopus: 0

Appears in Collections:
- Conference Paper: Decomposing background topics from keywords by Principal Component Pursuit
Field | Value
---|---
Title | Decomposing background topics from keywords by Principal Component Pursuit
Authors | Min, Kerui; Zhang, Zhengdong; Wright, John; Ma, Yi
Keywords | Latent Dirichlet Allocation; Latent Semantic Indexing; Perplexity; Principal Component Pursuit; Sparse keywords
Issue Date | 2010
Citation | International Conference on Information and Knowledge Management, Proceedings, 2010, p. 269-277
Abstract | Low-dimensional topic models have proven very useful for modeling a large corpus of documents that share a relatively small number of topics. Dimensionality reduction tools such as Principal Component Analysis (PCA) or Latent Semantic Indexing (LSI) have been widely adopted for document modeling, analysis, and retrieval. In this paper, we contend that a more pertinent model for a document corpus is the combination of an (approximately) low-dimensional topic model for the corpus and a sparse model for the keywords of individual documents. For such a joint topic-document model, LSI or PCA is no longer appropriate to analyze the corpus data. We hence introduce a powerful new tool called Principal Component Pursuit that can effectively decompose the low-dimensional and the sparse components of such corpus data. We give empirical results on data synthesized with a Latent Dirichlet Allocation (LDA) model to validate the new model. We then show that for real document data analysis, the new tool significantly reduces the perplexity and improves retrieval performance compared to classical baselines. © 2010 ACM.
Persistent Identifier | http://hdl.handle.net/10722/326849
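The abstract's core tool, Principal Component Pursuit, recovers a low-rank component L (the shared topics) and a sparse component S (document-specific keywords) from their sum M by solving min ‖L‖* + λ‖S‖₁ subject to L + S = M. As an illustration only, and not the paper's own implementation, a minimal NumPy sketch of the standard inexact augmented Lagrange multiplier solver for this program might look like:

```python
import numpy as np

def shrink(X, tau):
    """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def pcp(M, lam=None, tol=1e-7, max_iter=500):
    """Principal Component Pursuit via an inexact augmented Lagrange
    multiplier scheme: split M into a low-rank part L and a sparse part S
    by solving  min ||L||_* + lam * ||S||_1  s.t.  L + S = M."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))  # standard weight from the PCP analysis
    norm_M = np.linalg.norm(M, 'fro')
    mu = 1.25 / np.linalg.norm(M, 2)   # common initialization heuristic
    rho, mu_bar = 1.5, mu * 1e7        # geometric growth of the penalty
    Y = np.zeros_like(M)               # Lagrange multiplier
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # Low-rank update: singular value thresholding
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: elementwise shrinkage
        S = shrink(M - L + Y / mu, lam / mu)
        # Dual ascent on the constraint L + S = M
        Z = M - L - S
        Y += mu * Z
        mu = min(mu * rho, mu_bar)
        if np.linalg.norm(Z, 'fro') <= tol * norm_M:
            break
    return L, S
```

In the paper's setting, M would be a term-document matrix; the recovered L captures the background topics and S the sparse, document-specific keywords. The λ and μ choices above are conventional heuristics from the robust PCA literature, not values taken from this paper.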
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Min, Kerui | - |
dc.contributor.author | Zhang, Zhengdong | - |
dc.contributor.author | Wright, John | - |
dc.contributor.author | Ma, Yi | - |
dc.date.accessioned | 2023-03-31T05:26:58Z | - |
dc.date.available | 2023-03-31T05:26:58Z | - |
dc.date.issued | 2010 | - |
dc.identifier.citation | International Conference on Information and Knowledge Management, Proceedings, 2010, p. 269-277 | - |
dc.identifier.uri | http://hdl.handle.net/10722/326849 | - |
dc.description.abstract | Low-dimensional topic models have proven very useful for modeling a large corpus of documents that share a relatively small number of topics. Dimensionality reduction tools such as Principal Component Analysis (PCA) or Latent Semantic Indexing (LSI) have been widely adopted for document modeling, analysis, and retrieval. In this paper, we contend that a more pertinent model for a document corpus is the combination of an (approximately) low-dimensional topic model for the corpus and a sparse model for the keywords of individual documents. For such a joint topic-document model, LSI or PCA is no longer appropriate to analyze the corpus data. We hence introduce a powerful new tool called Principal Component Pursuit that can effectively decompose the low-dimensional and the sparse components of such corpus data. We give empirical results on data synthesized with a Latent Dirichlet Allocation (LDA) model to validate the new model. We then show that for real document data analysis, the new tool significantly reduces the perplexity and improves retrieval performance compared to classical baselines. © 2010 ACM. | -
dc.language | eng | - |
dc.relation.ispartof | International Conference on Information and Knowledge Management, Proceedings | - |
dc.subject | Latent Dirichlet Allocation | - |
dc.subject | Latent Semantic Indexing | - |
dc.subject | Perplexity | - |
dc.subject | Principal Component Pursuit | - |
dc.subject | Sparse keywords | - |
dc.title | Decomposing background topics from keywords by Principal Component Pursuit | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1145/1871437.1871475 | - |
dc.identifier.scopus | eid_2-s2.0-78651312555 | - |
dc.identifier.spage | 269 | - |
dc.identifier.epage | 277 | - |