
Conference Paper: Efficient learning for undirected topic models

Title: Efficient learning for undirected topic models
Authors: Gu, J; Li, VOK
Issue Date: 2015
Publisher: Association for Computational Linguistics (ACL)
Citation: The 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing (ACL-IJCNLP 2015), Beijing, China, 26-31 July 2015. In Conference Proceedings, 2015, v. 2, p. 162-167
Abstract: Replicated Softmax model, a well-known undirected topic model, is powerful in extracting semantic representations of documents. Traditional learning strategies such as Contrastive Divergence are very inefficient. This paper provides a novel estimator to speed up the learning based on Noise Contrastive Estimate, extended for documents of variant lengths and weighted inputs. Experiments on two benchmarks show that the new estimator achieves great learning efficiency and high accuracy on document retrieval and classification. © 2015 Association for Computational Linguistics.
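The core idea the abstract builds on, noise-contrastive estimation, replaces the intractable partition function with a learned parameter by training a logistic classifier to separate data samples from noise samples. The following is a minimal illustrative sketch of generic NCE on a toy 1-D Gaussian, not the paper's Replicated Softmax estimator; the noise distribution, learning rate, and parameterisation are all assumptions chosen for the example.

```python
import numpy as np

# Toy sketch of Noise Contrastive Estimation (NCE): learn an unnormalised
# density by classifying data vs. noise samples, treating the log-normaliser
# as a free parameter. This fits a 1-D Gaussian, NOT the paper's model.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=1000)      # observed samples
noise_scale = 3.0
noise = rng.normal(loc=0.0, scale=noise_scale, size=data.size)

def log_model(x, theta):
    # Unnormalised Gaussian log-density with a *learned* log-normaliser c.
    mu, log_sigma, c = theta
    return -0.5 * ((x - mu) / np.exp(log_sigma)) ** 2 + c

def log_noise(x):
    # Exact log-density of the known noise distribution N(0, noise_scale^2).
    return -0.5 * (x / noise_scale) ** 2 - np.log(noise_scale * np.sqrt(2 * np.pi))

def nce_loss(theta):
    # Logistic classification loss: data labelled 1, noise labelled 0.
    g_data = log_model(data, theta) - log_noise(data)
    g_noise = log_model(noise, theta) - log_noise(noise)
    return np.logaddexp(0.0, -g_data).mean() + np.logaddexp(0.0, g_noise).mean()

theta = np.array([0.0, 0.0, -1.0])   # initial mu, log sigma, log-normaliser
eps, lr = 1e-5, 0.05
for _ in range(4000):                # plain finite-difference gradient descent
    base = nce_loss(theta)
    grad = np.array([(nce_loss(theta + eps * np.eye(3)[i]) - base) / eps
                     for i in range(3)])
    theta -= lr * grad

print(f"estimated mean ~ {theta[0]:.2f}")   # should approach the true mean 2.0
```

Because the normaliser is just another parameter of the classifier, each update touches only the sampled points; no sum over the full state space is needed, which is the source of the speed-up over Contrastive Divergence-style learning.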
Persistent Identifier: http://hdl.handle.net/10722/232307
ISBN: 978-194164373-0

 

DC Field: Value
dc.contributor.author: Gu, J
dc.contributor.author: Li, VOK
dc.date.accessioned: 2016-09-20T05:29:06Z
dc.date.available: 2016-09-20T05:29:06Z
dc.date.issued: 2015
dc.identifier.citation: The 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing (ACL-IJCNLP 2015), Beijing, China, 26-31 July 2015. In Conference Proceedings, 2015, v. 2, p. 162-167
dc.identifier.isbn: 978-194164373-0
dc.identifier.uri: http://hdl.handle.net/10722/232307
dc.description.abstract: Replicated Softmax model, a well-known undirected topic model, is powerful in extracting semantic representations of documents. Traditional learning strategies such as Contrastive Divergence are very inefficient. This paper provides a novel estimator to speed up the learning based on Noise Contrastive Estimate, extended for documents of variant lengths and weighted inputs. Experiments on two benchmarks show that the new estimator achieves great learning efficiency and high accuracy on document retrieval and classification. © 2015 Association for Computational Linguistics.
dc.language: eng
dc.publisher: Association for Computational Linguistics (ACL)
dc.relation.ispartof: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Short Papers)
dc.rights: Creative Commons: Attribution 3.0 Hong Kong License
dc.title: Efficient learning for undirected topic models
dc.type: Conference_Paper
dc.identifier.email: Li, VOK: vli@eee.hku.hk
dc.identifier.authority: Li, VOK=rp00150
dc.description.nature: postprint
dc.identifier.scopus: eid_2-s2.0-84944064715
dc.identifier.hkuros: 265297
dc.identifier.volume: 2
dc.identifier.spage: 162
dc.identifier.epage: 167
dc.publisher.place: United States
dc.customcontrol.immutable: sml 161117
