Conference Paper: Bayesian Finite Mixture Models for Probabilistic Context-Free Grammars

Title: Bayesian Finite Mixture Models for Probabilistic Context-Free Grammars
Authors: Yu, PLH; Tang, Y
Keywords: Bayesian Finite Mixture Model; MCMC; Phrase Parsing
Issue Date: 2015
Publisher: Springer. The Proceedings' web site is located at http://link.springer.com/book/10.1007/978-3-319-18111-0
Citation: 16th International Conference on Intelligent Text Processing and Computational Linguistics (CICLing 2015), Cairo, Egypt, 14-20 April 2015. In Alexander Gelbukh (Ed.), Computational Linguistics and Intelligent Text Processing: 16th International Conference, CICLing 2015, Cairo, Egypt, April 14-20, 2015, Proceedings, Part I, p. 201-212. Cham: Springer, 2015
Abstract: Instead of using a common PCFG to parse all texts, we present an efficient generative probabilistic model for probabilistic context-free grammars (PCFGs) based on the Bayesian finite mixture model, where we assume that there are several PCFGs, each sharing the same CFG but with different rule probabilities. Sentences of the same article in the corpus are generated from a common multinomial distribution over these PCFGs. We derive a Markov chain Monte Carlo algorithm for this model. In the experiments, our multi-grammar model outperforms both the single-grammar model and the Inside-Outside algorithm.
Persistent Identifier: http://hdl.handle.net/10722/218376
ISSN: 0302-9743
2020 SCImago Journal Rankings: 0.249
ISI Accession Number ID: WOS:000362441400016
Series/Report no.: Lecture Notes in Computer Science: v. 9041
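The record itself carries no code, so the following is a minimal, hypothetical Python sketch of the generative story described in the abstract: several PCFGs share one CFG but have their own rule probabilities, and each sentence's grammar is drawn from a multinomial over the components. The toy grammar, the constants K and ALPHA, and all function names are illustrative assumptions, not the authors' implementation, and the paper's MCMC inference is not reproduced here.

# Illustrative sketch only (not the paper's code): a finite mixture of PCFGs
# that share one CFG but differ in rule probabilities.
import random

random.seed(0)

# A tiny shared CFG (assumed for illustration): nonterminal -> possible RHSs.
GRAMMAR = {
    "S":  [("NP", "VP")],
    "NP": [("D", "N"), ("N",)],
    "VP": [("V", "NP"), ("V",)],
    "D":  [("the",), ("a",)],
    "N":  [("dog",), ("cat",)],
    "V":  [("sees",), ("sleeps",)],
}
TERMINALS = {"the", "a", "dog", "cat", "sees", "sleeps"}

K = 3        # assumed number of mixture components (PCFGs)
ALPHA = 1.0  # assumed symmetric Dirichlet hyperparameter

def sample_dirichlet(n, alpha):
    """Draw a length-n probability vector from a symmetric Dirichlet."""
    gammas = [random.gammavariate(alpha, 1.0) for _ in range(n)]
    total = sum(gammas)
    return [g / total for g in gammas]

# Each component keeps the same rule set but its own rule probabilities.
components = [
    {lhs: sample_dirichlet(len(rhss), ALPHA) for lhs, rhss in GRAMMAR.items()}
    for _ in range(K)
]

# The multinomial over PCFG components shared by sentences of an article.
weights = sample_dirichlet(K, ALPHA)

def expand(symbol, probs):
    """Recursively expand a symbol under one component's rule probabilities."""
    if symbol in TERMINALS:
        return [symbol]
    rhs = random.choices(GRAMMAR[symbol], weights=probs[symbol])[0]
    return [word for s in rhs for word in expand(s, probs)]

def generate_article(n_sentences):
    """Each sentence draws a component index, then expands from S under it."""
    sentences = []
    for _ in range(n_sentences):
        z = random.choices(range(K), weights=weights)[0]
        sentences.append(" ".join(expand("S", components[z])))
    return sentences

if __name__ == "__main__":
    for sentence in generate_article(3):
        print(sentence)

Drawing rule probabilities from a symmetric Dirichlet mirrors the usual conjugate prior for multinomial rule distributions; the paper itself fits the mixture with MCMC rather than the forward sampling shown here.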

 

DC Field: Value
dc.contributor.author: Yu, PLH
dc.contributor.author: Tang, Y
dc.date.accessioned: 2015-09-18T06:35:31Z
dc.date.available: 2015-09-18T06:35:31Z
dc.date.issued: 2015
dc.identifier.citation: 16th International Conference on Intelligent Text Processing and Computational Linguistics (CICLing 2015), Cairo, Egypt, 14-20 April 2015. In Alexander Gelbukh (Ed.), Computational Linguistics and Intelligent Text Processing: 16th International Conference, CICLing 2015, Cairo, Egypt, April 14-20, 2015, Proceedings, Part I, p. 201-212. Cham: Springer, 2015
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/10722/218376
dc.description.abstract: Instead of using a common PCFG to parse all texts, we present an efficient generative probabilistic model for probabilistic context-free grammars (PCFGs) based on the Bayesian finite mixture model, where we assume that there are several PCFGs, each sharing the same CFG but with different rule probabilities. Sentences of the same article in the corpus are generated from a common multinomial distribution over these PCFGs. We derive a Markov chain Monte Carlo algorithm for this model. In the experiments, our multi-grammar model outperforms both the single-grammar model and the Inside-Outside algorithm.
dc.language: eng
dc.publisher: Springer. The Proceedings' web site is located at http://link.springer.com/book/10.1007/978-3-319-18111-0
dc.relation.ispartof: Computational Linguistics and Intelligent Text Processing: 16th International Conference, CICLing 2015, Cairo, Egypt, April 14-20, 2015, Proceedings, Part I
dc.relation.ispartofseries: Lecture Notes in Computer Science: v. 9041
dc.subject: Bayesian Finite Mixture Model
dc.subject: MCMC
dc.subject: Phrase Parsing
dc.title: Bayesian Finite Mixture Models for Probabilistic Context-Free Grammars
dc.type: Conference_Paper
dc.identifier.email: Yu, PLH: plhyu@hku.hk
dc.identifier.authority: Yu, PLH=rp00835
dc.identifier.doi: 10.1007/978-3-319-18111-0_16
dc.identifier.scopus: eid_2-s2.0-84942589128
dc.identifier.hkuros: 250696
dc.identifier.spage: 201
dc.identifier.epage: 212
dc.identifier.eissn: 1611-3349
dc.identifier.isi: WOS:000362441400016
dc.publisher.place: Cham
dc.identifier.issnl: 0302-9743
