Conference Paper: Distributed, partially collapsed MCMC for Bayesian Nonparametrics

Title: Distributed, partially collapsed MCMC for Bayesian Nonparametrics
Authors: Dubey, KA; Zhang, MM; Xing, EP; Williamson, SA
Issue Date: 2020
Publisher: ML Research Press. The Proceedings' web site is located at http://proceedings.mlr.press/
Citation: The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020, Virtual Conference, Palermo, Italy, 26-28 August 2020. In Proceedings of Machine Learning Research (PMLR), v. 108, p. 3685-3695
Abstract: Bayesian nonparametric (BNP) models provide elegant methods for discovering latent features underlying a data set, but inference in such models can be slow. We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, are decomposable into independent sub-measures. We use this decomposition to partition the latent measure into a finite measure containing only instantiated components and an infinite measure containing all other components. We then select different inference algorithms for the two parts: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can easily be distributed to allow scalable inference without sacrificing asymptotic convergence guarantees.
Persistent Identifier: http://hdl.handle.net/10722/306011
ISSN: 2640-3498
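
To make the hybrid scheme described in the abstract concrete, here is a minimal illustrative sketch in Python. It is not the authors' implementation: it assumes a toy 1-D Dirichlet process Gaussian mixture with unit observation variance and a conjugate normal prior on component means, and every name and simplification (single machine, no optimizations) is a choice made for this sketch. Instantiated components carry explicit weights and parameters and are updated with an uncollapsed Gibbs step; the unoccupied tail is handled by a collapsed step that integrates the new component's parameter out.

```python
import numpy as np

# Illustrative sketch only: one partially collapsed Gibbs sweep for a toy 1-D
# Dirichlet process Gaussian mixture (unit observation variance, conjugate
# N(mu0, s0^2) prior on component means). Names and simplifications are
# assumptions for this sketch, not the paper's code.

rng = np.random.default_rng(0)
alpha, mu0, s0 = 1.0, 0.0, 3.0  # DP concentration; prior mean/std of means

def lik(xi, mu):
    # Uncollapsed instantiated component: explicit mean parameter.
    return np.exp(-0.5 * (xi - mu) ** 2) / np.sqrt(2 * np.pi)

def marginal_lik(xi):
    # Collapsed tail: the new component's mean is integrated out of the prior.
    v = s0 ** 2 + 1.0
    return np.exp(-0.5 * (xi - mu0) ** 2 / v) / np.sqrt(2 * np.pi * v)

def posterior_mean_draw(xs):
    # Conjugate normal draw of a component mean given its assigned data.
    v = 1.0 / (1.0 / s0 ** 2 + len(xs))
    return rng.normal(v * (mu0 / s0 ** 2 + np.sum(xs)), np.sqrt(v))

def sweep(x, z, mus):
    # Prune empty components and relabel, so every instantiated one has data.
    keep = np.unique(z)
    relabel = {int(old): new for new, old in enumerate(keep)}
    z = np.array([relabel[int(zi)] for zi in z])
    mus = [mus[int(k)] for k in keep]
    # Finite part, uncollapsed: explicit weights for instantiated components
    # plus the leftover tail mass, drawn from the Dirichlet posterior.
    counts = np.bincount(z, minlength=len(mus))
    w = rng.dirichlet(np.append(counts, alpha)).tolist()
    for i in range(len(x)):
        probs = [w[k] * lik(x[i], mus[k]) for k in range(len(mus))]
        probs.append(w[-1] * marginal_lik(x[i]))  # collapsed tail term
        probs = np.array(probs) / np.sum(probs)
        z[i] = rng.choice(len(probs), p=probs)
        if z[i] == len(mus):  # tail was hit: instantiate a new component
            mus.append(posterior_mean_draw(x[i:i + 1]))
            b = rng.beta(1.0, alpha)  # stick-break the remaining tail mass
            w[-1:] = [b * w[-1], (1.0 - b) * w[-1]]
    for k in range(len(mus)):  # uncollapsed refresh of all parameters
        mus[k] = posterior_mean_draw(x[z == k])
    return z, mus

x = np.concatenate([rng.normal(-4, 1, 50), rng.normal(4, 1, 50)])
z, mus = np.zeros(len(x), dtype=int), [0.0]
for _ in range(200):
    z, mus = sweep(x, z, mus)
print("occupied components:", len(np.unique(z)))
```

Note the property this sketch illustrates: given the explicit weights and parameters of the instantiated part, the assignment updates in the inner loop are conditionally independent across observations, so they could be sharded across workers with only the collapsed tail step requiring coordination, which is in the spirit of the distributed variant the abstract describes.
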

 

Dublin Core metadata
dc.contributor.author: Dubey, KA
dc.contributor.author: Zhang, MM
dc.contributor.author: Xing, EP
dc.contributor.author: Williamson, SA
dc.date.accessioned: 2021-10-20T10:17:33Z
dc.date.available: 2021-10-20T10:17:33Z
dc.date.issued: 2020
dc.identifier.citation: The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020, Virtual Conference, Palermo, Italy, 26-28 August 2020. In Proceedings of Machine Learning Research (PMLR), v. 108, p. 3685-3695
dc.identifier.issn: 2640-3498
dc.identifier.uri: http://hdl.handle.net/10722/306011
dc.description.abstract: Bayesian nonparametric (BNP) models provide elegant methods for discovering latent features underlying a data set, but inference in such models can be slow. We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, are decomposable into independent sub-measures. We use this decomposition to partition the latent measure into a finite measure containing only instantiated components and an infinite measure containing all other components. We then select different inference algorithms for the two parts: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can easily be distributed to allow scalable inference without sacrificing asymptotic convergence guarantees.
dc.language: eng
dc.publisher: ML Research Press. The Proceedings' web site is located at http://proceedings.mlr.press/
dc.relation.ispartof: Proceedings of Machine Learning Research (PMLR)
dc.relation.ispartof: The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020
dc.title: Distributed, partially collapsed MCMC for Bayesian Nonparametrics
dc.type: Conference_Paper
dc.identifier.email: Zhang, MM: mzhang18@hku.hk
dc.identifier.authority: Zhang, MM=rp02776
dc.identifier.hkuros: 327706
dc.identifier.volume: 108: Proceedings of AISTATS 2020
dc.identifier.spage: 3685
dc.identifier.epage: 3695
dc.publisher.place: United States
