Conference Paper: CATVI: Conditional and Adaptively Truncated Variational Inference for Hierarchical Bayesian Nonparametric Models

Title: CATVI: Conditional and Adaptively Truncated Variational Inference for Hierarchical Bayesian Nonparametric Models
Authors: Liu, Yirui; Qiao, Xinghao; Lam, Jessica
Issue Date: 2022
Citation: Proceedings of Machine Learning Research, 2022, v. 151, p. 3647-3662
Abstract: Current variational inference methods for hierarchical Bayesian nonparametric models can neither characterize the correlation structure among latent variables due to the mean-field setting, nor infer the true posterior dimension because of the universal truncation. To overcome these limitations, we propose the conditional and adaptively truncated variational inference method (CATVI) by maximizing the nonparametric evidence lower bound and integrating Monte Carlo into the variational inference framework. CATVI enjoys several advantages over traditional methods, including a smaller divergence between variational and true posteriors, reduced risk of underfitting or overfitting, and improved prediction accuracy. Empirical studies on three large datasets reveal that CATVI applied to Bayesian nonparametric topic models substantially outperforms competing models, providing lower perplexity and clearer topic-word clustering.
Persistent Identifier: http://hdl.handle.net/10722/336368
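
The abstract names two ingredients of CATVI: replacing closed-form mean-field expectations with Monte Carlo samples drawn from the variational distribution, and adapting the truncation level of the nonparametric posterior rather than fixing it in advance. The toy sketch below only illustrates those two ideas on a one-dimensional Gaussian mixture; the Gaussian likelihood, the known variance, the reserved "new component" slot, the 5-standard-deviation spawn rule, and every other modelling choice are illustrative assumptions and not the authors' algorithm.

```python
# Toy sketch (assumptions throughout): Monte Carlo sampling of assignments from the
# variational distribution, plus adaptive truncation of the number of components.
# This is NOT the CATVI algorithm from the paper, only an illustration of the idea.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: three well-separated unit-variance Gaussian clusters.
x = np.concatenate([rng.normal(-6, 1, 80), rng.normal(0, 1, 80), rng.normal(6, 1, 80)])
n = x.size

sigma2 = 1.0          # known component variance (assumption)
log_alpha = 0.0       # log pseudo-mass reserved for an as-yet-unused component (assumption)
mu = np.array([0.0])  # start with one component; the truncation level adapts below

for _ in range(50):
    # Unnormalized log q(z_i = k): likelihood under each existing component, plus one
    # reserved slot that only matters for points far (> 5 sd) from every component.
    log_w = -0.5 * (x[:, None] - mu[None, :]) ** 2 / sigma2
    new_slot = np.full((n, 1), log_alpha - 0.5 * 25.0)
    log_w = np.concatenate([log_w, new_slot], axis=1)
    w = np.exp(log_w - log_w.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)

    # Monte Carlo step: draw hard assignments from q instead of evaluating
    # closed-form mean-field expectations.
    z = np.array([rng.choice(w.shape[1], p=w[i]) for i in range(n)])
    counts = np.bincount(z, minlength=w.shape[1])

    # Adaptive truncation: drop components that received no samples, and promote the
    # reserved slot to a real component (seeded at one of its points) if it was used.
    new_mu = [x[z == k].mean() for k in range(len(mu)) if counts[k] > 0]
    if counts[-1] > 0:
        new_mu.append(float(rng.choice(x[z == len(mu)])))
    mu = np.array(new_mu)

print("components kept:", len(mu))
print("component means:", np.round(np.sort(mu), 2))
```

The grow/prune rule mirrors the intuition of adaptive truncation: a component exists only while sampled assignments actually occupy it, so the effective dimension is inferred from the data rather than fixed by a universal truncation. A full treatment would instead maximize the nonparametric evidence lower bound described in the paper.
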

 

DC Field / Value
dc.contributor.author: Liu, Yirui
dc.contributor.author: Qiao, Xinghao
dc.contributor.author: Lam, Jessica
dc.date.accessioned: 2024-01-15T08:26:13Z
dc.date.available: 2024-01-15T08:26:13Z
dc.date.issued: 2022
dc.identifier.citation: Proceedings of Machine Learning Research, 2022, v. 151, p. 3647-3662
dc.identifier.uri: http://hdl.handle.net/10722/336368
dc.description.abstract: Current variational inference methods for hierarchical Bayesian nonparametric models can neither characterize the correlation structure among latent variables due to the mean-field setting, nor infer the true posterior dimension because of the universal truncation. To overcome these limitations, we propose the conditional and adaptively truncated variational inference method (CATVI) by maximizing the nonparametric evidence lower bound and integrating Monte Carlo into the variational inference framework. CATVI enjoys several advantages over traditional methods, including a smaller divergence between variational and true posteriors, reduced risk of underfitting or overfitting, and improved prediction accuracy. Empirical studies on three large datasets reveal that CATVI applied to Bayesian nonparametric topic models substantially outperforms competing models, providing lower perplexity and clearer topic-word clustering.
dc.language: eng
dc.relation.ispartof: Proceedings of Machine Learning Research
dc.title: CATVI: Conditional and Adaptively Truncated Variational Inference for Hierarchical Bayesian Nonparametric Models
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85149632699
dc.identifier.volume: 151
dc.identifier.spage: 3647
dc.identifier.epage: 3662
dc.identifier.eissn: 2640-3498
