Article: Embarrassingly parallel inference for Gaussian processes

Title: Embarrassingly parallel inference for Gaussian processes
Authors: Zhang, Michael Minyi; Williamson, Sinead A.
Keywords: Bayesian non-parametrics; Parallel inference; Gaussian process; Machine learning
Issue Date: 2019
Publisher: Journal of Machine Learning Research (https://jmlr.org/)
Citation: Journal of Machine Learning Research, 2019, v. 20, article no. 169
Abstract: Training Gaussian process-based models typically involves an O(N³) computational bottleneck due to inverting the covariance matrix. Popular methods for overcoming this matrix inversion problem cannot adequately model all types of latent functions, and are often not parallelizable. However, judicious choice of model structure can ameliorate this problem. A mixture-of-experts model that uses a mixture of K Gaussian processes offers modeling flexibility and opportunities for scalable inference. Our embarrassingly parallel algorithm combines low-dimensional matrix inversions with importance sampling to yield a flexible, scalable mixture-of-experts model that offers comparable performance to Gaussian process regression at a much lower computational cost.
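The sketch below is a minimal NumPy illustration of the scaling idea stated in the abstract, not the authors' implementation: N training points are sharded across K independent GP experts, so each expert solves only an (N/K)×(N/K) linear system instead of an N×N one, and the experts fit in parallel. The random sharding, the RBF hyperparameters, and helper names such as fit_expert are assumptions made for illustration; the paper's importance-sampling step over mixture assignments is omitted.

# Illustrative sketch only (not the paper's released code): K independent
# GP experts, each inverting a small local covariance matrix in parallel.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def rbf_kernel(X, Y, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel; hyperparameter values are placeholders.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def fit_expert(args):
    # Exact GP regression on one shard: O((N/K)^3) instead of O(N^3).
    X, y, noise = args
    Kxx = rbf_kernel(X, X) + noise * np.eye(len(X))
    alpha = np.linalg.solve(Kxx, y)   # the only "matrix inversion" needed
    return X, alpha

def fit_mixture(X, y, K_experts=4, noise=1e-2):
    # Random sharding purely for illustration; the paper assigns points
    # via a mixture model rather than uniformly at random.
    shards = np.array_split(np.random.permutation(len(X)), K_experts)
    jobs = [(X[idx], y[idx], noise) for idx in shards]
    with ProcessPoolExecutor() as pool:   # experts fit independently
        return list(pool.map(fit_expert, jobs))

if __name__ == "__main__":
    X = np.random.randn(2000, 3)
    y = np.sin(X[:, 0]) + 0.1 * np.random.randn(2000)
    experts = fit_mixture(X, y, K_experts=8)

Because each expert touches only its own shard, the per-expert cost drops by roughly a factor of K³ and the fits require no communication, which is what makes the inference embarrassingly parallel.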
Persistent Identifier: http://hdl.handle.net/10722/296197
ISSN: 1532-4435
2023 Impact Factor: 4.3
2023 SCImago Journal Rankings: 2.796
ISI Accession Number: WOS:000506403100009


DC Field: Value
dc.contributor.author: Zhang, Michael Minyi
dc.contributor.author: Williamson, Sinead A.
dc.date.accessioned: 2021-02-11T04:53:02Z
dc.date.available: 2021-02-11T04:53:02Z
dc.date.issued: 2019
dc.identifier.citation: Journal of Machine Learning Research, 2019, v. 20, article no. 169
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/10722/296197
dc.description.abstract: Training Gaussian process-based models typically involves an O(N³) computational bottleneck due to inverting the covariance matrix. Popular methods for overcoming this matrix inversion problem cannot adequately model all types of latent functions, and are often not parallelizable. However, judicious choice of model structure can ameliorate this problem. A mixture-of-experts model that uses a mixture of K Gaussian processes offers modeling flexibility and opportunities for scalable inference. Our embarrassingly parallel algorithm combines low-dimensional matrix inversions with importance sampling to yield a flexible, scalable mixture-of-experts model that offers comparable performance to Gaussian process regression at a much lower computational cost.
dc.language: eng
dc.publisher: Journal of Machine Learning Research. The Journal's web site is located at https://jmlr.org/
dc.relation.ispartof: Journal of Machine Learning Research
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Bayesian non-parametrics
dc.subject: Parallel inference
dc.subject: Gaussian process
dc.subject: Machine learning
dc.title: Embarrassingly parallel inference for Gaussian processes
dc.type: Article
dc.description.nature: published_or_final_version
dc.identifier.scopus: eid_2-s2.0-85077643497
dc.identifier.volume: 20
dc.identifier.spage: article no. 169
dc.identifier.epage: article no. 169
dc.identifier.eissn: 1533-7928
dc.identifier.isi: WOS:000506403100009
dc.publisher.place: United States
dc.identifier.issnl: 1532-4435
