Article: Embarrassingly parallel inference for Gaussian processes
Field | Value |
---|---|
Title | Embarrassingly parallel inference for Gaussian processes |
Authors | Zhang, Michael Minyi; Williamson, Sinead A. |
Keywords | Bayesian non-parametrics; Parallel inference; Gaussian process; Machine learning |
Issue Date | 2019 |
Publisher | Journal of Machine Learning Research. The Journal's web site is located at https://jmlr.org/ |
Citation | Journal of Machine Learning Research, 2019, v. 20, article no. 169 |
Abstract | Training Gaussian process-based models typically involves an O(N³) computational bottleneck due to inverting the covariance matrix. Popular methods for overcoming this matrix inversion problem cannot adequately model all types of latent functions, and are often not parallelizable. However, judicious choice of model structure can ameliorate this problem. A mixture-of-experts model that uses a mixture of K Gaussian processes offers modeling flexibility and opportunities for scalable inference. Our embarrassingly parallel algorithm combines low-dimensional matrix inversions with importance sampling to yield a flexible, scalable mixture-of-experts model that offers comparable performance to Gaussian process regression at a much lower computational cost. |
Persistent Identifier | http://hdl.handle.net/10722/296197 |
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796) |
ISI Accession Number ID | WOS:000506403100009 |
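
The abstract's cost argument can be made concrete: one exact GP requires inverting an N×N covariance matrix at O(N³) cost, whereas K independent experts each invert an (N/K)×(N/K) matrix, for K·O((N/K)³) = O(N³/K²) total work that parallelizes across workers. Below is a minimal, hypothetical NumPy sketch of that divide-and-conquer structure; the random partition and the uniform averaging of expert means are illustrative simplifications, not the paper's importance-sampling-based mixture-of-experts inference, and all function names here are invented for the example.

```python
import numpy as np
from multiprocessing import Pool

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance matrix between two input sets."""
    sq_dists = (
        np.sum(X1**2, axis=1)[:, None]
        + np.sum(X2**2, axis=1)[None, :]
        - 2.0 * X1 @ X2.T
    )
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def local_gp_mean(args):
    """Exact GP posterior mean on one expert's data shard.

    With N/K points per expert, the solve here costs O((N/K)^3), so K
    experts running in parallel replace the single O(N^3) inversion.
    """
    X, y, X_star, noise_var = args
    K_nn = rbf_kernel(X, X) + noise_var * np.eye(len(X))
    K_sn = rbf_kernel(X_star, X)
    return K_sn @ np.linalg.solve(K_nn, y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, K_experts = 3000, 10
    X = rng.uniform(-3.0, 3.0, size=(N, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)
    X_star = np.linspace(-3.0, 3.0, 100)[:, None]

    # Random partition into K shards; each expert sees only N/K points.
    shards = np.array_split(rng.permutation(N), K_experts)
    jobs = [(X[idx], y[idx], X_star, 0.01) for idx in shards]
    with Pool(processes=K_experts) as pool:
        expert_means = pool.map(local_gp_mean, jobs)

    # Uniform average of expert means -- a crude stand-in for the paper's
    # importance-sampling-based combination of the K Gaussian processes.
    f_mean = np.mean(expert_means, axis=0)
```

With K = 10 experts on N = 3000 points, each worker factorizes a 300×300 matrix rather than one 3000×3000 matrix, which is the source of the "embarrassingly parallel" speedup the abstract describes.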
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhang, Michael Minyi | - |
dc.contributor.author | Williamson, Sinead A. | - |
dc.date.accessioned | 2021-02-11T04:53:02Z | - |
dc.date.available | 2021-02-11T04:53:02Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2019, v. 20, article no. 169 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/296197 | - |
dc.description.abstract | Training Gaussian process-based models typically involves an O(N³) computational bottleneck due to inverting the covariance matrix. Popular methods for overcoming this matrix inversion problem cannot adequately model all types of latent functions, and are often not parallelizable. However, judicious choice of model structure can ameliorate this problem. A mixture-of-experts model that uses a mixture of K Gaussian processes offers modeling flexibility and opportunities for scalable inference. Our embarrassingly parallel algorithm combines low-dimensional matrix inversions with importance sampling to yield a flexible, scalable mixture-of-experts model that offers comparable performance to Gaussian process regression at a much lower computational cost. | -
dc.language | eng | - |
dc.publisher | Journal of Machine Learning Research. The Journal's web site is located at https://jmlr.org/ | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | Bayesian non-parametrics | - |
dc.subject | Parallel inference | - |
dc.subject | Gaussian process | - |
dc.subject | Machine learning | - |
dc.title | Embarrassingly parallel inference for Gaussian processes | - |
dc.type | Article | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.scopus | eid_2-s2.0-85077643497 | - |
dc.identifier.volume | 20 | - |
dc.identifier.spage | article no. 169 | - |
dc.identifier.epage | article no. 169 | - |
dc.identifier.eissn | 1533-7928 | - |
dc.identifier.isi | WOS:000506403100009 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 1532-4435 | - |