Article: Sequential Gaussian Processes for Online Learning of Nonstationary Functions

Title: Sequential Gaussian Processes for Online Learning of Nonstationary Functions
Authors: Zhang, M M; Dumitrascu, B; Williamson, S A; Engelhardt, B E
Keywords: Gaussian processes; online learning; sequential Monte Carlo
Issue Date: 17-Apr-2023
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Signal Processing, 2023, v. 71, p. 1539-1550
Abstract

Many machine learning problems can be framed in the context of estimating functions, and often these are time-dependent functions that are estimated in real-time as observations arrive. Gaussian processes (GPs) are an attractive choice for modeling real-valued nonlinear functions due to their flexibility and uncertainty quantification. However, the typical GP regression model suffers from several drawbacks: 1) Conventional GP inference scales O(N³) with respect to the number of observations; 2) Updating a GP model sequentially is not trivial; and 3) Covariance kernels typically enforce stationarity constraints on the function, while GPs with non-stationary covariance kernels are often intractable to use in practice. To overcome these issues, we propose a sequential Monte Carlo algorithm to fit infinite mixtures of GPs that capture non-stationary behavior while allowing for online, distributed inference. Our approach empirically improves performance over state-of-the-art methods for online GP estimation in the presence of non-stationarity in time-series data. To demonstrate the utility of our proposed online Gaussian process mixture-of-experts approach in applied settings, we show that we can successfully implement an optimization algorithm using online Gaussian process bandits.
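The O(N³) scaling mentioned in the abstract comes from factorizing the N×N kernel matrix in exact GP regression. The following is a minimal illustrative sketch of that bottleneck, assuming a standard RBF kernel; the function name and parameters are hypothetical, not from the paper.

```python
import numpy as np

def gp_posterior_mean(X, y, X_star, lengthscale=1.0, noise=0.1):
    """Exact GP regression posterior mean with an RBF kernel.

    The Cholesky factorization of the N x N covariance matrix
    costs O(N^3), which is the scaling bottleneck that motivates
    approximate/online schemes such as the one in this article.
    """
    def rbf(A, B):
        # Squared Euclidean distances between rows of A and B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale**2)

    K = rbf(X, X) + noise**2 * np.eye(len(X))   # N x N covariance
    L = np.linalg.cholesky(K)                   # O(N^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf(X_star, X) @ alpha               # posterior mean at X_star
```

Re-running this from scratch every time a new observation arrives repeats the cubic factorization, which is why sequential updating (drawback 2 above) is nontrivial for exact GPs.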


Persistent Identifier: http://hdl.handle.net/10722/332008
ISSN: 1053-587X
2023 Impact Factor: 4.6
2023 SCImago Journal Rankings: 2.520
ISI Accession Number ID: WOS:000982399100008

 

DC Field: Value
dc.contributor.author: Zhang, M M
dc.contributor.author: Dumitrascu, B
dc.contributor.author: Williamson, S A
dc.contributor.author: Engelhardt, B E
dc.date.accessioned: 2023-09-28T05:00:13Z
dc.date.available: 2023-09-28T05:00:13Z
dc.date.issued: 2023-04-17
dc.identifier.citation: IEEE Transactions on Signal Processing, 2023, v. 71, p. 1539-1550
dc.identifier.issn: 1053-587X
dc.identifier.uri: http://hdl.handle.net/10722/332008
dc.description.abstract: <p>Many machine learning problems can be framed in the context of estimating functions, and often these are time-dependent functions that are estimated in real-time as observations arrive. Gaussian processes (GPs) are an attractive choice for modeling real-valued nonlinear functions due to their flexibility and uncertainty quantification. However, the typical GP regression model suffers from several drawbacks: 1) Conventional GP inference scales O(N³) with respect to the number of observations; 2) Updating a GP model sequentially is not trivial; and 3) Covariance kernels typically enforce stationarity constraints on the function, while GPs with non-stationary covariance kernels are often intractable to use in practice. To overcome these issues, we propose a sequential Monte Carlo algorithm to fit infinite mixtures of GPs that capture non-stationary behavior while allowing for online, distributed inference. Our approach empirically improves performance over state-of-the-art methods for online GP estimation in the presence of non-stationarity in time-series data. To demonstrate the utility of our proposed online Gaussian process mixture-of-experts approach in applied settings, we show that we can successfully implement an optimization algorithm using online Gaussian process bandits.<br></p>
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Signal Processing
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Gaussian processes
dc.subject: online learning
dc.subject: sequential Monte Carlo
dc.title: Sequential Gaussian Processes for Online Learning of Nonstationary Functions
dc.type: Article
dc.identifier.doi: 10.1109/TSP.2023.3267992
dc.identifier.scopus: eid_2-s2.0-85153793906
dc.identifier.volume: 71
dc.identifier.spage: 1539
dc.identifier.epage: 1550
dc.identifier.eissn: 1941-0476
dc.identifier.isi: WOS:000982399100008
dc.identifier.issnl: 1053-587X
