Article: FFT-based exponentially weighted recursive least squares computations

Title: FFT-based exponentially weighted recursive least squares computations
Authors: Ng, Michael K.
Issue Date: 1997
Citation: Linear Algebra and Its Applications, 1997, v. 263, n. 1-3, p. 167-191
Abstract: We consider exponentially weighted recursive least squares (RLS) computations with forgetting factor γ (0 < γ < 1). The least squares estimator can be found by solving a matrix system A(t)x(t) = b(t) at each adaptive time step t. Unlike the sliding window RLS computation, the matrix A(t) is not a "near-Toeplitz" matrix (a sum of products of Toeplitz matrices). However, we show that its scaled matrix is a "near-Toeplitz" matrix, and hence the matrix-vector multiplication can be performed efficiently by using fast Fourier transforms (FFTs). We apply the FFT-based preconditioned conjugate gradient method to solve such systems. When the input stochastic process is stationary, we prove that both E[‖A(t) − E(A(t))‖₂] and Var[‖A(t) − E(A(t))‖₂] tend to zero, provided that the number of data samples taken is sufficiently large. Here E(·) and Var(·) are the expectation and variance operators, respectively. Hence the expected values of the eigenvalues of the preconditioned matrices are close to 1, except for a finite number of outlying eigenvalues. This result is stronger than the one proved by Ng, Chan, and Plemmons, namely that the spectra of the preconditioned matrices are clustered around 1 with probability 1. © 1997 Elsevier Science Inc.
Persistent Identifier: http://hdl.handle.net/10722/276733
ISSN: 0024-3795
2023 Impact Factor: 1.0
2023 SCImago Journal Rankings: 0.837
ISI Accession Number ID: WOS:A1997XG71100009
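The abstract's two computational building blocks, FFT-based Toeplitz matrix-vector products via circulant embedding and a circulant-preconditioned conjugate gradient solver, can be sketched briefly. The Python sketch below is an illustration only, not the paper's algorithm: the AR(1)-type Toeplitz test matrix (entries 0.5^|k|), the choice of Strang's circulant preconditioner, and all function names are assumptions introduced here.

import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply the Toeplitz matrix T (first column c, first row r) by x
    in O(n log n) time by embedding T in a 2n-by-2n circulant matrix."""
    n = len(x)
    # First column of the circulant embedding: [c, 0, t_{-(n-1)}, ..., t_{-1}].
    col = np.concatenate([c, [0.0], r[:0:-1]])
    eig = np.fft.fft(col)                        # eigenvalues of the circulant
    y = np.fft.ifft(eig * np.fft.fft(x, 2 * n))  # circulant times zero-padded x
    return y[:n].real

def circulant_solve(c, b):
    """Apply the inverse of the circulant matrix with first column c to b;
    circulants are diagonalized by the FFT, so this costs one FFT pair."""
    return np.fft.ifft(np.fft.fft(b) / np.fft.fft(c)).real

def pcg(matvec, b, precond, tol=1e-10, maxit=200):
    """Textbook preconditioned conjugate gradient for an SPD system."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Illustrative data (assumed, not from the paper): a symmetric positive
# definite Toeplitz matrix with entries t_k = 0.5**|k|, the autocovariance
# of an AR(1) input process, preconditioned by Strang's circulant built
# from the central diagonals of T.
n = 64
k = np.arange(n)
c = 0.5 ** k                  # first column of T (= first row; symmetric)
b = np.ones(n)

s = c.copy()                  # Strang's circulant: wrap the central diagonals
s[n // 2 + 1:] = c[n - k[n // 2 + 1:]]

x = pcg(lambda v: toeplitz_matvec(c, c, v), b,
        lambda v: circulant_solve(s, v))

The circulant embedding doubles the problem size so that a single length-2n FFT performs the Toeplitz product in O(n log n) time, which is what keeps each preconditioned conjugate gradient iteration cheap.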

 

DC Field / Value
dc.contributor.author: Ng, Michael K.
dc.date.accessioned: 2019-09-18T08:34:29Z
dc.date.available: 2019-09-18T08:34:29Z
dc.date.issued: 1997
dc.identifier.citation: Linear Algebra and Its Applications, 1997, v. 263, n. 1-3, p. 167-191
dc.identifier.issn: 0024-3795
dc.identifier.uri: http://hdl.handle.net/10722/276733
dc.description.abstract: We consider exponentially weighted recursive least squares (RLS) computations with forgetting factor γ (0 < γ < 1). The least squares estimator can be found by solving a matrix system A(t)x(t) = b(t) at each adaptive time step t. Unlike the sliding window RLS computation, the matrix A(t) is not a "near-Toeplitz" matrix (a sum of products of Toeplitz matrices). However, we show that its scaled matrix is a "near-Toeplitz" matrix, and hence the matrix-vector multiplication can be performed efficiently by using fast Fourier transforms (FFTs). We apply the FFT-based preconditioned conjugate gradient method to solve such systems. When the input stochastic process is stationary, we prove that both E[‖A(t) − E(A(t))‖₂] and Var[‖A(t) − E(A(t))‖₂] tend to zero, provided that the number of data samples taken is sufficiently large. Here E(·) and Var(·) are the expectation and variance operators, respectively. Hence the expected values of the eigenvalues of the preconditioned matrices are close to 1, except for a finite number of outlying eigenvalues. This result is stronger than the one proved by Ng, Chan, and Plemmons, namely that the spectra of the preconditioned matrices are clustered around 1 with probability 1. © 1997 Elsevier Science Inc.
dc.language: eng
dc.relation.ispartof: Linear Algebra and Its Applications
dc.title: FFT-based exponentially weighted recursive least squares computations
dc.type: Article
dc.description.nature: link_to_OA_fulltext
dc.identifier.doi: 10.1016/S0024-3795(96)00532-0
dc.identifier.scopus: eid_2-s2.0-0038911578
dc.identifier.volume: 263
dc.identifier.issue: 1-3
dc.identifier.spage: 167
dc.identifier.epage: 191
dc.identifier.isi: WOS:A1997XG71100009
dc.identifier.issnl: 0024-3795
