
Article: An ℓ∞ eigenvector perturbation bound and its application to robust covariance estimation

Title: An ℓ∞ eigenvector perturbation bound and its application to robust covariance estimation
Authors: Fan, Jianqing; Wang, Weichen; Zhong, Yiqiao
Keywords: Approximate factor model; Sparsity; Matrix perturbation theory; Incoherence; Low-rank matrices
Issue Date: 2018
Citation: Journal of Machine Learning Research, 2018, v. 18, article no. 207
Abstract: In statistics and machine learning, we are interested in the eigenvectors (or singular vectors) of certain matrices (e.g., covariance matrices or data matrices). However, those matrices are usually perturbed by noise or statistical errors, either from random sampling or structural patterns. The Davis-Kahan sin θ theorem is often used to bound the difference between the eigenvectors of a matrix A and those of a perturbed matrix Ã = A + E, in terms of the ℓ2 norm. In this paper, we prove that when A is a low-rank and incoherent matrix, the ℓ∞ norm perturbation bound of singular vectors (or eigenvectors in the symmetric case) is smaller by a factor of √d1 or √d2 for left and right vectors, where d1 and d2 are the matrix dimensions. The power of this new perturbation result is shown in robust covariance estimation, particularly when random variables have heavy tails. There, we propose new robust covariance estimators and establish their asymptotic properties using the newly developed perturbation bound. Our theoretical results are verified through extensive numerical experiments.
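A rough numerical illustration of the abstract's claim (this is a sketch, not code from the paper): for a low-rank matrix with incoherent (delocalized) eigenvectors, the entrywise ℓ∞ error of a perturbed leading eigenvector is roughly a factor of √d smaller than its ℓ2 error. The matrix sizes, eigenvalues, and noise scale below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 400  # matrix dimension (illustrative choice)

# Build a rank-2 symmetric A with incoherent eigenvectors
# (entries of U are O(1/sqrt(d)) after orthonormalization).
U, _ = np.linalg.qr(rng.standard_normal((d, 2)))
A = 5.0 * np.outer(U[:, 0], U[:, 0]) + 3.0 * np.outer(U[:, 1], U[:, 1])

# Symmetric random perturbation E, small relative to the eigengap
E = 0.01 * rng.standard_normal((d, d))
E = (E + E.T) / 2.0

# Leading eigenvectors of A and of the perturbed matrix A + E
v = np.linalg.eigh(A)[1][:, -1]
v_tilde = np.linalg.eigh(A + E)[1][:, -1]
v_tilde *= np.sign(v_tilde @ v)  # fix the sign ambiguity

err_l2 = np.linalg.norm(v_tilde - v)       # Davis-Kahan-type ℓ2 error
err_linf = np.max(np.abs(v_tilde - v))     # entrywise ℓ∞ error

print(f"l2 error:   {err_l2:.4f}")
print(f"linf error: {err_linf:.4f}")
print(f"ratio:      {err_l2 / err_linf:.1f}  (compare to sqrt(d) = {np.sqrt(d):.1f})")
```

In this incoherent setting the perturbation error spreads roughly evenly over all d coordinates, so the ℓ∞ error is much smaller than the ℓ2 error, consistent (up to logarithmic factors) with the √d improvement the paper proves.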
Persistent Identifier: http://hdl.handle.net/10722/303565
ISSN: 1532-4435
2022 Impact Factor: 6.0
2020 SCImago Journal Rankings: 1.240
ISI Accession Number ID: WOS:000435627900001


DC Field: Value
dc.contributor.author: Fan, Jianqing
dc.contributor.author: Wang, Weichen
dc.contributor.author: Zhong, Yiqiao
dc.date.accessioned: 2021-09-15T08:25:34Z
dc.date.available: 2021-09-15T08:25:34Z
dc.date.issued: 2018
dc.identifier.citation: Journal of Machine Learning Research, 2018, v. 18, article no. 207
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/10722/303565
dc.description.abstract: In statistics and machine learning, we are interested in the eigenvectors (or singular vectors) of certain matrices (e.g., covariance matrices or data matrices). However, those matrices are usually perturbed by noise or statistical errors, either from random sampling or structural patterns. The Davis-Kahan sin θ theorem is often used to bound the difference between the eigenvectors of a matrix A and those of a perturbed matrix Ã = A + E, in terms of the ℓ2 norm. In this paper, we prove that when A is a low-rank and incoherent matrix, the ℓ∞ norm perturbation bound of singular vectors (or eigenvectors in the symmetric case) is smaller by a factor of √d1 or √d2 for left and right vectors, where d1 and d2 are the matrix dimensions. The power of this new perturbation result is shown in robust covariance estimation, particularly when random variables have heavy tails. There, we propose new robust covariance estimators and establish their asymptotic properties using the newly developed perturbation bound. Our theoretical results are verified through extensive numerical experiments.
dc.language: eng
dc.relation.ispartof: Journal of Machine Learning Research
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Approximate factor model
dc.subject: Sparsity
dc.subject: Matrix perturbation theory
dc.subject: Incoherence
dc.subject: Low-rank matrices
dc.title: An ℓ∞ eigenvector perturbation bound and its application to robust covariance estimation
dc.type: Article
dc.description.nature: published_or_final_version
dc.identifier.scopus: eid_2-s2.0-85048935983
dc.identifier.volume: 18
dc.identifier.spage: article no. 207
dc.identifier.epage: article no. 207
dc.identifier.eissn: 1533-7928
dc.identifier.isi: WOS:000435627900001
