Conference Paper: Recovery guarantee of weighted low-rank approximation via alternating minimization

Title: Recovery guarantee of weighted low-rank approximation via alternating minimization
Authors: Li, Yuanzhi; Liang, Yingyu; Risteski, Andrej
Issue Date: 2016
Citation: 33rd International Conference on Machine Learning, ICML 2016, 2016, v. 5, p. 3504-3526
Abstract: Many applications require recovering a ground truth low-rank matrix from noisy observations of the entries, which in practice is typically formulated as a weighted low-rank approximation problem and solved by non-convex optimization heuristics such as alternating minimization. In this paper, we provide a provable recovery guarantee for weighted low-rank approximation via a simple alternating minimization algorithm. In particular, for a natural class of matrices and weights, and without any assumption on the noise, we bound the spectral norm of the difference between the recovered matrix and the ground truth by the spectral norm of the weighted noise plus an additive error term that decreases exponentially with the number of rounds of alternating minimization, from either initialization by SVD or, more importantly, random initialization. These provide the first theoretical results for weighted low-rank approximation via alternating minimization with non-binary deterministic weights, significantly generalizing those for matrix completion, the special case with binary weights, since our assumptions are similar to or weaker than those made in existing works. Furthermore, this is achieved by a very simple algorithm that improves on vanilla alternating minimization with a simple clipping step.
Persistent Identifier: http://hdl.handle.net/10722/341196
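
The abstract describes an algorithmic recipe: alternately solve weighted least-squares problems for the two low-rank factors, starting from either SVD or random initialization, with a clipping step after each update. The sketch below is an illustration of that general recipe only, not the paper's exact procedure or analysis; the function name weighted_als, the clip parameter, and the small ridge term are assumptions made for this example.

```python
# Illustrative sketch (assumed names, not the paper's exact algorithm):
# weighted low-rank approximation via alternating minimization.
import numpy as np

def weighted_als(M, W, rank, n_rounds=50, clip=None, rng=None):
    """Alternating minimization for min_{U,V} sum_ij W_ij * (M_ij - (U V^T)_ij)^2.

    M    : (m, n) noisy observation matrix
    W    : (m, n) nonnegative deterministic weight matrix
    rank : target rank r
    clip : optional bound; factor entries are clipped to [-clip, clip] each round
    """
    rng = np.random.default_rng(rng)
    m, n = M.shape
    # Random initialization (the paper also analyzes SVD initialization).
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))

    def solve_rows(A, B, Wt):
        # For each row i, solve min_x sum_j Wt[i, j] * (A[i, j] - x @ B[j])^2.
        X = np.empty((A.shape[0], B.shape[1]))
        for i in range(A.shape[0]):
            Bw = B * Wt[i][:, None]          # weight each row of B
            G = B.T @ Bw                     # r x r weighted Gram matrix
            # Small ridge term (assumption) keeps the solve well-posed.
            X[i] = np.linalg.solve(G + 1e-10 * np.eye(B.shape[1]), Bw.T @ A[i])
        return X

    for _ in range(n_rounds):
        U = solve_rows(M, V, W)              # update left factor with V fixed
        if clip is not None:
            U = np.clip(U, -clip, clip)      # simple clipping step (illustrative)
        V = solve_rows(M.T, U, W.T)          # update right factor with U fixed
        if clip is not None:
            V = np.clip(V, -clip, clip)
    return U @ V.T                           # recovered low-rank matrix
```

With binary weights W (1 for observed entries, 0 for missing ones), this reduces to the standard alternating-minimization heuristic for matrix completion, the special case the abstract says the paper generalizes.
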

 

DC Field | Value | Language
dc.contributor.author | Li, Yuanzhi | -
dc.contributor.author | Liang, Yingyu | -
dc.contributor.author | Risteski, Andrej | -
dc.date.accessioned | 2024-03-13T08:40:56Z | -
dc.date.available | 2024-03-13T08:40:56Z | -
dc.date.issued | 2016 | -
dc.identifier.citation | 33rd International Conference on Machine Learning, ICML 2016, 2016, v. 5, p. 3504-3526 | -
dc.identifier.uri | http://hdl.handle.net/10722/341196 | -
dc.description.abstract | Many applications require recovering a ground truth low-rank matrix from noisy observations of the entries, which in practice is typically formulated as a weighted low-rank approximation problem and solved by non-convex optimization heuristics such as alternating minimization. In this paper, we provide a provable recovery guarantee for weighted low-rank approximation via a simple alternating minimization algorithm. In particular, for a natural class of matrices and weights, and without any assumption on the noise, we bound the spectral norm of the difference between the recovered matrix and the ground truth by the spectral norm of the weighted noise plus an additive error term that decreases exponentially with the number of rounds of alternating minimization, from either initialization by SVD or, more importantly, random initialization. These provide the first theoretical results for weighted low-rank approximation via alternating minimization with non-binary deterministic weights, significantly generalizing those for matrix completion, the special case with binary weights, since our assumptions are similar to or weaker than those made in existing works. Furthermore, this is achieved by a very simple algorithm that improves on vanilla alternating minimization with a simple clipping step. | -
dc.language | eng | -
dc.relation.ispartof | 33rd International Conference on Machine Learning, ICML 2016 | -
dc.title | Recovery guarantee of weighted low-rank approximation via alternating minimization | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-84998881881 | -
dc.identifier.volume | 5 | -
dc.identifier.spage | 3504 | -
dc.identifier.epage | 3526 | -
