Conference Paper: Recovery guarantee of non-negative matrix factorization via alternating updates

Title: Recovery guarantee of non-negative matrix factorization via alternating updates
Authors: Li, Yuanzhi; Liang, Yingyu; Risteski, Andrej
Issue Date: 2016
Citation: Advances in Neural Information Processing Systems, 2016, p. 4994-5002
Abstract: Non-negative matrix factorization is a popular tool for decomposing data into feature and weight matrices under non-negativity constraints. It enjoys practical success but is poorly understood theoretically. This paper proposes an algorithm that alternates between decoding the weights and updating the features, and shows that, assuming a generative model of the data, it provably recovers the ground truth under fairly mild conditions. In particular, its only essential requirement on the features is linear independence. Furthermore, the algorithm uses ReLU to exploit the non-negativity for decoding the weights, and thus can tolerate adversarial noise that can potentially be as large as the signal, and can tolerate unbiased noise much larger than the signal. The analysis relies on a carefully designed coupling between two potential functions, which we believe is of independent interest.
Persistent Identifier: http://hdl.handle.net/10722/341205
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
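
The abstract above describes an alternating scheme: decode the non-negative weights with a ReLU step, then update the features. The following is a minimal, hypothetical sketch in Python/NumPy of such an alternating decode-and-update loop on synthetic data; the dimensions, noise level, warm start, and least-squares update rule are illustrative assumptions and not the paper's exact algorithm or analysis conditions.

```python
# Illustrative sketch (not the paper's exact algorithm): alternating
# decode-and-update for NMF on synthetic data Y = A_true @ X_true + noise.
import numpy as np

rng = np.random.default_rng(0)
d, r, n = 50, 10, 2000  # data dim, number of features, number of samples (hypothetical sizes)

A_true = rng.random((d, r))                                 # ground-truth non-negative features
X_true = np.maximum(rng.standard_normal((r, n)), 0.0)       # non-negative ground-truth weights
Y = A_true @ X_true + 0.01 * rng.standard_normal((d, n))    # noisy observations

A = A_true + 0.1 * rng.standard_normal((d, r))              # warm-start estimate of the features

for it in range(50):
    # Decoding step: estimate the weights by a least-squares solve,
    # then apply ReLU to exploit non-negativity (as the abstract describes).
    X = np.maximum(np.linalg.pinv(A) @ Y, 0.0)
    # Update step: refit the features to the decoded weights
    # (a plain least-squares refit here; the paper's update rule may differ).
    A = Y @ np.linalg.pinv(X)

err = np.linalg.norm(A @ X - Y) / np.linalg.norm(Y)
print(f"relative reconstruction error after alternating updates: {err:.3e}")
```
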

 

DC Field: Value
dc.contributor.author: Li, Yuanzhi
dc.contributor.author: Liang, Yingyu
dc.contributor.author: Risteski, Andrej
dc.date.accessioned: 2024-03-13T08:41:00Z
dc.date.available: 2024-03-13T08:41:00Z
dc.date.issued: 2016
dc.identifier.citation: Advances in Neural Information Processing Systems, 2016, p. 4994-5002
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/341205
dc.description.abstract: Non-negative matrix factorization is a popular tool for decomposing data into feature and weight matrices under non-negativity constraints. It enjoys practical success but is poorly understood theoretically. This paper proposes an algorithm that alternates between decoding the weights and updating the features, and shows that, assuming a generative model of the data, it provably recovers the ground truth under fairly mild conditions. In particular, its only essential requirement on the features is linear independence. Furthermore, the algorithm uses ReLU to exploit the non-negativity for decoding the weights, and thus can tolerate adversarial noise that can potentially be as large as the signal, and can tolerate unbiased noise much larger than the signal. The analysis relies on a carefully designed coupling between two potential functions, which we believe is of independent interest.
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Recovery guarantee of non-negative matrix factorization via alternating updates
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85018889908
dc.identifier.spage: 4994
dc.identifier.epage: 5002
