Conference Paper: Recovery guarantees for quadratic tensors with sparse observations

Title: Recovery guarantees for quadratic tensors with sparse observations
Authors: Zhang, Hongyang; Sharan, Vatsal; Charikar, Moses; Liang, Yingyu
Issue Date: 2020
Citation: AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics, 2020
Abstract: We consider the tensor completion problem of predicting the missing entries of a tensor. The commonly used CP model has a triple product form, but an alternate family of quadratic models which are the sum of pairwise products instead of a triple product have emerged from applications such as recommendation systems. Non-convex methods are the method of choice for learning quadratic models, and this work examines their sample complexity and error guarantee. Our main result is that with the number of samples being only linear in the dimension, all local minima of the mean squared error objective are global minima and recover the original tensor. We substantiate our theoretical results with experiments on synthetic and real-world data.
Persistent Identifier: http://hdl.handle.net/10722/341273
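
The abstract contrasts the CP model (a triple product of factor rows) with quadratic models that sum pairwise products, and studies non-convex minimization of the mean squared error over sparsely observed entries. As a rough illustration of that setup, the sketch below writes out both model forms and the observed-entry MSE objective in NumPy; the dimension, rank, factor names (U, V, W), and sampling scheme are illustrative assumptions, not the authors' code or experimental settings.

    import numpy as np

    rng = np.random.default_rng(0)
    d, r = 30, 5                                   # dimension and rank (illustrative values)
    U, V, W = (rng.standard_normal((d, r)) for _ in range(3))

    def cp_entry(i, j, k):
        # CP model: a triple product, T[i,j,k] = sum_m U[i,m] * V[j,m] * W[k,m].
        return float(np.sum(U[i] * V[j] * W[k]))

    def quadratic_entry(i, j, k):
        # Quadratic model: a sum of pairwise products,
        # T[i,j,k] = <U[i], V[j]> + <V[j], W[k]> + <U[i], W[k]>.
        return float(U[i] @ V[j] + V[j] @ W[k] + U[i] @ W[k])

    # Sparse observations: sample a number of entries that is linear in the dimension d.
    n_obs = 10 * d
    obs = [tuple(rng.integers(0, d, size=3)) for _ in range(n_obs)]
    target = {idx: quadratic_entry(*idx) for idx in obs}   # synthetic ground truth

    def observed_mse(U_hat, V_hat, W_hat):
        # Mean squared error of the quadratic model over the observed entries only;
        # this is the non-convex objective whose local minima the paper analyzes.
        residuals = [
            (U_hat[i] @ V_hat[j] + V_hat[j] @ W_hat[k] + U_hat[i] @ W_hat[k]) - target[(i, j, k)]
            for (i, j, k) in obs
        ]
        return float(np.mean(np.square(residuals)))

Running a local method such as gradient descent on observed_mse from a random initialization is the kind of non-convex procedure the recovery guarantee concerns: with a number of samples linear in the dimension, the paper's claim is that all local minima of this objective are global minima and recover the original tensor.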

 

DC Field | Value | Language
dc.contributor.author | Zhang, Hongyang | -
dc.contributor.author | Sharan, Vatsal | -
dc.contributor.author | Charikar, Moses | -
dc.contributor.author | Liang, Yingyu | -
dc.date.accessioned | 2024-03-13T08:41:31Z | -
dc.date.available | 2024-03-13T08:41:31Z | -
dc.date.issued | 2020 | -
dc.identifier.citation | AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics, 2020 | -
dc.identifier.uri | http://hdl.handle.net/10722/341273 | -
dc.description.abstract | We consider the tensor completion problem of predicting the missing entries of a tensor. The commonly used CP model has a triple product form, but an alternate family of quadratic models which are the sum of pairwise products instead of a triple product have emerged from applications such as recommendation systems. Non-convex methods are the method of choice for learning quadratic models, and this work examines their sample complexity and error guarantee. Our main result is that with the number of samples being only linear in the dimension, all local minima of the mean squared error objective are global minima and recover the original tensor. We substantiate our theoretical results with experiments on synthetic and real-world data. | -
dc.language | eng | -
dc.relation.ispartof | AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics | -
dc.title | Recovery guarantees for quadratic tensors with sparse observations | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85085000367 | -
