
Conference Paper: Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning

Title: Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning
Authors: Yang, Zhenhuan; Lei, Yunwen; Wang, Puyu; Yang, Tianbao; Ying, Yiming
Issue Date: 2021
Citation: Advances in Neural Information Processing Systems, 2021, v. 24, p. 20160-20171
Abstract: Pairwise learning refers to learning tasks where the loss function depends on a pair of instances. It instantiates many important machine learning tasks such as bipartite ranking and metric learning. A popular approach to handle streaming data in pairwise learning is an online gradient descent (OGD) algorithm, where one needs to pair the current instance with a buffering set of previous instances of sufficiently large size, and therefore suffers from a scalability issue. In this paper, we propose simple stochastic and online gradient descent methods for pairwise learning. A notable difference from existing studies is that we pair the current instance only with the previous one in building a gradient direction, which is efficient in both storage and computation. We develop novel stability results, optimization, and generalization error bounds for both convex and nonconvex as well as both smooth and nonsmooth problems. We introduce novel techniques to decouple the dependency of models on the previous instance in both the optimization and generalization analysis. Our study resolves an open question on developing meaningful generalization bounds for OGD using a buffering set with a very small fixed size. We also extend our algorithms and stability analysis to develop differentially private SGD algorithms for pairwise learning, which significantly improve the existing results.
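The buffering-free update described in the abstract can be sketched as follows: at each step the current instance is paired only with the immediately preceding one, so the algorithm keeps O(1) state instead of a large buffer. This is a minimal illustrative sketch; the pairwise logistic ranking loss, the step size, and the toy data stream are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np

def pairwise_logistic_grad(w, z, z_prev):
    """Gradient in w of the pairwise logistic ranking loss
    l(w; z, z') = log(1 + exp(-(y - y') * <w, x - x'>)),
    where z = (x, y). Chosen here only as an example loss."""
    (x, y), (xp, yp) = z, z_prev
    diff = x - xp
    margin = (y - yp) * np.dot(w, diff)
    # d/dm log(1 + exp(-m)) = -1 / (1 + exp(m)); chain rule gives:
    return -(1.0 / (1.0 + np.exp(margin))) * (y - yp) * diff

def simple_pairwise_ogd(stream, dim, eta=0.1):
    """OGD for pairwise learning that pairs each incoming instance
    only with the previous one (buffer of size 1)."""
    w = np.zeros(dim)
    z_prev = None
    for z in stream:
        if z_prev is not None:
            w = w - eta * pairwise_logistic_grad(w, z, z_prev)
        z_prev = z  # O(1) storage: keep only the last instance
    return w

# Toy bipartite-ranking stream: positives centered at (+1,+1),
# negatives at (-1,-1). Same-label pairs contribute zero gradient.
rng = np.random.default_rng(0)
stream = [(rng.normal(y, 1.0, size=2), y)
          for y in rng.choice([-1.0, 1.0], size=200)]
w = simple_pairwise_ogd(stream, dim=2)
```

Each step costs one gradient evaluation on a single pair, in contrast to buffer-based OGD, whose per-step cost grows with the buffer size.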
Persistent Identifier: http://hdl.handle.net/10722/329812
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399

 

DC Field: Value
dc.contributor.author: Yang, Zhenhuan
dc.contributor.author: Lei, Yunwen
dc.contributor.author: Wang, Puyu
dc.contributor.author: Yang, Tianbao
dc.contributor.author: Ying, Yiming
dc.date.accessioned: 2023-08-09T03:35:30Z
dc.date.available: 2023-08-09T03:35:30Z
dc.date.issued: 2021
dc.identifier.citation: Advances in Neural Information Processing Systems, 2021, v. 24, p. 20160-20171
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/329812
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85130265605
dc.identifier.volume: 24
dc.identifier.spage: 20160
dc.identifier.epage: 20171
