Conference Paper: SAGA with Arbitrary Sampling

Title: SAGA with Arbitrary Sampling
Authors: Qian, X; Qu, Z; Richtarik, P
Issue Date: 2019
Publisher: International Machine Learning Society (IMLS). The Proceedings' web site is located at http://proceedings.mlr.press/
Citation: Proceedings of the 36th International Conference on Machine Learning (ICML 2019), Long Beach, CA, USA, 10-15 June 2019. In Proceedings of Machine Learning Research (PMLR), v. 97, p. 5190-5199
Abstract: We study the problem of minimizing the average of a very large number of smooth functions, which is of key importance in training supervised learning models. One of the most celebrated methods in this context is the SAGA algorithm of Defazio et al. (2014). Despite years of research on the topic, a general-purpose version of SAGA—one that would include arbitrary importance sampling and minibatching schemes—does not exist. We remedy this situation and propose a general and flexible variant of SAGA following the arbitrary sampling paradigm. We perform an iteration complexity analysis of the method, largely possible due to the construction of new stochastic Lyapunov functions. We establish linear convergence rates in the smooth and strongly convex regime, and under certain error bound conditions also in a regime without strong convexity. Our rates match those of the primal-dual method Quartz (Qu et al., 2015) for which an arbitrary sampling analysis is available, which makes a significant step towards closing the gap in our understanding of complexity of primal and dual methods for finite sum problems. Finally, we show through experiments that specific variants of our general SAGA method can perform better in practice than other competing methods.
Persistent Identifier: http://hdl.handle.net/10722/275289
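
The abstract describes a minibatch variant of SAGA in which each function f_i may be drawn with its own probability p_i. As an illustration only (not the authors' reference implementation), the sketch below applies one instance of this idea to a small ridge-regression problem: an independent sampling with probabilities proportional to per-example smoothness constants, a 1/(n p_i) importance weighting in the gradient estimator, and a conservative heuristic step size. The function name, sampling scheme, problem instance, and step size are all assumptions made for the sketch.

```python
# Minimal sketch of minibatch SAGA with an arbitrary-sampling-style gradient
# estimator, illustrated on ridge regression. Not the paper's implementation;
# sampling probabilities and step size are heuristic choices for this sketch.
import numpy as np

def saga_arbitrary_sampling(A, b, lam=0.1, probs=None, steps=2000, lr=None, seed=0):
    """Minimize f(x) = (1/n) * sum_i [ 0.5*(a_i^T x - b_i)^2 + 0.5*lam*||x||^2 ]."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    # Per-example smoothness constants for this loss: L_i = ||a_i||^2 + lam.
    L = np.sum(A * A, axis=1) + lam
    if probs is None:
        # Assumption for the sketch: each i enters the minibatch independently
        # with probability proportional to L_i, targeting ~n/10 samples per step.
        tau = max(1, n // 10)
        probs = np.minimum(1.0, tau * L / L.sum())
    if lr is None:
        # Conservative heuristic step size; not the rate constant from the paper.
        lr = 1.0 / (4.0 * np.max(L / (n * probs)) + np.max(L))

    x = np.zeros(d)
    J = np.zeros((n, d))        # table of last-seen per-example gradients
    J_avg = J.mean(axis=0)      # running average of the table

    def grad_i(i, x):
        return (A[i] @ x - b[i]) * A[i] + lam * x

    for _ in range(steps):
        S = np.flatnonzero(rng.random(n) < probs)   # one draw of the random set
        g = J_avg.copy()
        fresh = {}
        for i in S:
            gi = grad_i(i, x)
            fresh[i] = gi
            # Weighting by 1/(n * p_i) keeps the estimator unbiased:
            # E[ sum_{i in S} (g_i - J_i)/(n p_i) ] = (1/n) * sum_i (g_i - J_i).
            g += (gi - J[i]) / (n * probs[i])
        x -= lr * g
        for i, gi in fresh.items():                 # refresh the gradient table
            J_avg += (gi - J[i]) / n
            J[i] = gi
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 20))
    b = rng.standard_normal(200)
    x = saga_arbitrary_sampling(A, b)
    grad = A.T @ (A @ x - b) / len(b) + 0.1 * x     # full gradient at the output
    print("||grad f(x)|| =", np.linalg.norm(grad))
```

Passing uniform probabilities probs = tau/n recovers an ordinary uniform minibatch SAGA estimator (up to the difference between independent and fixed-size minibatches), which is one reason the arbitrary sampling formulation subsumes the common special cases.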


DC Field: Value
dc.contributor.author: Qian, X
dc.contributor.author: Qu, Z
dc.contributor.author: Richtarik, P
dc.date.accessioned: 2019-09-10T02:39:30Z
dc.date.available: 2019-09-10T02:39:30Z
dc.date.issued: 2019
dc.identifier.citation: Proceedings of the 36th International Conference on Machine Learning (ICML 2019), Long Beach, CA, USA, 10-15 June 2019. In Proceedings of Machine Learning Research (PMLR), v. 97, p. 5190-5199
dc.identifier.uri: http://hdl.handle.net/10722/275289
dc.description.abstract: We study the problem of minimizing the average of a very large number of smooth functions, which is of key importance in training supervised learning models. One of the most celebrated methods in this context is the SAGA algorithm of Defazio et al. (2014). Despite years of research on the topic, a general-purpose version of SAGA—one that would include arbitrary importance sampling and minibatching schemes—does not exist. We remedy this situation and propose a general and flexible variant of SAGA following the arbitrary sampling paradigm. We perform an iteration complexity analysis of the method, largely possible due to the construction of new stochastic Lyapunov functions. We establish linear convergence rates in the smooth and strongly convex regime, and under certain error bound conditions also in a regime without strong convexity. Our rates match those of the primal-dual method Quartz (Qu et al., 2015) for which an arbitrary sampling analysis is available, which makes a significant step towards closing the gap in our understanding of complexity of primal and dual methods for finite sum problems. Finally, we show through experiments that specific variants of our general SAGA method can perform better in practice than other competing methods.
dc.language: eng
dc.publisher: International Machine Learning Society (IMLS). The Proceedings' web site is located at http://proceedings.mlr.press/
dc.relation.ispartof: Proceedings of Machine Learning Research (PMLR)
dc.relation.ispartof: The 36th International Conference on Machine Learning (ICML 2019)
dc.title: SAGA with Arbitrary Sampling
dc.type: Conference_Paper
dc.identifier.email: Qu, Z: zhengqu@hku.hk
dc.identifier.authority: Qu, Z=rp02096
dc.identifier.hkuros: 304642
dc.identifier.volume: 97
dc.identifier.spage: 5190
dc.identifier.epage: 5199
dc.publisher.place: United States
