
Conference Paper: Even faster accelerated coordinate descent using non-uniform sampling

Title: Even faster accelerated coordinate descent using non-uniform sampling
Authors: Allen-Zhu, Z; Qu, Z; Richtarik, P; Yuan, Y
Issue Date: 2016
Publisher: MIT Press. The Journal's web site is located at http://mitpress.mit.edu/jmlr
Citation: The 33rd International Conference on Machine Learning (ICML 2016), New York, NY, 19-24 June 2016. In JMLR: Workshop and Conference Proceedings, 2016, v. 48, p. 1-10
Abstract: Accelerated coordinate descent is widely used in optimization due to its cheap per-iteration cost and scalability to large-scale problems. Up to a primal-dual transformation, it is also the same as accelerated stochastic gradient descent, which is one of the central methods used in machine learning. In this paper, we improve the best known running time of accelerated coordinate descent by a factor of up to the square root of n. Our improvement is based on a clean, novel non-uniform sampling that selects each coordinate with probability proportional to the square root of its smoothness parameter. Our proof technique also deviates from the classical estimation-sequence technique used in prior work. Our speed-up applies to important problems such as empirical risk minimization and solving linear systems, both in theory and in practice.
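The sampling rule described in the abstract can be sketched in a few lines. The snippet below is an illustrative sketch only: it shows plain (non-accelerated) randomized coordinate descent on a quadratic, with coordinates drawn with probability proportional to the square root of their coordinate-wise smoothness constants, as the paper proposes. The function name and the test problem are assumptions for the example; the full accelerated method in the paper adds momentum sequences on top of this sampling.

```python
import numpy as np

def coordinate_descent_sqrt_sampling(A, b, iters=5000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x by randomized coordinate descent,
    sampling coordinate i with probability proportional to sqrt(L_i)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    L = np.diag(A)                       # coordinate smoothness L_i of the quadratic
    p = np.sqrt(L) / np.sqrt(L).sum()    # sampling distribution p_i ∝ sqrt(L_i)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(n, p=p)
        grad_i = A[i] @ x - b[i]         # partial derivative along coordinate i
        x[i] -= grad_i / L[i]            # exact coordinate step (step size 1/L_i)
    return x

# Usage on a small symmetric positive definite system A x = b
n = 5
M = np.random.default_rng(1).normal(size=(n, n))
A = M @ M.T + n * np.eye(n)
b = np.ones(n)
x = coordinate_descent_sqrt_sampling(A, b)
residual = np.linalg.norm(A @ x - b)     # should be tiny after 5000 iterations
```

Uniform sampling (p_i = 1/n) would also converge here; the point of the sqrt(L_i) rule is that its worst-case iteration complexity depends on the average of the sqrt(L_i) rather than the maximum smoothness, which is where the up-to-sqrt(n) speed-up comes from.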
Description: This journal volume is entitled: Proceedings of the 33rd International Conference on Machine Learning, ICML 2016
The full version of this paper can be found at http://arxiv.org/abs/1512.09103
Persistent Identifier: http://hdl.handle.net/10722/235017
ISSN: 1532-4435
2015 Impact Factor: 2.45
2015 SCImago Journal Rankings: 1.648

 

DC Field: Value
dc.contributor.author: Allen-Zhu, Z
dc.contributor.author: Qu, Z
dc.contributor.author: Richtarik, P
dc.contributor.author: Yuan, Y
dc.date.accessioned: 2016-10-14T13:50:44Z
dc.date.available: 2016-10-14T13:50:44Z
dc.date.issued: 2016
dc.identifier.citation: The 33rd International Conference on Machine Learning (ICML 2016), New York, NY, 19-24 June 2016. In JMLR: Workshop and Conference Proceedings, 2016, v. 48, p. 1-10
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/10722/235017
dc.description: This journal volume is entitled: Proceedings of the 33rd International Conference on Machine Learning, ICML 2016
dc.description: The full version of this paper can be found at http://arxiv.org/abs/1512.09103
dc.description.abstract: Accelerated coordinate descent is widely used in optimization due to its cheap per-iteration cost and scalability to large-scale problems. Up to a primal-dual transformation, it is also the same as accelerated stochastic gradient descent, which is one of the central methods used in machine learning. In this paper, we improve the best known running time of accelerated coordinate descent by a factor of up to the square root of n. Our improvement is based on a clean, novel non-uniform sampling that selects each coordinate with probability proportional to the square root of its smoothness parameter. Our proof technique also deviates from the classical estimation-sequence technique used in prior work. Our speed-up applies to important problems such as empirical risk minimization and solving linear systems, both in theory and in practice.
dc.language: eng
dc.publisher: MIT Press. The Journal's web site is located at http://mitpress.mit.edu/jmlr
dc.relation.ispartof: Journal of Machine Learning Research
dc.rights: Journal of Machine Learning Research. Copyright © MIT Press.
dc.rights: Creative Commons: Attribution 3.0 Hong Kong License
dc.rights: Author holds the copyright
dc.title: Even faster accelerated coordinate descent using non-uniform sampling
dc.type: Conference_Paper
dc.identifier.email: Qu, Z: zhengqu@hku.hk
dc.identifier.authority: Qu, Z=rp02096
dc.description.nature: published_or_final_version
dc.identifier.hkuros: 269839
dc.identifier.volume: 48
dc.identifier.spage: 1
dc.identifier.epage: 10
dc.publisher.place: United States
dc.customcontrol.immutable: sml 161017 - full text embargo till 170601
