Article: Restarting the accelerated coordinate descent method with a rough strong convexity estimate

Title: Restarting the accelerated coordinate descent method with a rough strong convexity estimate
Authors: Fercoq, O; Qu, Z
Keywords: Accelerated coordinate descent; Restarting strategies; Unknown strong convexity; Local quadratic error bound
Issue Date: 2020
Publisher: Springer New York LLC. The journal's web site is located at http://springerlink.metapress.com/openurl.asp?genre=journal&issn=0926-6003
Citation: Computational Optimization and Applications, 2020, v. 75, p. 63-91
Abstract: We propose new restarting strategies for the accelerated coordinate descent method. Our main contribution is to show that, for a well-chosen sequence of restarting times, the restarted method has a nearly geometric rate of convergence. A major feature of the method is that it can exploit the local quadratic error bound of the objective function without knowing the actual value of the error bound. We also show that under the more restrictive assumption that the objective function is strongly convex, any fixed restart period leads to a geometric rate of convergence. Finally, we illustrate the properties of the algorithm on a regularized logistic regression problem and on a Lasso problem.
Persistent Identifier: http://hdl.handle.net/10722/290916
ISSN: 0926-6003
2019 Impact Factor: 1.743
2015 SCImago Journal Rankings: 1.481
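The fixed-restart scheme mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: to keep it short it restarts a full-gradient accelerated method (Nesterov momentum) rather than accelerated coordinate descent, on a toy strongly convex quadratic. The objective, the restart period, and all names are hypothetical choices made for the sketch.

```python
# Illustrative sketch only: the paper restarts an accelerated *coordinate*
# descent method; here we restart a full-gradient accelerated method instead.
# The quadratic objective, the restart period of 20, and the iteration
# budget are all hypothetical choices for this example.

def f(diag, x):
    """f(x) = 0.5 * sum_i d_i * x_i^2, a strongly convex quadratic."""
    return 0.5 * sum(d * xi * xi for d, xi in zip(diag, x))

def restarted_agd(diag, x0, restart_period, n_iters):
    """Accelerated gradient descent on f, wiping the momentum state
    every `restart_period` iterations (a fixed restart schedule)."""
    L = max(diag)                       # gradient Lipschitz constant of f
    x, y, t = list(x0), list(x0), 1.0   # t drives Nesterov's extrapolation
    for k in range(n_iters):
        if k % restart_period == 0:     # fixed-period restart
            y, t = list(x), 1.0
        # gradient step from the extrapolated point y: grad_i f(y) = d_i*y_i
        x_new = [yi - diag[i] * yi / L for i, yi in enumerate(y)]
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        beta = (t - 1.0) / t_new        # momentum weight
        y = [x_new[i] + beta * (x_new[i] - x[i]) for i in range(len(x))]
        x, t = x_new, t_new
    return x

diag = [1.0, 10.0, 100.0]               # condition number 100
x_final = restarted_agd(diag, [1.0, 1.0, 1.0], restart_period=20, n_iters=200)
print(f(diag, x_final))                 # small: the restarted run converges
```

The point of the paper is precisely that a good restart period depends on the (unknown) strong convexity or local error bound constant; the abstract's claim is that a well-chosen sequence of restarting times recovers a nearly geometric rate without that knowledge, while any fixed period (as hard-coded above) already gives a geometric rate under strong convexity.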

 

DC Field: Value

dc.contributor.author: Fercoq, O
dc.contributor.author: Qu, Z
dc.date.accessioned: 2020-11-02T05:48:55Z
dc.date.available: 2020-11-02T05:48:55Z
dc.date.issued: 2020
dc.identifier.citation: Computational Optimization and Applications, 2020, v. 75, p. 63-91
dc.identifier.issn: 0926-6003
dc.identifier.uri: http://hdl.handle.net/10722/290916
dc.description.abstract: We propose new restarting strategies for the accelerated coordinate descent method. Our main contribution is to show that, for a well-chosen sequence of restarting times, the restarted method has a nearly geometric rate of convergence. A major feature of the method is that it can exploit the local quadratic error bound of the objective function without knowing the actual value of the error bound. We also show that under the more restrictive assumption that the objective function is strongly convex, any fixed restart period leads to a geometric rate of convergence. Finally, we illustrate the properties of the algorithm on a regularized logistic regression problem and on a Lasso problem.
dc.language: eng
dc.publisher: Springer New York LLC. The journal's web site is located at http://springerlink.metapress.com/openurl.asp?genre=journal&issn=0926-6003
dc.relation.ispartof: Computational Optimization and Applications
dc.rights: This is a post-peer-review, pre-copyedit version of an article published in Computational Optimization and Applications. The final authenticated version is available online at: https://doi.org/10.1007/s10589-019-00137-2
dc.subject: Accelerated coordinate descent
dc.subject: Restarting strategies
dc.subject: Unknown strong convexity
dc.subject: Local quadratic error bound
dc.title: Restarting the accelerated coordinate descent method with a rough strong convexity estimate
dc.type: Article
dc.identifier.email: Qu, Z: zhengqu@hku.hk
dc.identifier.authority: Qu, Z=rp02096
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1007/s10589-019-00137-2
dc.identifier.scopus: eid_2-s2.0-85074521068
dc.identifier.hkuros: 317753
dc.identifier.volume: 75
dc.identifier.spage: 63
dc.identifier.epage: 91
dc.publisher.place: United States
