Links for fulltext (May Require Subscription):
- Publisher Website (DOI): 10.1162/NECO_a_00930
- Scopus: eid_2-s2.0-85013675736
- PMID: 28095196
- Web of Science: WOS:000395564100010
Article: Analysis of online composite mirror descent algorithm
Title | Analysis of online composite mirror descent algorithm |
---|---|
Authors | Lei, Yunwen; Zhou, Ding Xuan |
Issue Date | 2017 |
Citation | Neural Computation, 2017, v. 29, n. 3, p. 825-860 |
Abstract | We study the convergence of the online composite mirror descent algorithm, which involves a mirror map to reflect the geometry of the data and a convex objective function consisting of a loss and a regularizer possibly inducing sparsity. Our error analysis provides convergence rates in terms of properties of the strongly convex differentiable mirror map and the objective function. For a class of objective functions with Hölder continuous gradients, the convergence rates of the excess (regularized) risk under polynomially decaying step sizes have the order O(T^{-1/2} log T) after T iterates. Our results improve the existing error analysis for the online composite mirror descent algorithm by avoiding averaging and removing boundedness assumptions, and they sharpen the existing convergence rates of the last iterate for online gradient descent without any boundedness assumptions. Our methodology mainly depends on a novel error decomposition in terms of an excess Bregman distance, refined analysis of self-bounding properties of the objective function, and the resulting one-step progress bounds. |
Persistent Identifier | http://hdl.handle.net/10722/330002 |
ISSN | 0899-7667 (2023 Impact Factor: 2.7; 2023 SCImago Journal Rankings: 0.948) |
ISI Accession Number ID | WOS:000395564100010 |
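The abstract describes the algorithm only at a high level. As a rough illustration, below is a minimal sketch of one standard instantiation, assuming the squared-Euclidean mirror map (under which each composite update reduces to a proximal gradient step), an l1 regularizer, a least-squares loss, and step sizes of the polynomially decaying form eta_t = eta_1 * t^(-1/2) mentioned in the abstract. The function names, data model, and parameter values are illustrative assumptions, not code from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: shrink each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def online_composite_mirror_descent(stream, dim, lam=0.1, eta1=0.1, theta=0.5):
    # With the squared-Euclidean mirror map Phi(w) = ||w||^2 / 2, the Bregman
    # distance is the squared Euclidean distance, so the composite update
    #   w_{t+1} = argmin_w <grad f_t(w_t), w> + lam*||w||_1 + D_Phi(w, w_t)/eta_t
    # reduces to a proximal gradient step with soft-thresholding.
    w = np.zeros(dim)
    for t, (x, y) in enumerate(stream, start=1):
        eta = eta1 * t ** (-theta)                     # polynomially decaying step size
        grad = (w @ x - y) * x                         # gradient of the loss 0.5*(w.x - y)^2
        w = soft_threshold(w - eta * grad, eta * lam)  # composite (prox) step
    return w

def synthetic_stream(n, dim, w_star, rng):
    # Hypothetical data stream: noisy linear observations of a sparse target.
    for _ in range(n):
        x = rng.standard_normal(dim)
        yield x, x @ w_star + 0.01 * rng.standard_normal()

rng = np.random.default_rng(0)
w_star = np.zeros(50)
w_star[:5] = 1.0
w_hat = online_composite_mirror_descent(synthetic_stream(2000, 50, w_star, rng), dim=50)
print(np.round(w_hat[:8], 2))  # the first five coordinates should dominate
```

Other mirror maps (e.g., p-norm maps) would change only the Bregman-distance term in the update; the paper's analysis applies to strongly convex differentiable mirror maps in general, not just this Euclidean special case.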
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Zhou, Ding Xuan | - |
dc.date.accessioned | 2023-08-09T03:37:06Z | - |
dc.date.available | 2023-08-09T03:37:06Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | Neural Computation, 2017, v. 29, n. 3, p. 825-860 | - |
dc.identifier.issn | 0899-7667 | - |
dc.identifier.uri | http://hdl.handle.net/10722/330002 | - |
dc.description.abstract | We study the convergence of the online composite mirror descent algorithm, which involves a mirror map to reflect the geometry of the data and a convex objective function consisting of a loss and a regularizer possibly inducing sparsity. Our error analysis provides convergence rates in terms of properties of the strongly convex differentiable mirror map and the objective function. For a class of objective functions with Hölder continuous gradients, the convergence rates of the excess (regularized) risk under polynomially decaying step sizes have the order O(T^{-1/2} log T) after T iterates. Our results improve the existing error analysis for the online composite mirror descent algorithm by avoiding averaging and removing boundedness assumptions, and they sharpen the existing convergence rates of the last iterate for online gradient descent without any boundedness assumptions. Our methodology mainly depends on a novel error decomposition in terms of an excess Bregman distance, refined analysis of self-bounding properties of the objective function, and the resulting one-step progress bounds. | -
dc.language | eng | - |
dc.relation.ispartof | Neural Computation | - |
dc.title | Analysis of online composite mirror descent algorithm | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1162/NECO_a_00930 | - |
dc.identifier.pmid | 28095196 | - |
dc.identifier.scopus | eid_2-s2.0-85013675736 | - |
dc.identifier.volume | 29 | - |
dc.identifier.issue | 3 | - |
dc.identifier.spage | 825 | - |
dc.identifier.epage | 860 | - |
dc.identifier.eissn | 1530-888X | - |
dc.identifier.isi | WOS:000395564100010 | - |