Article: Convergence of online mirror descent

Title: Convergence of online mirror descent
Authors: Lei, Yunwen; Zhou, Ding Xuan
Keywords: Bregman distance; Convergence analysis; Learning theory; Mirror descent; Online learning
Issue Date: 2020
Citation: Applied and Computational Harmonic Analysis, 2020, v. 48, n. 1, p. 343-373
Abstract: In this paper we consider online mirror descent (OMD), a class of scalable online learning algorithms exploiting data geometric structures through mirror maps. Necessary and sufficient conditions are presented in terms of the step size sequence {η_t}_t for the convergence of OMD with respect to the expected Bregman distance induced by the mirror map. The condition is lim_{t→∞} η_t = 0 and ∑_{t=1}^∞ η_t = ∞ in the case of positive variances; it reduces to ∑_{t=1}^∞ η_t = ∞ in the case of zero variance, for which linear convergence may be achieved by taking a constant step size sequence. A sufficient condition for almost sure convergence is also given. We establish tight error bounds under mild conditions on the mirror map, the loss function, and the regularizer. Our results rely on a novel analysis of the one-step progress of OMD using smoothness and strong convexity of the mirror map and the loss function.
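To illustrate the setting the abstract describes, the following is a minimal sketch of online mirror descent on the probability simplex with the negative-entropy mirror map (the exponentiated-gradient update). It is not code from the paper; the function name, the choice of mirror map, and the specific step sizes η_t = c/√t are illustrative assumptions, chosen so that η_t → 0 and ∑_t η_t = ∞, the condition the abstract states for the positive-variance case.

```python
import numpy as np

def omd_simplex(grad_fn, T, d, c=1.0):
    """Sketch of online mirror descent on the probability simplex using
    the negative-entropy mirror map (exponentiated-gradient update).

    Illustrative step sizes eta_t = c / sqrt(t) satisfy eta_t -> 0 and
    sum_{t>=1} eta_t = infinity, matching the convergence condition
    stated in the abstract for the positive-variance case."""
    w = np.full(d, 1.0 / d)           # start at the uniform distribution
    for t in range(1, T + 1):
        eta = c / np.sqrt(t)
        g = grad_fn(w, t)             # (possibly stochastic) gradient at w
        w = w * np.exp(-eta * g)      # mirror step in the dual space
        w /= w.sum()                  # Bregman projection onto the simplex
    return w

# Example: minimize f(w) = 0.5 * ||w - p||^2 over the simplex, where p is
# itself a probability vector, so the minimizer is p and grad f(w) = w - p.
p = np.array([0.7, 0.2, 0.1])
w_final = omd_simplex(lambda w, t: w - p, T=2000, d=3)
```

With a deterministic gradient as above (the zero-variance case), the abstract notes that a constant step size sequence may even yield linear convergence; the decaying schedule is what the positive-variance (stochastic) case requires.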
Persistent Identifier: http://hdl.handle.net/10722/329838
ISSN: 1063-5203
2023 Impact Factor: 2.6
2023 SCImago Journal Rankings: 2.231
ISI Accession Number ID: WOS:000492487800013


DC Field: Value
dc.contributor.author: Lei, Yunwen
dc.contributor.author: Zhou, Ding Xuan
dc.date.accessioned: 2023-08-09T03:35:42Z
dc.date.available: 2023-08-09T03:35:42Z
dc.date.issued: 2020
dc.identifier.citation: Applied and Computational Harmonic Analysis, 2020, v. 48, n. 1, p. 343-373
dc.identifier.issn: 1063-5203
dc.identifier.uri: http://hdl.handle.net/10722/329838
dc.description.abstract: In this paper we consider online mirror descent (OMD), a class of scalable online learning algorithms exploiting data geometric structures through mirror maps. Necessary and sufficient conditions are presented in terms of the step size sequence {η_t}_t for the convergence of OMD with respect to the expected Bregman distance induced by the mirror map. The condition is lim_{t→∞} η_t = 0 and ∑_{t=1}^∞ η_t = ∞ in the case of positive variances; it reduces to ∑_{t=1}^∞ η_t = ∞ in the case of zero variance, for which linear convergence may be achieved by taking a constant step size sequence. A sufficient condition for almost sure convergence is also given. We establish tight error bounds under mild conditions on the mirror map, the loss function, and the regularizer. Our results rely on a novel analysis of the one-step progress of OMD using smoothness and strong convexity of the mirror map and the loss function.
dc.language: eng
dc.relation.ispartof: Applied and Computational Harmonic Analysis
dc.subject: Bregman distance
dc.subject: Convergence analysis
dc.subject: Learning theory
dc.subject: Mirror descent
dc.subject: Online learning
dc.title: Convergence of online mirror descent
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1016/j.acha.2018.05.005
dc.identifier.scopus: eid_2-s2.0-85047642697
dc.identifier.volume: 48
dc.identifier.issue: 1
dc.identifier.spage: 343
dc.identifier.epage: 373
dc.identifier.eissn: 1096-603X
dc.identifier.isi: WOS:000492487800013
