Article: Learning theory of randomized sparse Kaczmarz method

Title: Learning theory of randomized sparse Kaczmarz method
Authors: Lei, Yunwen; Zhou, Ding Xuan
Keywords: Bregman distance; Learning theory; Linearized Bregman iteration; Online learning; Randomized sparse Kaczmarz algorithm
Issue Date: 2018
Citation: SIAM Journal on Imaging Sciences, 2018, v. 11, n. 1, p. 547-574
Abstract: In this paper we propose an online learning algorithm, a general randomized sparse Kaczmarz method, for generating sparse approximate solutions to linear systems, and present a learning theory analysis of its convergence. Under a mild assumption covering the case of noisy random measurements in the sampling process or a nonlinear regression function, we show that the algorithm converges in expectation if and only if the step size sequence $\{\eta_t\}_{t \in \mathbb{N}}$ satisfies $\lim_{t \to \infty} \eta_t = 0$ and $\sum_{t=1}^{\infty} \eta_t = \infty$. Convergence rates are also obtained, and linear convergence is shown to be impossible under the assumption of positive variance of the sampling process. A sufficient condition for almost sure convergence is derived under the additional restriction $\sum_{t=1}^{\infty} \eta_t^2 < \infty$. Our novel analysis is performed by interpreting the randomized sparse Kaczmarz method as a special online mirror descent algorithm with a nondifferentiable mirror map and using the Bregman distance. The necessary and sufficient conditions are derived by establishing a restricted variant of strong convexity for the involved generalization error and using the special structure of the soft-thresholding operator.
Persistent Identifier: http://hdl.handle.net/10722/329502
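
The abstract describes the method as a Kaczmarz-type stochastic update on an auxiliary variable followed by soft-thresholding (a linearized Bregman iteration, i.e., online mirror descent with a nondifferentiable mirror map). The Python sketch below illustrates that scheme under the standard linearized Bregman formulation; the step-size schedule, the thresholding parameter lam, and the synthetic data are illustrative assumptions, not details taken from the paper, which may state the update differently.

import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator S_lam(z) = sign(z) * max(|z| - lam, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def online_sparse_kaczmarz(samples, dim, lam=0.1, eta=lambda t: 1.0 / np.sqrt(t)):
    """Sketch of an online randomized sparse Kaczmarz / linearized Bregman iteration.

    samples: iterable of (x_t, y_t) pairs drawn from the sampling process.
    dim:     dimension of the unknown vector.
    lam:     soft-thresholding (sparsity) parameter (illustrative choice).
    eta:     step-size schedule t -> eta_t; per the abstract, convergence in
             expectation requires eta_t -> 0 and sum_t eta_t = infinity.
    """
    z = np.zeros(dim)  # auxiliary (dual) variable
    w = np.zeros(dim)  # sparse primal iterate, w_t = S_lam(z_t)
    for t, (x, y) in enumerate(samples, start=1):
        # Kaczmarz-type stochastic gradient step on the auxiliary variable
        z = z - eta(t) * (np.dot(w, x) - y) * x
        # mirror step: soft-thresholding yields the sparse primal iterate
        w = soft_threshold(z, lam)
    return w

if __name__ == "__main__":
    # Illustrative noisy random measurements of a sparse ground truth.
    rng = np.random.default_rng(0)
    d, n = 50, 5000
    w_star = np.zeros(d)
    w_star[:5] = rng.normal(size=5)            # 5-sparse target
    xs = rng.normal(size=(n, d))
    ys = xs @ w_star + 0.01 * rng.normal(size=n)
    w_hat = online_sparse_kaczmarz(zip(xs, ys), dim=d, lam=0.1)
    print("estimation error:", np.linalg.norm(w_hat - w_star))

The default schedule eta_t = 1/sqrt(t) satisfies the two conditions for convergence in expectation (eta_t -> 0 and sum eta_t = infinity) but not square-summability; a schedule such as eta_t = 1/t would additionally satisfy sum eta_t^2 < infinity, the extra restriction the abstract uses for almost sure convergence.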

 

DC Field / Value
dc.contributor.author: Lei, Yunwen
dc.contributor.author: Zhou, Ding Xuan
dc.date.accessioned: 2023-08-09T03:33:15Z
dc.date.available: 2023-08-09T03:33:15Z
dc.date.issued: 2018
dc.identifier.citation: SIAM Journal on Imaging Sciences, 2018, v. 11, n. 1, p. 547-574
dc.identifier.uri: http://hdl.handle.net/10722/329502
dc.description.abstract: In this paper we propose an online learning algorithm, a general randomized sparse Kaczmarz method, for generating sparse approximate solutions to linear systems, and present a learning theory analysis of its convergence. Under a mild assumption covering the case of noisy random measurements in the sampling process or a nonlinear regression function, we show that the algorithm converges in expectation if and only if the step size sequence $\{\eta_t\}_{t \in \mathbb{N}}$ satisfies $\lim_{t \to \infty} \eta_t = 0$ and $\sum_{t=1}^{\infty} \eta_t = \infty$. Convergence rates are also obtained, and linear convergence is shown to be impossible under the assumption of positive variance of the sampling process. A sufficient condition for almost sure convergence is derived under the additional restriction $\sum_{t=1}^{\infty} \eta_t^2 < \infty$. Our novel analysis is performed by interpreting the randomized sparse Kaczmarz method as a special online mirror descent algorithm with a nondifferentiable mirror map and using the Bregman distance. The necessary and sufficient conditions are derived by establishing a restricted variant of strong convexity for the involved generalization error and using the special structure of the soft-thresholding operator.
dc.language: eng
dc.relation.ispartof: SIAM Journal on Imaging Sciences
dc.subject: Bregman distance
dc.subject: Learning theory
dc.subject: Linearized Bregman iteration
dc.subject: Online learning
dc.subject: Randomized sparse Kaczmarz algorithm
dc.title: Learning theory of randomized sparse Kaczmarz method
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1137/17M1136225
dc.identifier.scopus: eid_2-s2.0-85045668273
dc.identifier.volume: 11
dc.identifier.issue: 1
dc.identifier.spage: 547
dc.identifier.epage: 574
dc.identifier.eissn: 1936-4954
