Article: An Algorithmic Framework of Generalized Primal–Dual Hybrid Gradient Methods for Saddle Point Problems

Title: An Algorithmic Framework of Generalized Primal–Dual Hybrid Gradient Methods for Saddle Point Problems
Authors: He, Bingsheng; Ma, Feng; Yuan, Xiaoming
Keywords: Convex programming; Convergence rate; Image restoration; Primal–dual hybrid gradient method; Saddle point problem; Variational inequalities; Variational models
Issue Date: 2017
Citation: Journal of Mathematical Imaging and Vision, 2017, v. 58, n. 2, p. 279-293
Abstract: © 2017, Springer Science+Business Media New York. The primal–dual hybrid gradient method (PDHG) originates from the Arrow–Hurwicz method, and it has been widely used to solve saddle point problems, particularly in image processing. With the introduction of a combination parameter, Chambolle and Pock proposed a generalized PDHG scheme with both theoretical and numerical advantages. It has been shown that, except for the special case where the combination parameter is 1, the PDHG scheme cannot be cast into the proximal point algorithm framework because the matrix associated with the proximal regularization terms lacks symmetry. The PDHG scheme is also nonsymmetric in the sense that one variable is updated twice while the other is updated only once at each iteration. These nonsymmetry features explain why several theoretical issues remain challenging for generalized PDHG schemes; for example, the worst-case convergence rate of the PDHG measured by the iteration complexity in a nonergodic sense is still missing. In this paper, we further consider how to generalize the PDHG and propose an algorithmic framework of generalized PDHG schemes for saddle point problems. This algorithmic framework allows the output of the PDHG subroutine to be further updated by correction steps with constant step sizes. We investigate the restrictions on these step sizes and conduct the convergence analysis for the algorithmic framework. The framework turns out to include some existing PDHG schemes as special cases, and it immediately yields a class of new generalized PDHG schemes by choosing different step sizes for the correction steps. In particular, a completely symmetric PDHG scheme with golden-ratio step sizes is included. Theoretically, an advantage of the algorithmic framework is that the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses can be established.
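For readers unfamiliar with the scheme the abstract refers to, the sketch below illustrates only the base PDHG subroutine with the combination parameter theta introduced by Chambolle and Pock; it does not reproduce the paper's correction steps or golden-ratio step sizes. The function name, the l1-regularized least-squares test problem, and the step-size choices are illustrative assumptions, not taken from the paper.

```python
# A minimal, hypothetical sketch of a PDHG iteration with a combination
# parameter theta (the Chambolle-Pock scheme the abstract refers to).
# It is NOT the corrected framework proposed in the paper.
# Example problem (assumed for illustration):
#   min_x 0.5*||A x - b||^2 + mu*||x||_1,
# written as the saddle point problem
#   min_x max_y <A x, y> + mu*||x||_1 - g*(y),  g*(y) = <b, y> + 0.5*||y||^2.
import numpy as np

def pdhg_lasso(A, b, mu, theta=1.0, iters=500):
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(m)
    x_bar = x.copy()
    L = np.linalg.norm(A, 2)        # operator norm ||A||
    tau = sigma = 0.99 / L          # step sizes satisfying tau*sigma*||A||^2 < 1

    for _ in range(iters):
        # dual step: prox of sigma*g*, where g*(y) = <b, y> + 0.5*||y||^2
        y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
        # primal step: prox of tau*mu*||.||_1 (soft-thresholding)
        x_new = x - tau * (A.T @ y)
        x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * mu, 0.0)
        # extrapolation with the combination parameter theta
        x_bar = x_new + theta * (x_new - x)
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = pdhg_lasso(A, b, mu=0.1)
    print("residual:", np.linalg.norm(A @ x_hat - b))
```

Setting theta = 1 recovers the most common PDHG variant; the paper's framework appends correction steps with constant step sizes to the output of such a subroutine, which is what enables the ergodic and nonergodic convergence-rate results described in the abstract.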
Persistent Identifier: http://hdl.handle.net/10722/251202
ISSN: 0924-9907
2023 Impact Factor: 1.3
2023 SCImago Journal Rankings: 0.684
ISI Accession Number ID: WOS:000399828500006

 

DC Field | Value | Language
dc.contributor.author | He, Bingsheng | -
dc.contributor.author | Ma, Feng | -
dc.contributor.author | Yuan, Xiaoming | -
dc.date.accessioned | 2018-02-01T01:54:53Z | -
dc.date.available | 2018-02-01T01:54:53Z | -
dc.date.issued | 2017 | -
dc.identifier.citation | Journal of Mathematical Imaging and Vision, 2017, v. 58, n. 2, p. 279-293 | -
dc.identifier.issn | 0924-9907 | -
dc.identifier.uri | http://hdl.handle.net/10722/251202 | -
dc.description.abstract | © 2017, Springer Science+Business Media New York. The primal–dual hybrid gradient method (PDHG) originates from the Arrow–Hurwicz method, and it has been widely used to solve saddle point problems, particularly in image processing. With the introduction of a combination parameter, Chambolle and Pock proposed a generalized PDHG scheme with both theoretical and numerical advantages. It has been shown that, except for the special case where the combination parameter is 1, the PDHG scheme cannot be cast into the proximal point algorithm framework because the matrix associated with the proximal regularization terms lacks symmetry. The PDHG scheme is also nonsymmetric in the sense that one variable is updated twice while the other is updated only once at each iteration. These nonsymmetry features explain why several theoretical issues remain challenging for generalized PDHG schemes; for example, the worst-case convergence rate of the PDHG measured by the iteration complexity in a nonergodic sense is still missing. In this paper, we further consider how to generalize the PDHG and propose an algorithmic framework of generalized PDHG schemes for saddle point problems. This algorithmic framework allows the output of the PDHG subroutine to be further updated by correction steps with constant step sizes. We investigate the restrictions on these step sizes and conduct the convergence analysis for the algorithmic framework. The framework turns out to include some existing PDHG schemes as special cases, and it immediately yields a class of new generalized PDHG schemes by choosing different step sizes for the correction steps. In particular, a completely symmetric PDHG scheme with golden-ratio step sizes is included. Theoretically, an advantage of the algorithmic framework is that the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses can be established. | -
dc.language | eng | -
dc.relation.ispartof | Journal of Mathematical Imaging and Vision | -
dc.subject | Convex programming | -
dc.subject | Convergence rate | -
dc.subject | Image restoration | -
dc.subject | Primal–dual hybrid gradient method | -
dc.subject | Saddle point problem | -
dc.subject | Variational inequalities | -
dc.subject | Variational models | -
dc.title | An Algorithmic Framework of Generalized Primal–Dual Hybrid Gradient Methods for Saddle Point Problems | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1007/s10851-017-0709-5 | -
dc.identifier.scopus | eid_2-s2.0-85013414571 | -
dc.identifier.volume | 58 | -
dc.identifier.issue | 2 | -
dc.identifier.spage | 279 | -
dc.identifier.epage | 293 | -
dc.identifier.eissn | 1573-7683 | -
dc.identifier.isi | WOS:000399828500006 | -
dc.identifier.issnl | 0924-9907 | -
