Article: On the convergence of primal-dual hybrid gradient algorithm

Title: On the convergence of primal-dual hybrid gradient algorithm
Authors: He, Bingsheng; You, Yanfei; Yuan, Xiaoming
Keywords: Primal-dual hybrid gradient algorithm
Saddle-point problem
Total variation
Convergence rate
Image restoration
Convex optimization
Issue Date: 2014
Publisher: Society for Industrial and Applied Mathematics. The Journal's web site is located at http://www.siam.org/journals/siims.php
Citation: SIAM Journal on Imaging Sciences, 2014, v. 7, n. 4, p. 2526-2537
Abstract: © 2014 Society for Industrial and Applied Mathematics. The primal-dual hybrid gradient algorithm (PDHG) has been widely used, especially for some basic image processing models. In the literature, PDHG's convergence was established only under some restrictive conditions on its step sizes. In this paper, we revisit PDHG's convergence in the context of a saddle-point problem and try to better understand how to choose its step sizes. More specifically, we show by an extremely simple example that PDHG is not necessarily convergent even when the step sizes are fixed as tiny constants. We then show that PDHG with constant step sizes is indeed convergent if one of the functions of the saddle-point problem is strongly convex, a condition that does hold for some variational models in imaging. With this additional condition, we also establish a worst-case convergence rate measured by the iteration complexity for PDHG with constant step sizes.
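The PDHG iteration the abstract discusses can be sketched on a toy saddle-point problem where both functions are strongly convex. The matrix A, the moduli mu and delta, the step sizes, and the starting point below are illustrative assumptions for a minimal numerical sketch, not values or experiments from the paper:

```python
import numpy as np

# Toy strongly convex saddle-point problem (illustrative, not from the paper):
#   min_x max_y  (mu/2)||x||^2 + y^T A x - (delta/2)||y||^2
# Its unique saddle point is (x, y) = (0, 0), so PDHG with constant
# step sizes should drive both iterates toward zero.

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # arbitrary coupling matrix
mu, delta = 1.0, 1.0                    # strong-convexity moduli
tau, sigma = 0.05, 0.05                 # constant primal/dual step sizes

x = np.array([1.0, 1.0])                # arbitrary starting point
y = np.array([1.0, 1.0])

for _ in range(2000):
    # Primal prox step: argmin_x (mu/2)||x||^2 + y^T A x + ||x - x_k||^2 / (2 tau)
    x = (x - tau * A.T @ y) / (1.0 + tau * mu)
    # Dual prox step: argmax_y y^T A x_{k+1} - (delta/2)||y||^2 - ||y - y_k||^2 / (2 sigma)
    y = (y + sigma * A @ x) / (1.0 + sigma * delta)

print(np.linalg.norm(x), np.linalg.norm(y))  # both norms shrink toward 0
```

Note that the paper's negative example concerns the purely bilinear case: without strong convexity (mu = delta = 0 here), the coupled updates can rotate without contracting, so PDHG may fail to converge even with tiny constant step sizes.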
Persistent Identifier: http://hdl.handle.net/10722/251082
ISSN: 1936-4954
2020 Impact Factor: 2.867
2020 SCImago Journal Rankings: 0.944
ISI Accession Number ID: WOS:000346854900021

 

DC Field: Value
dc.contributor.author: He, Bingsheng
dc.contributor.author: You, Yanfei
dc.contributor.author: Yuan, Xiaoming
dc.date.accessioned: 2018-02-01T01:54:31Z
dc.date.available: 2018-02-01T01:54:31Z
dc.date.issued: 2014
dc.identifier.citation: SIAM Journal on Imaging Sciences, 2014, v. 7, n. 4, p. 2526-2537
dc.identifier.issn: 1936-4954
dc.identifier.uri: http://hdl.handle.net/10722/251082
dc.description.abstract: © 2014 Society for Industrial and Applied Mathematics. The primal-dual hybrid gradient algorithm (PDHG) has been widely used, especially for some basic image processing models. In the literature, PDHG's convergence was established only under some restrictive conditions on its step sizes. In this paper, we revisit PDHG's convergence in the context of a saddle-point problem and try to better understand how to choose its step sizes. More specifically, we show by an extremely simple example that PDHG is not necessarily convergent even when the step sizes are fixed as tiny constants. We then show that PDHG with constant step sizes is indeed convergent if one of the functions of the saddle-point problem is strongly convex, a condition that does hold for some variational models in imaging. With this additional condition, we also establish a worst-case convergence rate measured by the iteration complexity for PDHG with constant step sizes.
dc.language: eng
dc.publisher: Society for Industrial and Applied Mathematics. The Journal's web site is located at http://www.siam.org/journals/siims.php
dc.relation.ispartof: SIAM Journal on Imaging Sciences
dc.subject: Primal-dual hybrid gradient algorithm
dc.subject: Saddle-point problem
dc.subject: Total variation
dc.subject: Convergence rate
dc.subject: Image restoration
dc.subject: Convex optimization
dc.title: On the convergence of primal-dual hybrid gradient algorithm
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1137/140963467
dc.identifier.scopus: eid_2-s2.0-84919665374
dc.identifier.volume: 7
dc.identifier.issue: 4
dc.identifier.spage: 2526
dc.identifier.epage: 2537
dc.identifier.isi: WOS:000346854900021
dc.identifier.issnl: 1936-4954
