Article: Variance Reduced Random Relaxed Projection Method for Constrained Finite-Sum Minimization Problems

Title: Variance Reduced Random Relaxed Projection Method for Constrained Finite-Sum Minimization Problems
Authors: Yang, Zhichun; Xia, Fu Quan; Tu, Kai; Yue, Man Chung
Keywords: Constrained optimization; convergence rate analysis; finite-sum minimization; random projection method; relaxed projection; variance reduction
Issue Date: 9-Apr-2024
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Signal Processing, 2024, v. 72, p. 2188-2203
Abstract

For many applications in signal processing and machine learning, we are tasked with minimizing a large sum of convex functions subject to a large number of convex constraints. In this paper, we devise a new random projection method (RPM) to efficiently solve this problem. Compared with existing RPMs, our proposed algorithm features two useful algorithmic ideas. First, at each iteration, instead of projecting onto the set defined by one of the constraints, our algorithm only requires projecting onto a half-space approximation of the set, which significantly reduces the computational cost as it admits a closed-form formula. Second, to exploit the finite-sum structure of the objective, variance reduction is incorporated into our algorithm to further improve the performance. As theoretical contributions, under a novel error bound condition and other standard assumptions, we prove that the proposed RPM converges to an optimal solution and that both optimality and feasibility gaps vanish at a sublinear rate. In particular, via a new analysis framework, we show that our RPM attains a faster convergence rate in optimality gap than existing RPMs when the objective function has a Lipschitz continuous gradient, capitalizing on the benefit of the variance reduction. We also provide sufficient conditions for the error bound condition to hold. Experiments on a beamforming problem and a robust classification problem are presented to demonstrate the superiority of our RPM over existing ones.
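The two algorithmic ideas in the abstract — a variance-reduced stochastic gradient step, followed by a closed-form projection onto a half-space linearization of a randomly sampled constraint — can be sketched as below. This is an illustrative reconstruction, not the authors' exact algorithm: the toy least-squares objective, the linear constraints, the step size, and the SVRG-style variance-reduction scheme are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum objective: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
n, d = 50, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

# Toy constraints g_j(x) = c_j^T x - r_j <= 0 (linear for simplicity;
# any convex g_j with a computable (sub)gradient would do)
m = 20
C = rng.normal(size=(m, d))
r = np.ones(m)

def grad_i(x, i):
    """Gradient of a single component f_i."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full gradient, recomputed once per snapshot (SVRG-style)."""
    return A.T @ (A @ x - b) / n

def relaxed_projection(y, j):
    """Closed-form projection of y onto the half-space linearization
    {x : g_j(y) + s^T (x - y) <= 0} of constraint j at y."""
    gval = C[j] @ y - r[j]
    if gval <= 0:            # y already satisfies this constraint
        return y
    s = C[j]                 # gradient of g_j (constant since g_j is linear)
    return y - (gval / (s @ s)) * s

def vr_relaxed_rpm(x0, eta=0.05, epochs=30, inner=n):
    x = x0.copy()
    for _ in range(epochs):
        snap, g_snap = x.copy(), full_grad(x)    # variance-reduction snapshot
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced gradient estimate
            v = grad_i(x, i) - grad_i(snap, i) + g_snap
            # Gradient step, then relaxed projection onto one random constraint
            j = rng.integers(m)
            x = relaxed_projection(x - eta * v, j)
    return x

x = vr_relaxed_rpm(np.zeros(d))
```

The half-space projection is the key cost saver: projecting onto a general convex set requires an inner optimization, whereas projecting onto its supporting half-space at the current point is a single closed-form update.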


Persistent Identifier: http://hdl.handle.net/10722/346082
ISSN: 1053-587X
2023 Impact Factor: 4.6
2023 SCImago Journal Rankings: 2.520

DC Field: Value
dc.contributor.author: Yang, Zhichun
dc.contributor.author: Xia, Fu Quan
dc.contributor.author: Tu, Kai
dc.contributor.author: Yue, Man Chung
dc.date.accessioned: 2024-09-07T00:30:30Z
dc.date.available: 2024-09-07T00:30:30Z
dc.date.issued: 2024-04-09
dc.identifier.citation: IEEE Transactions on Signal Processing, 2024, v. 72, p. 2188-2203
dc.identifier.issn: 1053-587X
dc.identifier.uri: http://hdl.handle.net/10722/346082
dc.description.abstract: For many applications in signal processing and machine learning, we are tasked with minimizing a large sum of convex functions subject to a large number of convex constraints. In this paper, we devise a new random projection method (RPM) to efficiently solve this problem. Compared with existing RPMs, our proposed algorithm features two useful algorithmic ideas. First, at each iteration, instead of projecting onto the set defined by one of the constraints, our algorithm only requires projecting onto a half-space approximation of the set, which significantly reduces the computational cost as it admits a closed-form formula. Second, to exploit the finite-sum structure of the objective, variance reduction is incorporated into our algorithm to further improve the performance. As theoretical contributions, under a novel error bound condition and other standard assumptions, we prove that the proposed RPM converges to an optimal solution and that both optimality and feasibility gaps vanish at a sublinear rate. In particular, via a new analysis framework, we show that our RPM attains a faster convergence rate in optimality gap than existing RPMs when the objective function has a Lipschitz continuous gradient, capitalizing on the benefit of the variance reduction. We also provide sufficient conditions for the error bound condition to hold. Experiments on a beamforming problem and a robust classification problem are presented to demonstrate the superiority of our RPM over existing ones.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Signal Processing
dc.subject: Constrained optimization
dc.subject: convergence rate analysis
dc.subject: finite-sum minimization
dc.subject: random projection method
dc.subject: relaxed projection
dc.subject: variance reduction
dc.title: Variance Reduced Random Relaxed Projection Method for Constrained Finite-Sum Minimization Problems
dc.type: Article
dc.identifier.doi: 10.1109/TSP.2024.3386388
dc.identifier.scopus: eid_2-s2.0-85190168338
dc.identifier.volume: 72
dc.identifier.spage: 2188
dc.identifier.epage: 2203
dc.identifier.eissn: 1941-0476
dc.identifier.issnl: 1053-587X
