Article: Convergence of Adaptive Stochastic Mirror Descent
| Title | Convergence of Adaptive Stochastic Mirror Descent |
|---|---|
| Authors | Hu, Ting; Liu, Xiaotong; Ji, Kai; Lei, Yunwen |
| Keywords | Adam; convergence analysis; mirror descent; nonconvex stochastic optimization |
| Issue Date | 18-Mar-2025 |
| Publisher | Institute of Electrical and Electronics Engineers |
| Citation | IEEE Transactions on Neural Networks and Learning Systems, 2025, p. 1-12 |
| Abstract | In this article, we present a family of adaptive stochastic optimization methods associated with mirror maps, which are widely used to capture the geometric properties of optimization problems during the iteration process. The well-known adaptive moment estimation (Adam)-type algorithm falls into this family when the mirror maps take the form of temporal adaptation. In the context of convex objective functions, we show that with proper step sizes and hyperparameters, the average regret achieves the convergence rate O(T^{-1/2}) after T iterations under some standard assumptions. We further improve this to O(T^{-1} log T) when the objective functions are strongly convex. In the context of smooth objective functions (not necessarily convex), based on properties of the strongly convex differentiable mirror map, our algorithms achieve convergence rates of order O(T^{-1/2}) up to a logarithmic term, requiring large or increasing hyperparameters, which is consistent with the practical usage of Adam-type algorithms. Thus, our work explains the selection of hyperparameters in implementations of Adam-type algorithms. |
| Persistent Identifier | http://hdl.handle.net/10722/357587 |
| ISSN | 2162-237X (2023 Impact Factor: 10.2; 2023 SCImago Journal Rankings: 4.170) |
| ISI Accession Number ID | WOS:001470663200001 |
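The abstract above describes a family of adaptive stochastic mirror descent methods in which Adam-type algorithms arise from time-varying mirror maps. As a rough illustration of that idea (not the paper's exact algorithm), the hypothetical sketch below runs stochastic mirror descent on a toy quadratic with a diagonal mirror map built from a running second-moment estimate, which yields an Adam-like preconditioned update; the objective, step-size schedule of order t^{-1/2}, and hyperparameters are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's exact method): stochastic mirror
# descent where the mirror map Phi_t(x) = 0.5 * x^T diag(H_t) x adapts
# over time via a running second-moment estimate of the gradient,
# giving an Adam-like preconditioned update.
rng = np.random.default_rng(0)
d, T = 5, 2000
A = rng.standard_normal((d, d))
Q = A.T @ A / d + np.eye(d)          # PSD matrix: f(x) = 0.5 * x^T Q x
x = np.ones(d)                       # start away from the minimizer at 0
v = np.zeros(d)                      # running estimate of E[g^2]
beta, eps = 0.99, 1e-8

for t in range(1, T + 1):
    g = Q @ x + 0.01 * rng.standard_normal(d)   # noisy gradient of f
    v = beta * v + (1 - beta) * g**2
    H = np.sqrt(v / (1 - beta**t)) + eps        # bias-corrected preconditioner
    eta = 0.5 / np.sqrt(t)                      # step size of order t^{-1/2}
    # Mirror step: minimizing <g, y> + (1/eta) * D_{Phi_t}(y, x) for the
    # quadratic mirror map above reduces to this preconditioned update.
    x = x - eta * g / H

print(0.5 * x @ Q @ x)               # objective value, near its minimum 0
```

With the identity mirror map (H fixed at 1) the update reduces to plain SGD; the time-varying diagonal map is what produces the Adam/AdaGrad-style per-coordinate scaling discussed in the abstract.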
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Hu, Ting | - |
| dc.contributor.author | Liu, Xiaotong | - |
| dc.contributor.author | Ji, Kai | - |
| dc.contributor.author | Lei, Yunwen | - |
| dc.date.accessioned | 2025-07-22T03:13:40Z | - |
| dc.date.available | 2025-07-22T03:13:40Z | - |
| dc.date.issued | 2025-03-18 | - |
| dc.identifier.citation | IEEE Transactions on Neural Networks and Learning Systems, 2025, p. 1-12 | - |
| dc.identifier.issn | 2162-237X | - |
| dc.identifier.uri | http://hdl.handle.net/10722/357587 | - |
| dc.description.abstract | In this article, we present a family of adaptive stochastic optimization methods associated with mirror maps, which are widely used to capture the geometric properties of optimization problems during the iteration process. The well-known adaptive moment estimation (Adam)-type algorithm falls into this family when the mirror maps take the form of temporal adaptation. In the context of convex objective functions, we show that with proper step sizes and hyperparameters, the average regret achieves the convergence rate O(T^{-1/2}) after T iterations under some standard assumptions. We further improve this to O(T^{-1} log T) when the objective functions are strongly convex. In the context of smooth objective functions (not necessarily convex), based on properties of the strongly convex differentiable mirror map, our algorithms achieve convergence rates of order O(T^{-1/2}) up to a logarithmic term, requiring large or increasing hyperparameters, which is consistent with the practical usage of Adam-type algorithms. Thus, our work explains the selection of hyperparameters in implementations of Adam-type algorithms. | - |
| dc.language | eng | - |
| dc.publisher | Institute of Electrical and Electronics Engineers | - |
| dc.relation.ispartof | IEEE Transactions on Neural Networks and Learning Systems | - |
| dc.subject | Adam | - |
| dc.subject | convergence analysis | - |
| dc.subject | mirror descent | - |
| dc.subject | nonconvex stochastic optimization | - |
| dc.title | Convergence of Adaptive Stochastic Mirror Descent | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.1109/TNNLS.2025.3545420 | - |
| dc.identifier.scopus | eid_2-s2.0-105000426521 | - |
| dc.identifier.spage | 1 | - |
| dc.identifier.epage | 12 | - |
| dc.identifier.eissn | 2162-2388 | - |
| dc.identifier.isi | WOS:001470663200001 | - |
| dc.identifier.issnl | 2162-237X | - |
