Conference Paper: Subsampled stochastic variance-reduced gradient Langevin dynamics
Title | Subsampled stochastic variance-reduced gradient Langevin dynamics |
---|---|
Authors | Zou, Difan; Xu, Pan; Gu, Quanquan |
Issue Date | 2018 |
Citation | 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018, 2018, v. 1, p. 508-518 |
Abstract | Stochastic variance-reduced gradient Langevin dynamics (SVRG-LD) was recently proposed to improve the performance of stochastic gradient Langevin dynamics (SGLD) by reducing the variance of the stochastic gradient. In this paper, we propose a variant of SVRG-LD, namely SVRG-LD+, which replaces the full gradient in each epoch with a subsampled one. We provide a nonasymptotic analysis of the convergence of SVRG-LD+ in 2-Wasserstein distance, and show that SVRG-LD+ enjoys a lower gradient complexity than SVRG-LD when the sample size is large or the target accuracy requirement is moderate. Our analysis directly implies a sharper convergence rate for SVRG-LD, which improves the existing convergence rate by a factor of κ^{1/6} n^{1/6}, where κ is the condition number of the log-density function and n is the sample size. Experiments on both synthetic and real-world datasets validate our theoretical results. |
Persistent Identifier | http://hdl.handle.net/10722/316608 |
ISI Accession Number ID | WOS:000493119200050 |
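The abstract describes one change to SVRG-LD: the full reference gradient computed at the start of each epoch is replaced by a gradient over a subsample. A minimal sketch of that idea is below; it is not the authors' implementation, and `grad_fi`, the parameter names, and all step-size/temperature values are illustrative assumptions.

```python
import numpy as np

def svrg_ld_plus(grad_fi, n, w0, eta, beta, epochs, epoch_len, B, b, seed=None):
    """Sketch of SVRG-LD+ (hypothetical interface).

    grad_fi(w, idx) should return the average gradient of the component
    functions f_i, i in idx, evaluated at w.  SVRG-LD+ differs from
    SVRG-LD only in the reference gradient: instead of the full gradient
    over all n components, it uses a subsample of size B < n.
    """
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(epochs):
        w_ref = w.copy()
        # SVRG-LD would average over all n indices here; SVRG-LD+
        # subsamples B of them, which is the "subsampled" full gradient.
        ref_idx = rng.choice(n, size=B, replace=False)
        g_ref = grad_fi(w_ref, ref_idx)
        for _ in range(epoch_len):
            idx = rng.choice(n, size=b, replace=False)
            # Variance-reduced semi-stochastic gradient estimate.
            g = grad_fi(w, idx) - grad_fi(w_ref, idx) + g_ref
            # Langevin update: gradient step plus Gaussian noise scaled
            # by the inverse temperature beta.
            noise = rng.standard_normal(w.shape)
            w = w - eta * g + np.sqrt(2.0 * eta / beta) * noise
    return w
```

With `B = n` the reference gradient is exact and the loop reduces to plain SVRG-LD; the paper's analysis concerns how small B can be while retaining convergence guarantees in 2-Wasserstein distance.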
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zou, Difan | - |
dc.contributor.author | Xu, Pan | - |
dc.contributor.author | Gu, Quanquan | - |
dc.date.accessioned | 2022-09-14T11:40:52Z | - |
dc.date.available | 2022-09-14T11:40:52Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018, 2018, v. 1, p. 508-518 | - |
dc.identifier.uri | http://hdl.handle.net/10722/316608 | - |
dc.description.abstract | Stochastic variance-reduced gradient Langevin dynamics (SVRG-LD) was recently proposed to improve the performance of stochastic gradient Langevin dynamics (SGLD) by reducing the variance of the stochastic gradient. In this paper, we propose a variant of SVRG-LD, namely SVRG-LD+, which replaces the full gradient in each epoch with a subsampled one. We provide a nonasymptotic analysis of the convergence of SVRG-LD+ in 2-Wasserstein distance, and show that SVRG-LD+ enjoys a lower gradient complexity than SVRG-LD when the sample size is large or the target accuracy requirement is moderate. Our analysis directly implies a sharper convergence rate for SVRG-LD, which improves the existing convergence rate by a factor of κ^{1/6} n^{1/6}, where κ is the condition number of the log-density function and n is the sample size. Experiments on both synthetic and real-world datasets validate our theoretical results. | - |
dc.language | eng | - |
dc.relation.ispartof | 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018 | - |
dc.title | Subsampled stochastic variance-reduced gradient Langevin dynamics | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85059342905 | - |
dc.identifier.volume | 1 | - |
dc.identifier.spage | 508 | - |
dc.identifier.epage | 518 | - |
dc.identifier.isi | WOS:000493119200050 | - |