2018 SubsampledStochasticVarianceRed

From GM-RKB

Subject Headings: Stochastic Variance Reduced Gradient Langevin Dynamics Algorithm.

Notes

Cited By

Quotes

Abstract

Stochastic variance-reduced gradient Langevin dynamics (SVRG-LD) was recently proposed to improve the performance of stochastic gradient Langevin dynamics (SGLD) by reducing the variance of the stochastic gradient. In this paper, we propose a variant of SVRG-LD, namely SVRG-LD+, which replaces the full gradient in each epoch with a subsampled one. We provide a non-asymptotic analysis of the convergence of SVRG-LD+ in 2-Wasserstein distance, and show that SVRG-LD+ enjoys a lower gradient complexity than SVRG-LD when the sample size is large or the target accuracy requirement is moderate. Our analysis directly implies a sharper convergence rate for SVRG-LD, which improves the existing convergence rate by a factor of κ^(1/6) n^(1/6), where κ is the condition number of the log-density function and n is the sample size. Experiments on both synthetic and real-world datasets validate our theoretical results.
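The abstract's key algorithmic change — replacing the per-epoch full gradient of SVRG-LD with a subsampled estimate, while keeping the variance-reduced inner update plus Langevin noise — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function and parameter names (`grad_i`, `subsample_size`, `batch_size`, `m`) are assumptions for exposition.

```python
import numpy as np

def svrg_ld_plus(grad_i, x0, n, eta, n_epochs, m, batch_size, subsample_size, seed=0):
    """Hedged sketch of an SVRG-LD+-style sampler.

    grad_i(x, i): gradient of the i-th component of the negative log-density.
    n: number of component functions (sample size).
    eta: step size; m: inner-loop length per epoch.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    x_tilde = x.copy()  # snapshot (anchor) point for variance reduction
    for _ in range(n_epochs):
        # SVRG-LD would compute the full gradient over all n components here;
        # SVRG-LD+ instead averages over a subsample S of size < n.
        S = rng.choice(n, size=subsample_size, replace=False)
        g_tilde = np.mean([grad_i(x_tilde, i) for i in S], axis=0)
        for _ in range(m):
            I = rng.choice(n, size=batch_size, replace=False)
            # Semi-stochastic (variance-reduced) gradient estimate.
            v = g_tilde + np.mean(
                [grad_i(x, i) - grad_i(x_tilde, i) for i in I], axis=0
            )
            # Langevin update: gradient step plus injected Gaussian noise.
            noise = np.sqrt(2.0 * eta) * rng.standard_normal(x.shape)
            x = x - eta * v + noise
        x_tilde = x.copy()
    return x
```

For a sum of quadratic components, the target distribution is Gaussian, so the iterates should remain finite and concentrate near the mean of the component minimizers; this gives a quick sanity check of the update rule.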

References

Difan Zou, Pan Xu, Quanquan Gu. (2018). "Subsampled Stochastic Variance-Reduced Gradient Langevin Dynamics."