2017 FastStochasticVarianceReducedGr


Subject Headings: Fast Stochastic Variance Reduced Gradient Algorithm.

Notes

Cited By

Quotes

Author Keywords

Abstract

Recently, research on accelerated stochastic gradient descent methods (e.g., SVRG) has made exciting progress (e.g., linear convergence for strongly convex problems). However, the best-known methods (e.g., Katyusha) require at least two auxiliary variables and two momentum parameters. In this paper, we propose a fast stochastic variance reduced gradient (FSVRG) method, in which we design a novel update rule with Nesterov's momentum and incorporate the technique of growing epoch size. FSVRG has only one auxiliary variable and one momentum weight, and thus it is much simpler and has much lower per-iteration complexity. We prove that FSVRG achieves linear convergence for strongly convex problems and the optimal O(1/T^2) convergence rate for non-strongly convex problems, where T is the number of outer iterations. We also extend FSVRG to directly solve problems with non-smooth component functions, such as SVM. Finally, we empirically study the performance of FSVRG for solving various machine learning problems such as logistic regression, ridge regression, Lasso, and SVM. Our results show that FSVRG outperforms the state-of-the-art stochastic methods, including Katyusha.
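The abstract describes the algorithmic ingredients only at a high level: variance-reduced stochastic gradients, a single auxiliary variable, a single momentum weight, and a growing epoch size. The following Python sketch illustrates how such a loop could be organized; the function name fsvrg_sketch, the momentum schedule 2/(s+2), the step size, and the coupling between the auxiliary variable and the snapshot are all illustrative assumptions and should not be read as the paper's exact FSVRG update rule.

```python
import numpy as np

def fsvrg_sketch(grad_fn, full_grad_fn, x0, n, T=20, m0=50, eta=0.01, rho=1.5):
    """SVRG-style loop with one auxiliary (momentum) variable and a growing epoch size.

    grad_fn(x, i)   -- stochastic gradient of the i-th component function at x
    full_grad_fn(x) -- full gradient at x
    All names and constants are illustrative assumptions, not the paper's exact method.
    """
    x_snap = np.asarray(x0, dtype=float).copy()  # snapshot point for variance reduction
    y = x_snap.copy()                            # single auxiliary (momentum) variable
    m = m0                                       # inner-epoch length, grown each outer pass
    rng = np.random.default_rng(0)
    for s in range(T):
        mu = full_grad_fn(x_snap)                # full gradient at the snapshot
        theta = 2.0 / (s + 2)                    # single momentum weight (Nesterov-style schedule)
        x = x_snap.copy()
        for _ in range(int(m)):
            i = rng.integers(n)
            # variance-reduced stochastic gradient (standard SVRG correction)
            v = grad_fn(x, i) - grad_fn(x_snap, i) + mu
            y = y - (eta / theta) * v            # update the auxiliary variable
            x = x_snap + theta * (y - x_snap)    # momentum-coupled iterate
        x_snap = x                               # take the last iterate as the new snapshot
        m = min(m * rho, n)                      # growing epoch size
    return x_snap


# Toy usage on a least-squares problem (assumed setup, for illustration only).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 5))
    b = rng.standard_normal(200)
    grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
    grad_full = lambda x: A.T @ (A @ x - b) / len(b)
    x_hat = fsvrg_sketch(grad_i, grad_full, np.zeros(5), n=len(b))
    print("objective:", 0.5 * np.mean((A @ x_hat - b) ** 2))
```

In this sketch each inner step moves the iterate by an effective step of eta along the variance-reduced direction, while the auxiliary variable y carries the momentum across epochs; the epoch length grows geometrically up to the dataset size, mirroring the "growing epoch size" technique mentioned in the abstract.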

References


Author: James Cheng, Fanhua Shang, Yuanyuan Liu, Jiacheng Zhuo
Title: Fast Stochastic Variance Reduced Gradient Method with Momentum Acceleration for Machine Learning
Year: 2017