2013 Constrained Stochastic Gradient Descent for Large-scale Least Squares Problem


Subject Headings: Constrained Stochastic Gradient Descent (CSGD) Algorithm, Stochastic Gradient Descent, Large-scale Least Squares Problem.

Notes

Cited By

Quotes

Author Keywords

Abstract

The least squares problem is one of the most important regression problems in statistics, machine learning, and data mining. In this paper, we present the Constrained Stochastic Gradient Descent (CSGD) algorithm to solve the large-scale least squares problem. CSGD improves Stochastic Gradient Descent (SGD) by imposing a provable constraint: the linear regression line must pass through the mean point of all the data points. This yields the best regret bound, $O(\log T)$, and the fastest convergence speed among all first-order approaches. Empirical studies justify the effectiveness of CSGD by comparing it with SGD and other state-of-the-art approaches. An example is also given to show how to use CSGD to optimize SGD-based least squares problems to achieve better performance.
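The sketch below is not the paper's exact CSGD algorithm; it only illustrates the constraint named in the abstract. Since the optimal least-squares hyperplane passes through the mean point of the data, one way to enforce that constraint is to center the features and targets, run plain SGD for the slope vector on the centered data, and recover the intercept in closed form. The function name `csgd_least_squares`, the 1/t step-size schedule, and the epoch count are illustrative assumptions, not settings taken from the paper.

```python
import numpy as np

def csgd_least_squares(X, y, n_epochs=5, lr0=1.0):
    """Sketch of SGD for least squares under the mean-point constraint.

    The fitted hyperplane is forced to pass through (x_bar, y_bar) by
    centering the data and learning only the slope vector w; the
    intercept b is then recovered in closed form. Hyperparameters are
    illustrative assumptions, not the paper's settings.
    """
    x_bar, y_bar = X.mean(axis=0), y.mean()
    Xc, yc = X - x_bar, y - y_bar            # centered data
    n, d = Xc.shape
    w = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in np.random.permutation(n):
            t += 1
            g = (Xc[i] @ w - yc[i]) * Xc[i]  # gradient of 0.5 * (x_i.w - y_i)^2
            w -= (lr0 / t) * g               # 1/t step size, common for SGD on strongly convex losses
    b = y_bar - x_bar @ w                    # intercept: hyperplane passes through the mean point
    return w, b

# Usage on synthetic data: recover slope [2, -1, 0.5] and intercept 3.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 3.0 + 0.01 * rng.normal(size=1000)
w, b = csgd_least_squares(X, y)
print(w, b)
```

Centering removes the intercept from the stochastic updates entirely, so the constraint holds by construction at every iteration rather than being enforced by a projection step.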

References


Author: Yang Mu, Wei Ding, Tianyi Zhou, Dacheng Tao
Title: Constrained Stochastic Gradient Descent for Large-scale Least Squares Problem
Venue: Proceedings of the 19th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2013)
DOI: 10.1145/2487575.2487635
Year: 2013