2013 On the Equivalent of Low-rank Linear Regressions and Linear Discriminant Analysis based Regressions


Subject Headings:

Notes

Cited By

Quotes

Author Keywords

Abstract

The low-rank regression model has been studied and applied to capture the underlying class / task correlation patterns, such that the regression / classification results can be enhanced. In this paper, we will prove that the low-rank regression model is equivalent to doing linear regression in the linear discriminant analysis (LDA) subspace. Our new theory reveals the learning mechanism of low-rank regression, and shows that the low-rank structures extracted from classes / tasks are connected to the LDA projection results. Thus, low-rank regression works efficiently for high-dimensional data.
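The claimed equivalence can be checked numerically. Below is a minimal Python sketch (not the authors' code) under two assumptions made here for illustration: the class-indicator target matrix is normalized by 1/sqrt(n_c) per class, and both features and targets are mean-centered before regression. It fits a rank-s low-rank (reduced-rank) regression and compares the row space of its low-rank factor with the top-s LDA directions via principal angles.

# Sketch: rank-s low-rank regression on a class-indicator target vs. the LDA subspace.
# Assumptions (not from the paper's text): 1/sqrt(n_c)-normalized indicator targets,
# mean-centered X and Y, and s = k - 1 (the usual LDA subspace dimension).
import numpy as np
from scipy.linalg import eigh, subspace_angles

rng = np.random.default_rng(0)

# Toy data: k Gaussian classes, d features, n_per samples per class.
n_per, d, k = 100, 10, 3
means = rng.normal(scale=3.0, size=(k, d))
X = np.vstack([rng.normal(loc=m, size=(n_per, d)) for m in means]).T   # d x n
labels = np.repeat(np.arange(k), n_per)
n = X.shape[1]

# Normalized class indicator: Y[c, i] = 1/sqrt(n_c) if sample i belongs to class c.
Y = np.zeros((k, n))
for c in range(k):
    idx = labels == c
    Y[c, idx] = 1.0 / np.sqrt(idx.sum())

# Center features and targets over samples (equivalent to fitting an intercept).
Xc = X - X.mean(axis=1, keepdims=True)
Yc = Y - Y.mean(axis=1, keepdims=True)

St = Xc @ Xc.T                    # total scatter (d x d)
Sb = Xc @ Yc.T @ Yc @ Xc.T        # between-class scatter under this indicator encoding

# Full-rank least squares, then the rank-s solution W_s = Vs @ Vs.T @ W_ols,
# where Vs holds the top-s eigenvectors of (Y_hat @ Y_hat.T).
s = k - 1
W_ols = Yc @ Xc.T @ np.linalg.inv(St)
M = W_ols @ Xc @ Yc.T             # equals Y_hat @ Y_hat.T (k x k)
evals, V = np.linalg.eigh((M + M.T) / 2)
Vs = V[:, -s:]                    # top-s eigenvectors
B = Vs.T @ W_ols                  # s x d low-rank "projection" factor of W_s = Vs @ B

# LDA directions: top-s generalized eigenvectors of (Sb, St).
_, G = eigh(Sb, St)
G_lda = G[:, -s:]

# The row space of B should coincide with the LDA subspace (principal angles ~ 0).
angles = subspace_angles(B.T, G_lda)
print("largest principal angle (radians):", angles.max())

On synthetic data like this, the reported largest principal angle should be near zero (up to floating-point error), illustrating that the low-rank factor spans the LDA projection directions.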

Moreover, we will propose new discriminant low-rank ridge regression and sparse low-rank regression methods. Both of them are equivalent to doing regularized regression in the regularized LDA subspace. These new regularized objectives provide better data mining results than existing low-rank regression in both theoretical and empirical validations. We evaluate our discriminant low-rank regression methods on six benchmark datasets. In all empirical results, our discriminant low-rank models consistently show better results than the corresponding full-rank methods.
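For the ridge-regularized variant, a short continuation of the sketch above (same Python session, same assumptions, and again not the authors' code) illustrates the analogous correspondence: if the regularization is taken to enter through the total scatter as St + lam*I, the rank-s ridge regression factor should span the regularized LDA subspace.

# Continuation of the previous sketch: ridge-regularized low-rank regression.
# Assumption for illustration: the ridge term replaces St by St + lam*I in both
# the regression solution and the LDA eigenproblem; lam is an arbitrary value.
lam = 1.0
St_reg = St + lam * np.eye(d)

W_ridge = Yc @ Xc.T @ np.linalg.inv(St_reg)    # full-rank ridge solution
M_reg = W_ridge @ Xc @ Yc.T
_, V = np.linalg.eigh((M_reg + M_reg.T) / 2)
B_ridge = V[:, -s:].T @ W_ridge                # rank-s ridge factor

_, G = eigh(Sb, St_reg)                        # regularized LDA directions
print("largest principal angle (radians):",
      subspace_angles(B_ridge.T, G[:, -s:]).max())

Here lam is only an illustrative value; the paper's regularization parameters and their selection are not reproduced.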

References

Author(s): Chris Ding, Feiping Nie, Heng Huang, Xiao Cai
Title: On the Equivalent of Low-rank Linear Regressions and Linear Discriminant Analysis based Regressions
Year: 2013
DOI: 10.1145/2487575.2487701