2010 Training and Testing Low-degree Polynomial Data Mappings via Linear SVM

From GM-RKB

Subject Headings: Low-Degree Polynomial Conjunctions.

Notes

Cited By

Quotes

Abstract

Kernel techniques have long been used in SVM to handle linearly inseparable problems by transforming data to a high-dimensional space, but training and testing on large data sets is often time consuming. In contrast, we can efficiently train and test much larger data sets using linear SVM without kernels. In this work, we apply fast linear-SVM methods to the explicit form of polynomially mapped data and investigate implementation issues. The approach enjoys fast training and testing, but may sometimes achieve accuracy close to that of using highly nonlinear kernels. Empirical experiments show that the proposed method is useful for certain large-scale data sets. We successfully apply the proposed method to a natural language processing (NLP) application by improving the testing accuracy under some training/testing speed requirements.
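The key idea can be illustrated with a small sketch (not the paper's implementation): for degree 2, the polynomial kernel (1 + x·y)² equals the inner product of explicitly mapped vectors φ(x)·φ(y), so a linear SVM trained on the φ-mapped data behaves like a kernel SVM while keeping linear-SVM training speed. The function name `poly2_map` below is a hypothetical helper for illustration.

```python
import numpy as np

def poly2_map(x):
    """Explicit degree-2 polynomial mapping phi(x) such that
    phi(x) . phi(y) == (1 + x . y)**2, the degree-2 polynomial
    kernel with gamma = 1 and coef0 = 1."""
    n = len(x)
    feats = [1.0]                              # constant term
    feats += [np.sqrt(2.0) * xi for xi in x]   # linear terms, weighted sqrt(2)
    feats += [xi * xi for xi in x]             # squared terms
    # cross terms x_i * x_j (i < j), weighted sqrt(2)
    for i in range(n):
        for j in range(i + 1, n):
            feats.append(np.sqrt(2.0) * x[i] * x[j])
    return np.array(feats)

x = np.array([1.0, 2.0, 3.0])
y = np.array([0.5, -1.0, 2.0])
lhs = poly2_map(x) @ poly2_map(y)   # inner product in the mapped space
rhs = (1.0 + x @ y) ** 2            # degree-2 polynomial kernel value
print(np.isclose(lhs, rhs))
```

The mapped space has O(n²) dimensions for degree 2, so when the original feature dimension is moderate, training a linear SVM directly on φ(x) (e.g., with a fast linear solver) avoids kernel evaluations entirely, which is the efficiency argument the abstract makes.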

References

Yin-Wen Chang, Cho-Jui Hsieh, Kai-Wei Chang, Michael Ringgaard, and Chih-Jen Lin (2010). "Training and Testing Low-degree Polynomial Data Mappings via Linear SVM."