2015 Reducing the Unlabeled Sample Complexity of Semi-Supervised Multi-View Learning

From GM-RKB

Subject Headings:

Notes

Cited By

Quotes

Author Keywords

Abstract

In semi-supervised multi-view learning, unlabeled sample complexity (u.s.c.) specifies the size of the unlabeled training sample that guarantees a desired learning error. In this paper, we improve the state-of-the-art u.s.c. from O(1/ε) to O(log 1/ε) for small error ε, under mild conditions. To obtain the improved result, as a primary step we prove a connection between the generalization error of a classifier and its incompatibility, which measures the fitness between the classifier and the sample distribution. We then prove that, with a sufficiently large unlabeled sample, one is able to find classifiers with low incompatibility. Combining the two observations, we prove a probably approximately correct (PAC) style learning bound for semi-supervised multi-view learning. We empirically verify our theory by designing two proof-of-concept multi-view learning algorithms, one based on active view sensing and the other based on online co-regularization, evaluated on real-world data sets.
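The abstract's second algorithm relies on online co-regularization: two view-specific classifiers are trained on labeled examples while an incompatibility penalty pushes their predictions to agree on unlabeled examples. The helper below is a hypothetical minimal sketch of that generic idea for two linear views with a squared-disagreement penalty; it is not the authors' exact algorithm, and the function name, learning rate, and penalty weight are illustrative assumptions.

```python
import numpy as np

def online_coreg_step(w1, w2, x1, x2, y=None, lr=0.1, lam=1.0):
    """One online co-regularization update for two linear view classifiers.

    w1, w2 : weight vectors for the view-1 / view-2 classifiers.
    x1, x2 : the two views of a single incoming example.
    y      : label in {-1, +1}, or None for an unlabeled example.

    Incompatibility is measured here as the squared disagreement
    (w1.x1 - w2.x2)^2; unlabeled examples contribute only this term.
    (Hypothetical sketch, not the paper's exact update rule.)
    """
    s1, s2 = w1 @ x1, w2 @ x2
    d = s1 - s2                       # disagreement between the two views
    g1 = lam * d * x1                 # gradient of the incompatibility term
    g2 = -lam * d * x2
    if y is not None:                 # labeled example: add squared-loss gradients
        g1 += (s1 - y) * x1
        g2 += (s2 - y) * x2
    return w1 - lr * g1, w2 - lr * g2
```

Each unlabeled example shrinks the cross-view disagreement, which is the mechanism the paper's theory connects to generalization error.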

References


Chao Lan, and Jun Huan (2015). "Reducing the Unlabeled Sample Complexity of Semi-Supervised Multi-View Learning." doi:10.1145/2783258.2783409