2006 Principled Hybrids of Generative and Discriminative Models


Subject Headings: Generative/Discriminative Hybrid Algorithm.



When labelled training data is plentiful, discriminative techniques are widely used since they give excellent generalization performance. However, for large-scale applications such as object recognition, hand labelling of data is expensive, and there is much interest in semi-supervised techniques based on generative models in which the majority of the training data is unlabelled. Although the generalization performance of generative models can often be improved by 'training them discriminatively', they can then no longer make use of unlabelled data. In an attempt to gain the benefit of both generative and discriminative approaches, heuristic procedures have been proposed [2,3] which interpolate between these two extremes by taking a convex combination of the generative and discriminative objective functions. In this paper we adopt a new perspective which says that there is only one correct way to train a given model, and that a 'discriminatively trained' generative model is fundamentally a new model [7]. From this viewpoint, generative and discriminative models correspond to specific choices for the prior over parameters. As well as giving a principled interpretation of 'discriminative training', this approach opens the door to very general ways of interpolating between generative and discriminative extremes through alternative choices of prior. We illustrate this framework using both synthetic data and a practical example in the domain of multi-class object recognition. Our results show that, when the supply of labelled training data is limited, the optimum performance corresponds to a balance between the purely generative and the purely discriminative.
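The convex-combination baseline mentioned in the abstract can be made concrete. The following is a minimal sketch (not the paper's method, and the model is an assumed toy example): a two-class, one-dimensional Gaussian class-conditional model with shared unit variance, whose blended objective is `alpha * log p(x, y) + (1 - alpha) * log p(y | x)`, so that `alpha = 1` recovers purely generative training and `alpha = 0` purely discriminative training. The function names and parameterization are illustrative assumptions, not from the paper.

```python
import numpy as np

def log_gauss(x, mu):
    """Log-density of N(mu, 1) at x (unit variance assumed for simplicity)."""
    return -0.5 * (x - mu) ** 2 - 0.5 * np.log(2 * np.pi)

def blended_objective(params, x, y, alpha):
    """Convex combination of generative and discriminative log-likelihoods.

    params: (mu0, mu1, logit_pi) -- class means and the logit of P(y=1).
    alpha:  1.0 -> purely generative, 0.0 -> purely discriminative.
    """
    mu0, mu1, logit_pi = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))           # P(y = 1)
    lj0 = np.log(1.0 - pi) + log_gauss(x, mu0)     # log p(x, y=0)
    lj1 = np.log(pi) + log_gauss(x, mu1)           # log p(x, y=1)
    joint = np.where(y == 1, lj1, lj0)             # log p(x, y) per point
    evidence = np.logaddexp(lj0, lj1)              # log p(x)
    cond = joint - evidence                        # log p(y | x) per point
    return alpha * joint.sum() + (1.0 - alpha) * cond.sum()
```

Because the objective is linear in `alpha`, its value at any intermediate `alpha` is exactly the corresponding weighted average of the two extreme objectives; maximizing it (e.g. with a generic gradient-based optimizer) interpolates between the two training regimes.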



Julia A. Lasserre, Christopher M. Bishop, and Thomas P. Minka (2006). "Principled Hybrids of Generative and Discriminative Models." In: Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2006). doi:10.1109/CVPR.2006.227