2000 RegularizationNetworksAndSVMs


Subject Headings: Regularization, Radial Basis Functions, Support Vector Machines, Reproducing Kernel Hilbert Space, Structural Risk Minimization.

Notes

Cited By

Quotes

Author Keywords

Regularization - Radial Basis Functions - Support Vector Machines - Reproducing Kernel Hilbert Space - Structural Risk Minimization

Abstract

Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples, in particular the regression problem of approximating a multivariate function from sparse data. Radial Basis Functions, for example, are a special case of both regularization and Support Vector Machines. We review both formulations in the context of Vapnik's theory of statistical learning, which provides a general foundation for the learning problem, combining functional analysis and statistics. The emphasis is on regression: classification is treated as a special case.
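As a concrete illustration of the regression setting described above, here is a minimal sketch (not from the paper) of a regularization network with a Gaussian radial basis function kernel; the kernel width sigma, the regularization weight lam, and the synthetic target are illustrative choices, not values from the source.

    import numpy as np

    def gaussian_kernel(A, B, sigma=0.5):
        # Pairwise Gaussian RBF kernel: K(x, x') = exp(-||x - x'||^2 / (2 sigma^2)).
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-sq / (2.0 * sigma ** 2))

    def fit_regularization_network(X, y, lam=1e-2, sigma=0.5):
        # Regularization-network solution f(x) = sum_i c_i K(x, x_i), with the
        # coefficients c solving the linear system (K + lam * l * I) c = y.
        l = len(X)
        K = gaussian_kernel(X, X, sigma)
        return np.linalg.solve(K + lam * l * np.eye(l), y)

    # Sparse data from a noisy multivariate target (illustrative).
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(30, 2))
    y = np.sin(3.0 * X[:, 0]) * X[:, 1] + 0.05 * rng.standard_normal(30)

    c = fit_regularization_network(X, y)
    X_new = rng.uniform(-1.0, 1.0, size=(5, 2))
    y_hat = gaussian_kernel(X_new, X) @ c  # predictions at new points

Larger lam shrinks the coefficients toward a smoother function; as lam tends to zero the network interpolates the training data.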

1. Introduction

... Vapnik’s theory characterizes and formalizes these concepts in terms of the capacity of a set of functions and capacity control depending on the training data: for instance, for a small training set the capacity of the function space in which [math]f[/math] is sought has to be small, whereas it can increase with a larger training set.
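One hedged way to see this capacity-control effect numerically (an illustration of mine, not the paper's): using polynomial degree as a crude proxy for the capacity of the function space, validation error typically selects a lower degree when the training set is small and tolerates a higher one as [math]l[/math] grows.

    import numpy as np

    rng = np.random.default_rng(1)

    def target(x):
        return np.sin(2.0 * np.pi * x)

    def selected_degree(l, degrees=range(1, 7), trials=100):
        # Average held-out error of polynomial fits of each degree on l
        # training points; return the degree minimizing that error.
        degrees = list(degrees)
        errs = np.zeros(len(degrees))
        for _ in range(trials):
            x = rng.uniform(0.0, 1.0, l)
            y = target(x) + 0.1 * rng.standard_normal(l)
            x_val = rng.uniform(0.0, 1.0, 200)
            for j, d in enumerate(degrees):
                coef = np.polyfit(x, y, d)
                errs[j] += np.mean((np.polyval(coef, x_val) - target(x_val)) ** 2)
        return degrees[int(np.argmin(errs))]

    for l in (10, 30, 120):
        print(l, selected_degree(l))  # the chosen degree tends to grow with l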

2. Overview of statistical learning theory

... We are provided with examples of this probabilistic relationship, that is, with a data set [math]D_l \equiv \{(x_i, y_i) \in X \times Y\}_{i=1}^{l}[/math] called the training data, obtained by sampling [math]l[/math] times the set [math]X \times Y[/math] according to [math]P(x, y)[/math].
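From this definition, the section goes on to formalize learning in the standard statistical-learning form: the goal is to (approximately) minimize the expected risk [math]I[f] = \int_{X \times Y} V(y, f(x)) \, P(x, y) \, dx \, dy[/math], where [math]V[/math] is a loss function, even though only its empirical counterpart [math]I_{emp}[f; l] = \frac{1}{l} \sum_{i=1}^{l} V(y_i, f(x_i))[/math], computed on [math]D_l[/math], is observable.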


 Author: Theodoros Evgeniou, Massimiliano Pontil, Tomaso Poggio
 Date Value: 2000
 Title: Regularization Networks and Support Vector Machines
 Journal: Advances in Computational Mathematics
 Title URL: http://cbcl.mit.edu/publications/ps/evgeniou-reviewall.pdf
 Year: 2000