Academic Journals Database
Disseminating quality controlled scientific knowledge

Primal and dual model representations in kernel-based learning

Author(s): Johan A.K. Suykens | Carlos Alzate | Kristiaan Pelckmans

Journal: Statistics Surveys
ISSN 1935-7516

Volume: 4
Start page: 148
Date: 2010

Keywords: Kernel methods | Support vector machines | Constrained optimization | Primal and dual problem | Feature map | Regression | Classification | Principal component analysis | Spectral clustering | Canonical correlation analysis | Independence | Dimensionality reduction and data visualization | Sparseness | Robustness

ABSTRACT
This paper discusses the role of primal and (Lagrange) dual model representations in problems of supervised and unsupervised learning. The estimation problem is specified at the primal level as a constrained optimization problem, where the constraints relate to the model expressed in terms of the feature map. From the conditions for optimality one jointly obtains the optimal model representation and the model estimate. At the dual level the model is expressed in terms of a positive definite kernel function, which is characteristic of support vector machine methodology. It is discussed how least squares support vector machines play a central role as core models across problems of regression, classification, principal component analysis, spectral clustering, canonical correlation analysis, dimensionality reduction and data visualization.
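To illustrate the primal/dual idea the abstract refers to: in the standard least squares SVM regression formulation (due to the same authors in earlier work), the Lagrangian conditions for optimality reduce the dual problem to one linear system in the bias term b and the support values alpha, and the resulting dual model is f(x) = sum_i alpha_i K(x, x_i) + b. The following is a minimal NumPy sketch of that scheme, not the paper's own code; the function names, the RBF kernel choice and the hyperparameters gam (regularization) and sigma (kernel width) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Positive definite RBF kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)),
    # an illustrative choice of kernel / implicit feature map
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gam=10.0, sigma=1.0):
    # Dual-level LS-SVM regression: the KKT optimality conditions give
    # the linear system  [[0, 1^T], [1, K + I/gam]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gam
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(Xtr, alpha, b, Xte, sigma=1.0):
    # Dual model representation: f(x) = sum_i alpha_i K(x, x_i) + b
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b
```

Note how the model is never evaluated through an explicit feature map: only kernel evaluations between data points appear, which is what the dual representation buys when the feature space is high- or infinite-dimensional.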