Academic Journals Database
Disseminating quality controlled scientific knowledge

ℓ1 Major Component Detection and Analysis (ℓ1 MCDA): Foundations in Two Dimensions

Author(s): Ye Tian | Qingwei Jin | John E. Lavery | Shu-Cherng Fang

Journal: Algorithms
ISSN 1999-4893

Volume: 6; Issue: 1; Start page: 12; Date: 2013

Keywords: heavy-tailed distribution | ℓ1 | ℓ2 | major component | multivariate statistics | outliers | principal component analysis | 2D

ABSTRACT
Principal Component Analysis (PCA) is widely used for identifying the major components of statistically distributed point clouds. Robust versions of PCA, often based in part on the ℓ1 norm (rather than the ℓ2 norm), are increasingly used, especially for point clouds with many outliers. Neither standard PCA nor robust PCAs can provide, without additional assumptions, reliable information for outlier-rich point clouds and for distributions with several main directions (spokes). We carry out a fundamental and complete reformulation of the PCA approach in a framework based exclusively on the ℓ1 norm and heavy-tailed distributions. The ℓ1 Major Component Detection and Analysis (ℓ1 MCDA) that we propose can determine the main directions and the radial extent of 2D data from single or multiple superimposed Gaussian or heavy-tailed distributions, both with and without patterned artificial outliers (clutter). In nearly all cases in the computational results, 2D ℓ1 MCDA has accuracy superior to that of standard PCA and of two robust PCAs, namely, the projection-pursuit method of Croux and Ruiz-Gazen and the ℓ1 factorization method of Ke and Kanade. (Standard PCA is, of course, superior to ℓ1 MCDA for Gaussian-distributed point clouds.) The computing time of ℓ1 MCDA is competitive with the computing times of the two robust PCAs.
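For context, the standard PCA baseline that the abstract compares against finds the main direction of a 2D point cloud as the leading eigenvector of the sample covariance matrix. The sketch below illustrates only that ℓ2-based baseline on synthetic data (the point cloud, its parameters, and the function name are illustrative assumptions); the ℓ1 MCDA algorithm itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2D point cloud: large spread along the x-axis, small along y,
# so the true main direction is (1, 0) up to sign.
n = 500
points = np.column_stack([rng.normal(0.0, 5.0, n),    # spread along x
                          rng.normal(0.0, 0.5, n)])   # spread along y

def pca_main_direction(pts):
    """Unit eigenvector of the sample covariance with the largest eigenvalue."""
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / (len(pts) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, -1]                    # direction of maximal variance

direction = pca_main_direction(points)
# |direction[0]| is close to 1: PCA recovers the x-axis as the main direction.
```

Because this estimator minimizes squared (ℓ2) residuals, a few large outliers can rotate the recovered direction substantially; that sensitivity is what motivates the ℓ1-based reformulation described in the abstract.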