
Abstract

Background

Conventional Principal Component Analysis (PCA) is a widely used technique for reducing data dimensionality. PCA finds linear combinations of the original features that capture the maximal variance of the data via the Singular Value Decomposition (SVD). However, SVD is sensitive to outliers and often yields dense, high-dimensional components. To address these issues, we propose a new method for estimating the best-fit one-dimensional subspace, called the l1-norm Regularized l1-norm Best-Fit Line.
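
As a point of reference for the SVD-based approach and its sensitivity to outliers, the following is a minimal NumPy illustration; the synthetic data and the outlier placement are our own and are not taken from the poster.

import numpy as np

# First principal component via SVD: the leading right singular vector of the
# centered data matrix gives the direction of maximal variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1)) @ np.array([[1.0, 0.5, 0.0]])   # points near a line in R^3
X += 0.05 * rng.normal(size=X.shape)                           # small noise
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc_clean = Vt[0]

# A single gross outlier can pull the SVD-based component away from the true direction.
X_out = X.copy()
X_out[0] = [0.0, 0.0, 50.0]
Xc_out = X_out - X_out.mean(axis=0)
_, _, Vt_out = np.linalg.svd(Xc_out, full_matrices=False)
pc_corrupt = Vt_out[0]
print(pc_clean, pc_corrupt)   # the corrupted component tilts toward the outlier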

Methods

In this article, we describe a method for fitting a lower-dimensional subspace by approximately solving a non-linear, non-convex, non-smooth optimization problem called the l1-norm regularized l1-norm Best-Fit Line problem, which minimizes a combination of the l1 fitting error and an l1 regularization term. The procedure can be performed simply using ratios and sorting, as sketched below. We also present applications in video surveillance, where our methodology enables background subtraction in the presence of jitter, illumination changes, and clutter.
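
The sketch below is one possible reading of the "ratios and sorting" procedure, under the assumptions that the fitted line passes through the origin, that one coordinate is fixed as an anchor, and that each remaining component of the direction vector reduces to a weighted median of ratios augmented by a zero pseudo-observation whose weight is the regularization parameter. The function names and these modeling choices are ours for illustration; this is not the poster's exact algorithm.

import numpy as np

def weighted_median(values, weights):
    # Return a minimizer of sum_i weights[i] * |values[i] - t| over t,
    # found by sorting and scanning the cumulative weights.
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cum = np.cumsum(weights)
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return values[idx]

def l1_reg_l1_best_fit_line(X, lam):
    # X   : (n, m) array of n observations in R^m (assumed centered, line through origin)
    # lam : weight of the l1 regularization on the direction vector
    n, m = X.shape
    best_v, best_obj = None, np.inf
    for j in range(m):                      # candidate anchor coordinate, v[j] fixed to 1
        xj = X[:, j]
        keep = np.abs(xj) > 1e-12           # points with a nonzero anchor coordinate
        if not np.any(keep):
            continue
        v = np.zeros(m)
        v[j] = 1.0
        for k in range(m):
            if k == j:
                continue
            # Each component solves min_t sum_i |x_ik - t * x_ij| + lam * |t|:
            # a weighted median of the ratios x_ik / x_ij with weights |x_ij|,
            # plus a pseudo-observation at 0 with weight lam (the regularizer).
            ratios = np.append(X[keep, k] / xj[keep], 0.0)
            weights = np.append(np.abs(xj[keep]), lam)
            v[k] = weighted_median(ratios, weights)
        # Total l1 projection error along the anchor coordinate, plus the regularizer.
        obj = np.abs(X - np.outer(xj, v)).sum() + lam * np.abs(v).sum()
        if obj < best_obj:
            best_v, best_obj = v, obj
    return best_v / np.linalg.norm(best_v), best_obj

Because the regularizer enters as a zero-valued pseudo-observation, a large enough lam pulls individual components of the direction vector exactly to zero, which is one way a sparse best-fit line can arise from this kind of construction.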

Results

We compared the performance of our method with SVD on synthetic data. The numerical results showed that our algorithm found a better principal component than SVD from grossly corrupted data, as measured by discordance. Moreover, our algorithm produced a sparser principal component than SVD. We also expect it to be faster in a multi-node environment.
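
For concreteness, one common angle-based way to quantify discordance between an estimated direction and the true one is shown below; this particular definition is our assumption, and the poster may define discordance differently.

import numpy as np

def discordance(v_est, v_true):
    # Assumed measure: 0 when the directions coincide up to sign, 1 when orthogonal.
    v_est = v_est / np.linalg.norm(v_est)
    v_true = v_true / np.linalg.norm(v_true)
    return 1.0 - abs(float(v_est @ v_true))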

Conclusions

This paper proposes a new algorithm that generates a sparse best-fit subspace robust to outliers. On non-contaminated data, the subspaces it finds differ little from those of traditional PCA. On contaminated data, the subspaces it finds attain both markedly smaller discordance and lower dimension than those of traditional PCA.

Publication Date

2020

Disciplines

Applied Mathematics

Is Part Of

VCU Graduate Research Posters

L1-norm Regularized L1-norm Best-fit line problem
