Mixture of the Robust L1 Distributions and Its Applications

Junbin Gao, Yi Da Xu

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Recently, a robust probabilistic L1-PCA model was introduced in [1] by replacing the conventional Gaussian noise model with the Laplacian L1 model. Due to the heavy-tailed nature of the L1 distribution, the model is more robust against data outliers. In this paper, we generalize L1-PCA to a mixture of L1 distributions so that the model can be applied to data with multiple clusters. For model learning, we exploit the property that the L1 density can be expanded as a superposition of an infinite number of Gaussian densities, which yields tractable Bayesian learning and inference based on a variational EM-type algorithm.
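The expansion mentioned in the abstract is, in our reading, the standard Gaussian scale-mixture representation of the Laplace density, in which the mixing distribution over a latent variance is exponential. A minimal statement of the identity (our notation, not taken from the paper):

\[
\frac{\lambda}{2}\, e^{-\lambda |x|}
= \int_0^\infty \mathcal{N}(x \mid 0, v)\,
\mathrm{Exp}\!\left(v \,\middle|\, \tfrac{\lambda^2}{2}\right) dv,
\qquad
\mathrm{Exp}(v \mid \beta) = \beta\, e^{-\beta v}.
\]

Conditioned on the latent variance v, the likelihood is Gaussian, which is what makes variational EM tractable: the E-step needs only the posterior over v (plus the cluster indicators in the mixture case), and the M-step reduces to Gaussian-style updates. A quick Monte Carlo sanity check of the identity (a sketch under the notation above, not code from the paper; variable names are ours):

import numpy as np

rng = np.random.default_rng(0)
lam = 1.5  # Laplace rate; the exponential mixing rate is lam**2 / 2

# Draw a latent variance v ~ Exp(rate = lam^2 / 2), then x | v ~ N(0, v);
# marginally, x should follow a Laplace distribution with rate lam.
v = rng.exponential(scale=2.0 / lam**2, size=1_000_000)
x = rng.normal(0.0, np.sqrt(v))

# Compare with direct Laplace samples (scale b = 1 / lam).
y = rng.laplace(0.0, 1.0 / lam, size=1_000_000)
print(np.mean(np.abs(x)), np.mean(np.abs(y)))  # both approx. 1/lam = 0.667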
Original language: English
Pages (from-to): 26-35
Number of pages: 10
Journal: Lecture Notes in Computer Science
Volume: 4830
Issue number: 2007
DOIs: 10.1007/978-3-540-76928-6_5
Publication status: Published - 2007

Fingerprint

Model
Bayesian Learning
Heavy Tails
Gaussian Noise
Bayesian inference
Outlier
Superposition
Learning

Cite this

Gao, Junbin; Xu, Yi Da. Mixture of the Robust L1 Distributions and Its Applications. In: Lecture Notes in Computer Science. 2007; Vol. 4830, No. 2007, pp. 26-35.
@article{322a42026ad6438d8516ea85cad0a4ff,
title = "Mixture of the Robust L1 Distributions and Its Applications",
abstract = "Recently a robust probabilistic L1-PCA model was introduced in [1] by replacing the conventional Gaussian noise model with the Laplacian L1 model. Due to the heavy tail characteristics of the L1 distribution, the proposed model is more robust against data outliers. In this paper, we generalized the L1-PCA into a mixture of L1-distributions so that the model can be used for possible multiclustering data. For the model learning we use the property that the L1 density can be expanded as a superposition of infinite number of Gaussian densities to include a tractable Bayesian learning and inference based on the variational EM type algorithm.",
keywords = "L1 Model, L1 PCA",
author = "Junbin Gao and Xu, {Yi Da}",
note = "Imported on 12 Apr 2017 - DigiTool details were: Journal title (773t) = Lecture Notes in Computer Science. ISSNs: 0302-9743;",
year = "2007",
doi = "10.1007/978-3-540-76928-6_5",
language = "English",
volume = "4830",
pages = "26--35",
journal = "Lecture Notes in Computer Science",
issn = "0302-9743",
publisher = "Springer-Verlag London Ltd.",
number = "2007",

}
