Abstract
Recently, a robust probabilistic L1-PCA model was introduced in [1] by replacing the conventional Gaussian noise model with a Laplacian (L1) model. Owing to the heavy tails of the L1 distribution, the model is more robust to data outliers. In this paper, we generalize L1-PCA to a mixture of L1 distributions so that the model can handle data with multiple clusters. For model learning, we exploit the property that the L1 density can be expanded as a superposition of an infinite number of Gaussian densities, which enables tractable Bayesian learning and inference via a variational EM-type algorithm.
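The key property the abstract relies on is that a Laplace density is a Gaussian scale mixture: drawing a latent variance from an exponential distribution and then sampling a zero-mean Gaussian with that variance yields exactly Laplace-distributed samples. The sketch below (an illustration, not the paper's algorithm; the function name and parameters are our own) demonstrates this numerically with NumPy.

```python
import numpy as np

def sample_laplace_via_gaussian_mixture(b, n, rng):
    """Draw Laplace(0, b) samples via the Gaussian scale-mixture identity:
    if V ~ Exponential(mean = 2*b**2) and X | V ~ N(0, V),
    then marginally X ~ Laplace(0, b)."""
    v = rng.exponential(scale=2.0 * b**2, size=n)  # latent per-sample variances
    return rng.normal(0.0, np.sqrt(v))             # conditionally Gaussian draws

rng = np.random.default_rng(0)
b = 1.0
x = sample_laplace_via_gaussian_mixture(b, 200_000, rng)

# Laplace(0, b) has mean 0 and variance 2*b**2, so the mixture samples
# should reproduce those moments up to Monte Carlo error.
print(x.mean(), x.var())  # mean close to 0, variance close to 2*b**2
```

It is this latent-variance representation that makes the variational EM updates tractable: conditioned on the latent scales, the model reduces to a Gaussian one.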
| Original language | English |
| --- | --- |
| Pages (from-to) | 26-35 |
| Number of pages | 10 |
| Journal | Lecture Notes in Computer Science |
| Volume | 4830 |
| Issue number | 2007 |
| DOIs | |
| Publication status | Published - 2007 |