Abstract
We introduce a robust probabilistic L1-PCA model in which the conventional Gaussian distribution for the noise in the observed data is replaced by the Laplacian (or L1) distribution. Because of the heavy tails of the L1 distribution, the proposed model is expected to be more robust against data outliers. In this letter, we demonstrate how a variational approximation scheme enables effective inference of key parameters in the probabilistic L1-PCA model. Because the L1 density can be expanded as a superposition of an infinite number of Gaussian densities, we express the L1-PCA model as a model marginalized over these superpositions. In this way, tractable Bayesian inference is achieved through a variational expectation-maximization-type algorithm.
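The expansion of the L1 density referred to above is the classical Gaussian scale-mixture representation of the Laplacian. The following identity is a standard reconstruction of that representation; the notation, including the exponential mixing density over the latent variance $v$, is ours rather than the paper's:

$$
\frac{1}{2b}\exp\!\left(-\frac{|x-\mu|}{b}\right)
= \int_0^{\infty} \mathcal{N}\!\left(x \mid \mu, v\right)\,
\frac{1}{2b^{2}}\exp\!\left(-\frac{v}{2b^{2}}\right) dv .
$$

Conditioned on the latent variance $v$, the noise is Gaussian, so marginalizing over $v$ recovers the Laplacian noise model while keeping each conditional distribution tractable; this conditional Gaussianity is what makes a variational expectation-maximization-type treatment feasible.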
| Original language | English |
|---|---|
| Pages (from-to) | 555-572 |
| Number of pages | 18 |
| Journal | Neural Computation |
| Volume | 20 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2008 |