Supervised Latent Linear Gaussian Process Latent Variable Model for Dimensionality Reduction

Xinwei Jiang, Junbin Gao, Tianjiang Wang, Lihong Zheng

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)


The Gaussian process (GP) latent variable model (GPLVM) can learn a low-dimensional manifold from highly nonlinear, high-dimensional data. As an unsupervised dimensionality reduction (DR) algorithm, the GPLVM has been successfully applied in many areas. However, in its current setting the GPLVM cannot use label information, which is available for many tasks; researchers have therefore proposed many extensions of the GPLVM that exploit this extra information, among which the supervised GPLVM (SGPLVM) has shown better performance than the other supervised extensions. The SGPLVM, however, suffers from high computational complexity. Bearing in mind both the complexity issue and the need to incorporate additionally available information, in this paper we propose a novel SGPLVM, called the supervised latent linear GPLVM (SLLGPLVM). Our approach is motivated by both the SGPLVM and supervised probabilistic principal component analysis (SPPCA). The proposed SLLGPLVM can be viewed as an appropriate compromise between the SGPLVM and the SPPCA. Furthermore, the SLLGPLVM can also be interpreted as a semiparametric regression model for supervised DR that uses a GP to model the unknown smooth link function. Complexity analysis and experiments show that the developed SLLGPLVM outperforms the SGPLVM not only in computational complexity but also in accuracy. We also compare the SLLGPLVM with two classical supervised classifiers, a GP classifier and a support vector machine, to illustrate the advantages of the proposed model.
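The core construction described in the abstract — latent coordinates that are a linear map of the inputs, with a GP playing the role of the unknown smooth link function — can be illustrated with a minimal NumPy sketch. This is an illustrative reconstruction of the general idea, not the paper's actual algorithm: the kernel choice (RBF), the function and parameter names, and the fixed noise level are all assumptions for exposition.

```python
import numpy as np

def rbf_kernel(Z, gamma=1.0):
    # Squared-exponential kernel evaluated on latent coordinates Z (N x q).
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-gamma * d2)

def gp_neg_log_marginal(W, X, Y, noise=1e-2):
    """Negative log marginal likelihood of outputs Y under a GP whose
    inputs are the *linear* latent projection Z = X @ W (the "latent
    linear" part); the GP models the unknown smooth link function.
    Optimizing this objective over W would yield the supervised
    projection -- hypothetical sketch, not the paper's exact objective."""
    Z = X @ W
    N, C = Y.shape
    K = rbf_kernel(Z) + noise * np.eye(N)
    L = np.linalg.cholesky(K)                    # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))  # K^{-1} Y
    # 0.5 * tr(Y^T K^{-1} Y) + (C/2) * log|K|, dropping constants;
    # log|K| = 2 * sum(log(diag(L))).
    return 0.5 * np.sum(Y * alpha) + C * np.sum(np.log(np.diag(L)))

# Tiny usage example with random data.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))   # 20 points in 5 dimensions
W = rng.normal(size=(5, 2))    # project to a 2-D latent space
Y = rng.normal(size=(20, 1))   # one output column (e.g. a label encoding)
nll = gp_neg_log_marginal(W, X, Y)
```

Because the latent space is a linear function of `X`, the number of free parameters is fixed by the projection matrix `W` rather than growing with the number of data points, which is consistent with the complexity advantage over the SGPLVM claimed in the abstract.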
Original language: English
Pages (from-to): 1620-1632
Number of pages: 13
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Issue number: 6
Publication status: Published - Dec 2012


