TY - JOUR
T1 - Supervised Latent Linear Gaussian Process Latent Variable Model for Dimensionality Reduction
AU - Jiang, Xinwei
AU - Gao, Junbin
AU - Wang, Tianjiang
AU - Zheng, Lihong
PY - 2012/12
Y1 - 2012/12
N2 - The Gaussian process (GP) latent variable model (GPLVM) has the capability of learning low-dimensional manifold from highly nonlinear data of high dimensionality. As an unsupervised dimensionality reduction (DR) algorithm, the GPLVM has been successfully applied in many areas. However, in its current setting, GPLVM is unable to use label information, which is available for many tasks; therefore, researchers proposed many kinds of extensions to the GPLVM in order to utilize extra information, among which the supervised GPLVM (SGPLVM) has shown better performance compared with other SGPLVM extensions. However, the SGPLVM suffers in its high computational complexity. Bearing in mind the issues of the complexity and the need of incorporating additionally available information, in this paper, we propose a novel SGPLVM, called supervised latent linear GPLVM (SLLGPLVM). Our approach is motivated by both SGPLVM and supervised probabilistic principal component analysis (SPPCA). The proposed SLLGPLVM can be viewed as an appropriate compromise between the SGPLVM and the SPPCA. Furthermore, it is also appropriate to interpret the SLLGPLVM as a semiparametric regression model for supervised DR by making use of the GP to model the unknown smooth link function. Complexity analysis and experiments show that the developed SLLGPLVM outperforms the SGPLVM not only in the computational complexity but also in its accuracy. We also compared the SLLGPLVM with two classical supervised classifiers, i.e., a GP classifier and a support vector machine, to illustrate the advantages of the proposed model.
KW - Classification
KW - Dimensionality reduction (DR)
KW - Latent variable models (LVMs)
KW - Supervised learning
U2 - 10.1109/TSMCB.2012.2196995
DO - 10.1109/TSMCB.2012.2196995
M3 - Article
SN - 1083-4419
VL - 42
SP - 1620
EP - 1632
JO - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
JF - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
IS - 6
ER -