Sparse kernel learning with LASSO and Bayesian inference algorithm

Junbin Gao, Paul W. Kwan, Daming Shi

    Research output: Contribution to journal › Article › peer-review

    114 Citations (Scopus)
    362 Downloads (Pure)

    Abstract

    Kernelized LASSO (Least Absolute Shrinkage and Selection Operator) has been investigated in two separate recent papers [Gao, J., Antolovich, M., & Kwan, P. H. (2008). L1 LASSO and its Bayesian inference. In W. Wobcke, & M. Zhang (Eds.), Lecture notes in computer science: Vol. 5360 (pp. 318-324); Wang, G., Yeung, D. Y., & Lochovsky, F. (2007). The kernel path in kernelized LASSO. In International conference on artificial intelligence and statistics (pp. 580-587). San Juan, Puerto Rico: MIT Press]. This paper is concerned with learning kernels under the LASSO formulation by adopting a generative Bayesian learning and inference approach. A new robust learning algorithm is proposed which produces a sparse kernel model with the capability of learning regularization parameters and kernel hyperparameters. A comparison with state-of-the-art methods for constructing sparse regression models, such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS), is given. The new algorithm is also demonstrated to possess considerable computational advantages.
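
    As a rough illustration of the general setup described in the abstract (not the Bayesian inference algorithm the paper proposes), kernelized LASSO typically places an L1 penalty on the coefficients of a kernel expansion, which drives most coefficients to zero and yields a sparse kernel model. The sketch below is a minimal example of this generic formulation, assuming a Gaussian (RBF) kernel and scikit-learn's Lasso solver; the kernel width and regularization strength are arbitrary illustrative choices.

    ```python
    # Minimal sketch of a generic kernelized LASSO setup (illustrative only;
    # not the Bayesian learning algorithm proposed in the paper).
    import numpy as np
    from sklearn.linear_model import Lasso

    def rbf_kernel(X1, X2, gamma=1.0):
        # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    # Toy 1-D regression data
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(100)

    # Kernel expansion f(x) = sum_i c_i k(x, x_i); the L1 penalty on c enforces sparsity
    K = rbf_kernel(X, X, gamma=0.5)
    model = Lasso(alpha=0.01, fit_intercept=True, max_iter=10000)
    model.fit(K, y)

    # Only a few training points retain nonzero coefficients -> sparse kernel model
    support = np.flatnonzero(model.coef_)
    print(f"{len(support)} of {len(X)} kernel centres retained")
    ```

    In this generic setting the kernel hyperparameter (gamma) and the regularization parameter (alpha) are fixed by hand or by cross-validation; the paper's contribution is a generative Bayesian approach that learns both from the data.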
    Original language: English
    Pages (from-to): 257-264
    Number of pages: 8
    Journal: Neural Networks
    Volume: 23
    Issue number: 2
    DOIs
    Publication status: Published - Mar 2010
