Sparse Kernel Learning and the Relevance Units Machine

Junbin Gao, Jun Zhang

Research output: Book chapter/Published conference paper › Conference paper

5 Citations (Scopus)
25 Downloads (Pure)

Abstract

The relevance vector machine (RVM) is a state-of-the-art technique for constructing sparse kernel regression models [1-4]. It not only generates a much sparser model but also provides better generalization performance than the standard support vector machine (SVM). In both RVM and SVM, the relevance vectors (RVs) and support vectors (SVs) are selected from the set of input vectors, which may limit model flexibility. In this paper we propose a new sparse kernel model called the Relevance Units Machine (RUM). RUM follows the idea of RVM under the Bayesian framework but relaxes the constraint that RVs must be selected from the input vectors; instead, it treats the relevance units as part of the model's parameters. As a result, RUM retains all the advantages of RVM while offering superior sparsity. The new algorithm is demonstrated to possess considerable computational advantages over well-known state-of-the-art algorithms.
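The sketch below is a minimal illustration (not the authors' implementation) of the structural idea stated in the abstract: in an RVM the kernel centres are a subset of the training inputs, whereas a RUM treats its relevance units as free parameters fitted alongside the weights. The RBF kernel, the function name `fit_rum_sketch`, the width `gamma`, and the plain alternating least-squares/gradient loop are assumptions made for illustration only; the paper's Bayesian treatment is not reproduced here.

```python
# Minimal sketch, assuming an RBF kernel and a simple squared-error objective.
import numpy as np

def rbf_kernel(X, U, gamma=1.0):
    """RBF kernel matrix between data X (n, d) and units U (m, d)."""
    d2 = ((X[:, None, :] - U[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_rum_sketch(X, y, n_units=5, gamma=1.0, lr=1e-2, n_iter=2000, seed=0):
    """Fit weights w and relevance units U by least squares + gradient steps."""
    rng = np.random.default_rng(seed)
    # Unlike an RVM, U is NOT restricted to rows of X: initialise the units
    # randomly in the data range and let the optimiser move them freely.
    U = rng.uniform(X.min(0), X.max(0), size=(n_units, X.shape[1]))
    for _ in range(n_iter):
        K = rbf_kernel(X, U, gamma)                  # (n, m) design matrix
        w = np.linalg.lstsq(K, y, rcond=None)[0]     # weights given current units
        r = K @ w - y                                # residuals
        # Gradient of the squared error with respect to the units U.
        diff = X[:, None, :] - U[None, :, :]         # (n, m, d)
        dK = 2 * gamma * K[:, :, None] * diff        # dK[n, m]/dU[m]
        grad_U = 2 * np.einsum('n,nmd,m->md', r, dK, w)
        U -= lr * grad_U
    return w, U

# Toy usage: 1-D regression represented by a handful of movable units.
X = np.linspace(-3, 3, 80)[:, None]
y = np.sin(X).ravel() + 0.05 * np.random.default_rng(1).standard_normal(80)
w, U = fit_rum_sketch(X, y, n_units=4)
```

Because the units are free parameters rather than a subset of the 80 training inputs, the toy model above can represent the curve with only 4 units, which is the kind of added sparsity the abstract attributes to RUM.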
Original language: English
Title of host publication: PAKDD 2009
Place of publication: Berlin
Publisher: Springer
Pages: 612-619
Number of pages: 8
Volume: 5476/2009
DOIs: 10.1007/978-3-642-01307-2_60
Publication status: Published - 2009
Event: Pacific-Asia Conference on Knowledge Discovery and Data Mining - Bangkok, Thailand
Duration: 27 Apr 2009 - 30 Apr 2009

Conference

Conference: Pacific-Asia Conference on Knowledge Discovery and Data Mining
Country: Thailand
Period: 27/04/09 - 30/04/09


  • Cite this

    Gao, J., & Zhang, J. (2009). Sparse Kernel Learning and the Relevance Units Machine. In PAKDD 2009 (Vol. 5476/2009, pp. 612-619). Springer. https://doi.org/10.1007/978-3-642-01307-2_60