Abstract
The relevance vector machine (RVM) is a state-of-the-art technique for constructing sparse kernel regression models [1-4]. It not only generates a much sparser model but also provides better generalization performance than the standard support vector machine (SVM). In both RVM and SVM, the relevance vectors (RVs) and support vectors (SVs) are selected from the set of input vectors, which may limit model flexibility. In this paper we propose a new sparse kernel model called the Relevance Units Machine (RUM). RUM follows the idea of RVM under the Bayesian framework but relaxes the constraint that RVs must be selected from the input vectors; instead, RUM treats the relevance units as part of the parameters of the model. As a result, RUM retains all the advantages of RVM while offering superior sparsity. The new algorithm is demonstrated to possess considerable computational advantages over well-known state-of-the-art algorithms.
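To illustrate the core idea described above (a sparse kernel model whose basis centres, the "relevance units", are free parameters rather than a subset of the training inputs), here is a minimal sketch in Python. It is not the authors' algorithm: it replaces the Bayesian treatment of the paper with a simple regularised least-squares point estimate, and all names (`rbf_kernel`, `fit_rum_like`, `gamma`, `lam`, `M`) are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): a kernel regression model
# y(x) = sum_m w_m * k(x, u_m), where the M "relevance units" u_m are free
# parameters fitted jointly with the weights w, rather than being selected
# from the training inputs as in RVM/SVM.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, U, gamma=1.0):
    """Gaussian kernel matrix between inputs X (N x d) and units U (M x d)."""
    d2 = ((X[:, None, :] - U[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_rum_like(X, y, M=5, gamma=1.0, lam=1e-3, seed=0):
    """Fit unit locations and weights by regularised least squares.
    This is a simplified point estimate, not the Bayesian inference of the paper."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    u0 = X[rng.choice(N, M, replace=False)].ravel()  # initialise units at random inputs

    def objective(u_flat):
        U = u_flat.reshape(M, d)
        K = rbf_kernel(X, U, gamma)                   # N x M design matrix
        # given the units, the weights have a closed form (ridge regression)
        w = np.linalg.solve(K.T @ K + lam * np.eye(M), K.T @ y)
        r = y - K @ w
        return r @ r + lam * (w @ w)

    res = minimize(objective, u0, method="L-BFGS-B")
    U = res.x.reshape(M, d)
    K = rbf_kernel(X, U, gamma)
    w = np.linalg.solve(K.T @ K + lam * np.eye(M), K.T @ y)
    return U, w

# Toy usage: a handful of free units approximate a sine curve from 100 samples.
X = np.linspace(0, 2 * np.pi, 100)[:, None]
y = np.sin(X).ravel() + 0.05 * np.random.default_rng(1).standard_normal(100)
U, w = fit_rum_like(X, y, M=5)
y_hat = rbf_kernel(X, U) @ w
```

The point of the sketch is only the contrast with RVM: because the units `U` are optimised continuously, a given level of accuracy can typically be reached with fewer basis functions than when centres must coincide with training inputs.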
Original language | English |
---|---|
Title of host publication | PAKDD 2009 |
Place of Publication | Berlin |
Publisher | Springer |
Pages | 612-619 |
Number of pages | 8 |
Volume | 5476/2009 |
DOIs | |
Publication status | Published - 2009 |
Event | Pacific-Asia Conference on Knowledge Discovery and Data Mining, Bangkok, Thailand, 27 Apr 2009 → 30 Apr 2009 |
Conference
Conference | Pacific-Asia Conference on Knowledge Discovery and Data Mining |
---|---|
Country/Territory | Thailand |
Period | 27/04/09 → 30/04/09 |