TY - GEN
T1 - L1 LASSO modeling and its Bayesian inference
AU - Gao, Junbin
AU - Antolovich, Michael
AU - Kwan, Paul W.
PY - 2008/12/1
Y1 - 2008/12/1
N2 - A new iterative procedure for solving regression problems with the so-called LASSO penalty [1] is proposed by using generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious or sparse regression models that generalize well on unseen data. The proposed algorithm is quite robust and there is no need to specify any model hyperparameters. A comparison with state-of-the-art methods for constructing sparse regression models such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS) is given.
UR - http://www.scopus.com/inward/record.url?scp=58349096559&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=58349096559&partnerID=8YFLogxK
U2 - 10.1007/978-3-540-89378-3_31
DO - 10.1007/978-3-540-89378-3_31
M3 - Conference paper
AN - SCOPUS:58349096559
SN - 3540893776
SN - 9783540893776
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 318
EP - 324
BT - AI 2008
T2 - 21st Australasian Joint Conference on Artificial Intelligence, AI 2008
Y2 - 1 December 2008 through 5 December 2008
ER -