L1 LASSO Modeling and Its Bayesian Inference

Junbin Gao, Michael Antolovich, Paul Kwan

Research output: Book chapter / Published conference paper › Conference paper › peer-review


Abstract

A new iterative procedure for solving regression problems with the so-called LASSO penalty [1] is proposed, using generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious, or sparse, regression models that generalize well on unseen data. The proposed algorithm is quite robust, and there is no need to specify any model hyperparameters. A comparison is given with state-of-the-art methods for constructing sparse regression models, such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS).
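As textbook background for the penalty named in the abstract (this is not the specific algorithm proposed in the paper, and the notation below is generic rather than taken from it): with targets y, design matrix X, coefficients beta, and regularization parameter lambda >= 0, the LASSO estimate and its usual Bayesian reading via a Laplace prior on the coefficients are

\hat{\boldsymbol{\beta}} = \arg\min_{\boldsymbol{\beta}} \; \|\mathbf{y} - \mathbf{X}\boldsymbol{\beta}\|_2^2 + \lambda \|\boldsymbol{\beta}\|_1,
\qquad
p(\boldsymbol{\beta}) \propto \prod_{j} \exp\!\bigl(-\lambda\,|\beta_j|\bigr).

Under a Gaussian likelihood, the MAP estimate with this Laplace prior minimizes the same penalized least-squares objective (with lambda rescaled by the noise variance), which is the standard connection a generative Bayesian treatment of the LASSO builds on.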
Original language: English
Title of host publication: AI 2008 21st conference
Subtitle of host publication: Advances in artificial intelligence
Editors: W. Wobcke, M. Zhang
Place of publication: Netherlands
Publisher: Springer
Pages: 318-324
Number of pages: 7
Volume: 5360/2008
Publication status: Published - 2008
Event: Australian Joint Conference on Artificial Intelligence - Auckland, New Zealand
Duration: 01 Dec 2008 – 05 Dec 2008

Conference

Conference: Australian Joint Conference on Artificial Intelligence
Country/Territory: New Zealand
Period: 01/12/08 – 05/12/08
