Sparse model construction using coordinate descent optimization

Xia Hong, Yi Guo, Sheng Chen, Junbin Gao

Research output: Conference paper (peer-reviewed)

1 Citation (Scopus)


We propose a new sparse model construction method aimed at maximizing a model's generalisation capability for a large class of linear-in-the-parameters models. The coordinate descent optimization algorithm is employed with a modified l1-penalized least squares cost function in order to estimate a single parameter and its regularization parameter simultaneously, based on the leave-one-out mean square error (LOOMSE). Our original contribution is to derive a closed form of the optimal LOOMSE regularization parameter for a single-term model, for which we show that the LOOMSE can be computed analytically without actually splitting the data set, leading to a very simple parameter estimation method. We then integrate the new results within the coordinate descent optimization algorithm to update model parameters one at a time for linear-in-the-parameters models. Consequently, a fully automated procedure is achieved without resorting to any other validation data set for iterative model evaluation. Illustrative examples are included to demonstrate the effectiveness of the new approach.
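The abstract rests on two ingredients: coordinate descent on an l1-penalised least-squares cost (updating one parameter at a time via soft-thresholding), and the classical fact that the leave-one-out MSE of a linear least-squares fit can be computed analytically from the hat matrix, without splitting the data. A minimal illustrative sketch of both follows; it is not the paper's LOOMSE-tuned algorithm (which selects the regularization parameter per term in closed form), and the function names, the fixed penalty `lam`, and the data are our own assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the closed-form minimiser of
    0.5*(z - w)^2 + t*|w|."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def cd_lasso(X, y, lam, n_iter=100):
    """Generic coordinate descent for the l1-penalised least-squares cost
        min_w 0.5*||y - X w||^2 + lam*||w||_1,
    updating one parameter at a time (lam fixed here; the paper instead
    tunes the regularization per term via its closed-form LOOMSE result)."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)           # precomputed ||x_j||^2
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with term j excluded from the current fit
            r = y - X @ w + X[:, j] * w[j]
            z = X[:, j] @ r                 # correlation of column j with residual
            w[j] = soft_threshold(z, lam) / col_sq[j]
    return w

def loo_mse(X, y):
    """Leave-one-out MSE of the least-squares fit, computed analytically via
    the hat matrix H = X (X^T X)^{-1} X^T: the LOO residual for sample i is
    e_i / (1 - H_ii), so no actual data splitting is required."""
    H = X @ np.linalg.solve(X.T @ X, X.T)
    e = y - H @ y
    return np.mean((e / (1.0 - np.diag(H))) ** 2)
```

For an orthonormal design the coordinate updates decouple, so `cd_lasso(np.eye(4), y, lam)` simply soft-thresholds each entry of `y` by `lam`, which makes the sparsifying effect of the l1 penalty easy to see.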
Original language: English
Title of host publication: DSP 2013
Subtitle of host publication: 18th Proceedings
Place of Publication: United States
Publisher: Institute of Electrical and Electronics Engineers
Number of pages: 6
ISBN (Electronic): 9781467358057
Publication status: Published - 2013
Event: International Conference on Digital Signal Processing (DSP) - Santorini, Greece
Duration: 01 Jul 2013 - 03 Jul 2013


Grant Number

  • DP130100364

