Multiplicative updates for the LASSO
Morten Mørup, Line H. Clemmensen
 Abstract  Multiplicative updates have proven useful for nonnegativity-constrained
optimization. Here, we demonstrate how
multiplicative updates can also be used for unconstrained
optimization. This is, for instance, useful when estimating the
least absolute shrinkage and selection operator (LASSO), i.e.
least squares minimization with $L_1$-norm regularization, since
the multiplicative updates (MU) can efficiently exploit the
structure of the problem traditionally solved using quadratic
programming (QP). We derive an algorithm based on MU for the LASSO
and compare its performance to Matlab's standard QP solver as well
as the basis pursuit denoising algorithm (BP) which can be
obtained from
www.sparselab.stanford.edu. The algorithms were tested on three
benchmark bioinformatics data sets: a small-scale data set where
the number of observations exceeds the number of variables
estimated ($M<J$) and two large-scale microarray data sets ($M \gg
J$). For the small-scale data, MU, QP and BP give identical results
in comparable computation time. However,
for large-scale problems QP is unstable and slow. MU, on the other
hand, is stable and faster, though not as efficient as the BP algorithm,
and it converges slowly for small regularizations. The benefits of the
present MU algorithm are that it is easy to implement, it bridges
multiplicative updates to unconstrained optimization, and the
derived updates monotonically decrease the cost function, so no
objective function evaluations are needed. Finally, the MU are
potentially useful for a wide range of other models such as the
elastic net or the fused LASSO. A Matlab implementation of the
LASSO based on MU is available.

Type: Conference paper [with referee]
Conference: Machine Learning for Signal Processing (MLSP), IEEE Workshop on
Year: 2007
IMM Group(s): Intelligent Signal Processing
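The core idea the abstract describes — handling an unconstrained problem with multiplicative updates by splitting the weights into nonnegative parts — can be sketched as follows. This is an illustrative reconstruction in Python (the paper's own implementation is in Matlab), not necessarily the paper's exact update rule: the split w = w⁺ − w⁻ and the gradient-splitting ratio update below are assumptions in the spirit of NMF-style updates.

```python
import numpy as np

def lasso_mu(X, y, lam, n_iter=5000, eps=1e-12):
    """Sketch of a multiplicative-update LASSO solver.

    Minimizes 0.5*||y - X w||^2 + lam*||w||_1 by writing
    w = wp - wn with wp, wn >= 0 and multiplying each factor by the
    ratio of the negative to the positive part of its gradient,
    so nonnegativity is preserved automatically.
    """
    A = X.T @ X                      # quadratic term
    b = X.T @ y                      # linear term
    Ap, An = np.maximum(A, 0.0), np.maximum(-A, 0.0)   # A = Ap - An
    bp, bn = np.maximum(b, 0.0), np.maximum(-b, 0.0)   # b = bp - bn
    wp = np.full(X.shape[1], 0.1)
    wn = np.full(X.shape[1], 0.1)
    for _ in range(n_iter):
        # grad wrt wp: A(wp - wn) - b + lam, split into +/- parts
        wp *= (An @ wp + Ap @ wn + bp) / (Ap @ wp + An @ wn + bn + lam + eps)
        # grad wrt wn: -A(wp - wn) + b + lam, split likewise
        wn *= (Ap @ wp + An @ wn + bn) / (An @ wp + Ap @ wn + bp + lam + eps)
    return wp - wn
```

At a fixed point each factor has either a zero gradient component or has itself shrunk to zero, which is how the $L_1$ penalty (appearing only in the denominators) drives coefficients toward sparsity without any explicit constraint handling.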
