Multiplicative updates for the LASSO



Abstract
Multiplicative updates have proven useful for non-negativity-constrained optimization. Here, we demonstrate how
multiplicative updates can also be used for unconstrained
optimization. This is, for instance, useful when estimating the
least absolute shrinkage and selection operator (LASSO), i.e.
least squares minimization with $L_1$-norm regularization, since
the multiplicative updates (MU) can efficiently exploit the
structure of the problem traditionally solved using quadratic
programming (QP). We derive an algorithm based on MU for the LASSO
and compare its performance to Matlab's standard QP solver as well
as the basis pursuit denoising (BP) algorithm, which can be
obtained from
www.sparselab.stanford.edu. The algorithms were tested on three
benchmark bioinformatic data sets: a small scale data set where
the number of observations is larger than the number of variables
estimated ($M < J$) and two large scale microarray data sets ($M \gg
J$). For the small scale data, MU, QP and BP give identical results
with run times of roughly the same order. However,
for the large scale problems QP is unstable and slow. MU, on the other
hand, is stable and faster, but not as efficient as the BP algorithm,
and converges slowly for small regularizations. The benefit of the
present MU algorithm is that it is easy to implement, it bridges
multiplicative updates to unconstrained optimization, and the
derived updates monotonically decrease the cost function, so no
objective function evaluations are needed. Finally, the MU are
potentially useful for a wide range of other models such as the
elastic net or the fused LASSO. A Matlab implementation of the
LASSO based on MU is available.
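The idea behind such an algorithm can be illustrated with a minimal sketch. This is not the authors' released Matlab implementation; it assumes the standard split $w = u - v$ with $u, v \ge 0$ (which makes the $L_1$ penalty linear) and applies the multiplicative update for non-negative quadratic programming of Sha, Lin, Saul and Lee to the resulting problem:

```python
import numpy as np

def lasso_mu(X, y, lam, n_iter=5000, eps=1e-12):
    """Sketch of the LASSO via multiplicative updates (MU).

    Minimizes 0.5*||y - X w||^2 + lam*||w||_1 by splitting
    w = u - v with u, v >= 0, so the penalty becomes lam*sum(u + v),
    and applying the non-negative-QP multiplicative update
    (Sha, Lin, Saul & Lee) to the stacked variable z = [u; v].
    """
    G = X.T @ X
    Gp = np.maximum(G, 0.0)   # positive part of the Gram matrix
    Gn = Gp - G               # negative part, so G = Gp - Gn
    c = X.T @ y
    J = X.shape[1]
    u = np.full(J, 0.1)
    v = np.full(J, 0.1)
    for _ in range(n_iter):
        # Quadratic-term splits for z = [u; v] with
        # A = [[G, -G], [-G, G]]:  (A+ z)_u = Gp u + Gn v, etc.
        ap_u = Gp @ u + Gn @ v
        an_u = Gn @ u + Gp @ v
        ap_v = Gp @ v + Gn @ u
        an_v = Gn @ v + Gp @ u
        b_u = lam - c         # linear term acting on u
        b_v = lam + c         # linear term acting on v
        # Multiplicative update: monotonically non-increasing cost,
        # and non-negativity is preserved automatically.
        u *= (-b_u + np.sqrt(b_u**2 + 4.0 * ap_u * an_u)) / (2.0 * ap_u + eps)
        v *= (-b_v + np.sqrt(b_v**2 + 4.0 * ap_v * an_v)) / (2.0 * ap_v + eps)
    return u - v
```

For an orthonormal design (e.g. `X = np.eye(J)`) the iterates approach the familiar soft-thresholding solution $w_i = \operatorname{sign}(y_i)\max(|y_i| - \lambda, 0)$, which gives a quick sanity check of the update.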
Type: Conference paper [With referee]
Conference: Machine Learning for Signal Processing (MLSP), IEEE Workshop on
Year: 2007
Electronic version(s): [pdf]
BibTeX data: [bibtex]
IMM Group(s): Intelligent Signal Processing