Approximate L0 constrained Nonnegative Matrix and Tensor Factorization

Morten Mørup, Kristoffer Hougaard Madsen, Lars Kai Hansen
Abstract

Nonnegative matrix factorization (NMF), i.e., V = WH where V, W, and H are all nonnegative, has become a widely used blind source separation technique due to its part-based representation. The NMF decomposition is in general not unique, and a part-based representation is not guaranteed. However, imposing sparseness both improves the uniqueness of the decomposition and favors a part-based representation. Sparseness in the form of attaining as many zero elements in the solution as possible is appealing from a conceptual point of view and corresponds to minimizing the reconstruction error under an L0 norm constraint. Solving for a given L0 norm is in general an NP-hard problem, so regularization by the convex relaxation to the L1 norm is often considered instead, i.e., minimizing 0.5*||V - WH||_F^2 + lambda*||H||_1. An open problem is how to control the degree of sparsity imposed by the regularization strength lambda. We here demonstrate that the full regularization path of the L1-regularized least squares NMF for fixed W can be calculated at the cost of an ordinary least squares solution, based on a modification of the Least Angle Regression and Selection (LARS) algorithm that forms a nonnegativity constrained LARS (NLARS). With the full regularization path, the L1 regularization strength lambda that best approximates a given L0 norm can be directly accessed and in effect used to control
the sparsity of H. The MATLAB code for the NLARS algorithm is available for download.

Keywords: NMF, sparse coding, L0 norm, L1 regularization, nonnegative LARS, BSS
Type: Conference paper [with referee]
Conference: IEEE International Symposium on Circuits and Systems, ISCAS 2008
Year: 2008, pp. 1328-1331
Electronic version(s): [pdf]
BibTeX data: [bibtex]
IMM Group(s): Intelligent Signal Processing
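The core idea of the abstract, solving the L1-regularized nonnegative least squares problem over a range of lambda values and picking the lambda whose solution best matches a target L0 (number of nonzeros), can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' NLARS algorithm: it solves each nonnegative lasso problem by cyclic coordinate descent on a fixed lambda grid rather than tracing the exact piecewise-linear LARS path, and the function names (`nn_lasso`, `lambda_for_target_l0`) are hypothetical.

```python
import numpy as np

def nn_lasso(X, v, lam, n_iter=500):
    """Solve min_h 0.5*||v - X h||^2 + lam*sum(h), subject to h >= 0,
    by cyclic coordinate descent with nonnegative soft-thresholding.
    (A stand-in for one point on the NLARS regularization path.)"""
    n, p = X.shape
    h = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)   # per-column squared norms
    r = v - X @ h                    # current residual
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            # Partial correlation of column j with the residual,
            # adding back column j's own contribution.
            rho = X[:, j] @ r + col_sq[j] * h[j]
            h_new = max(0.0, (rho - lam) / col_sq[j])  # soft-threshold, clipped at 0
            r += X[:, j] * (h[j] - h_new)              # update residual incrementally
            h[j] = h_new
    return h

def lambda_for_target_l0(X, v, k, lambdas):
    """Scan a grid of lambdas (largest first) and return the solution
    whose support size best approximates the target L0 norm k."""
    best = None
    for lam in sorted(lambdas, reverse=True):
        h = nn_lasso(X, v, lam)
        nnz = int((h > 1e-10).sum())
        if best is None or abs(nnz - k) < abs(best[1] - k):
            best = (h, nnz, lam)
    return best
```

In the paper's setting this corresponds to solving for one column of H with W fixed; the NLARS algorithm obtains the entire path at roughly the cost of a single ordinary least squares solve, whereas the grid scan above pays that cost per lambda.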
