Approximate L0 constrained Non-negative Matrix and Tensor Factorization



Abstract: Non-negative matrix factorization (NMF), i.e. V = WH where V, W, and H are all non-negative, has become a widely used blind source separation technique due to its parts-based representation. The NMF decomposition is in general not unique, and a parts-based representation is not guaranteed. However, imposing sparseness both improves the uniqueness of the decomposition and favors a parts-based representation. Sparseness in the form of attaining as many zero elements in the solution as possible is appealing from a conceptual point of view and corresponds to minimizing the reconstruction error under an L0-norm constraint. Solving for a given L0 norm is in general an NP-hard problem, so convex relaxation to regularization by the L1 norm is often considered, i.e., minimizing 0.5||V - WH||_F^2 + lambda*||H||_1. An open problem is how to control the degree of sparsity imposed by the regularization strength lambda. We demonstrate here that the full regularization path of the L1-norm regularized least squares NMF problem for fixed W can be calculated at the cost of an ordinary least squares solution, based on a modification of the Least Angle Regression and Selection (LARS) algorithm that forms a non-negativity constrained LARS (NLARS). With the full regularization path available, the L1 regularization strength lambda that best approximates a given L0 norm can be directly accessed and in effect used to control the sparsity of H. The MATLAB code for the NLARS algorithm is available for download.
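The idea of choosing lambda to hit a target L0 can be illustrated with a minimal sketch. This is not the paper's NLARS algorithm (which computes the entire path at the cost of one least squares solve); it is a plain lambda sweep using non-negative coordinate descent on toy data, with all matrices and the candidate-lambda grid invented for illustration:

```python
# Illustrative sketch only: pick the L1 strength lambda whose non-negative
# lasso solution h best approximates a desired number of nonzeros (L0 target).
# Toy W, v, and the lambda grid are assumptions, not from the paper.

def nn_lasso_cd(W, v, lam, iters=500):
    """Minimize 0.5*||v - W h||^2 + lam*sum(h) subject to h >= 0,
    by cyclic coordinate descent."""
    m, n = len(W), len(W[0])
    h = [0.0] * n
    col_sq = [sum(W[i][j] ** 2 for i in range(m)) for j in range(n)]
    for _ in range(iters):
        for j in range(n):
            # residual with column j's contribution removed
            r = [v[i] - sum(W[i][k] * h[k] for k in range(n) if k != j)
                 for i in range(m)]
            g = sum(W[i][j] * r[i] for i in range(m))
            # soft-threshold and project onto the non-negative orthant
            h[j] = max(0.0, (g - lam) / col_sq[j])
    return h

def lambda_for_l0(W, v, target_nonzeros, lams):
    """Return the lambda (and solution) whose nonzero count is closest
    to the L0 target."""
    best = None
    for lam in lams:
        h = nn_lasso_cd(W, v, lam)
        nnz = sum(1 for x in h if x > 1e-8)
        score = abs(nnz - target_nonzeros)
        if best is None or score < best[0]:
            best = (score, lam, h)
    return best[1], best[2]

W = [[1.0, 0.2, 0.0],
     [0.0, 1.0, 0.1],
     [0.3, 0.0, 1.0]]
v = [1.0, 0.5, 0.0]
lam, h = lambda_for_l0(W, v, target_nonzeros=2,
                       lams=[0.01, 0.05, 0.1, 0.2, 0.5])
```

NLARS replaces the sweep: it traces the solution as a piecewise-linear function of lambda, so the breakpoint matching the L0 target is read off directly instead of re-solving for each candidate lambda.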
Keywords: NMF, sparse coding, L0-norm, L1 regularization, non-negative LARS, BSS
Type: Conference paper [with referee]
Conference: IEEE International Symposium on Circuits and Systems, ISCAS 2008
Year: 2008, pp. 1328-1331
IMM Group(s): Intelligent Signal Processing