Algorithms for Sparse Nonnegative TUCKER
Morten Mørup, Lars Kai Hansen, Sidse M. Arnfred
Abstract: The analysis of large-scale data of more than two modalities,
i.e., tensors, has lately become a field of growing attention. To analyze
such data, decomposition techniques are widely used. The two most common
decompositions for tensors are the TUCKER model and the more
restricted PARAFAC model. Both models can be viewed as generalizations
of regular factor analysis to data of more than two modalities.
Nonnegative matrix factorization (NMF) in conjunction with sparse
coding has lately received much attention due to its part-based and
easily interpretable representation. While NMF has been extended to the
PARAFAC model, no such attempt has been made to extend NMF to the
TUCKER model. However, if the tensor data analyzed are nonnegative,
it may well be relevant to consider purely additive, i.e., nonnegative,
TUCKER decompositions. To reduce the ambiguities of this type of decomposition,
we develop updates that can impose sparseness in any combination
of modalities, and hence form algorithms for sparse nonnegative
TUCKER decompositions (SNTUCKER). We demonstrate how the
proposed algorithms are superior to existing algorithms for TUCKER
decompositions when the data and interactions can indeed be considered
nonnegative. We further illustrate how sparse coding can help identify
which model, i.e., PARAFAC or TUCKER, is most appropriate to the
data, as well as select the number of components by turning off excess
components. The algorithms for SNTUCKER are available.

Keywords: Tucker, PARAFAC, Sparse coding, Higher Order Nonnegative Matrix Factorization (HONMF)
Type: Journal paper [with referee]
Journal: Neural Computation
Year: 2008, Month: August, Vol. 20, No. 8, pp. 2112–2131
Electronic version(s): [pdf]
BibTeX data: [bibtex]
IMM Group(s): Intelligent Signal Processing
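The abstract describes purely additive TUCKER decompositions fitted with multiplicative updates that can impose sparseness on chosen modalities. The following is a minimal NumPy sketch of that idea for a 3-way tensor, not the authors' released SNTUCKER/HONMF code: it fits X ≈ G ×1 A ×2 B ×3 C under a least-squares cost with standard multiplicative updates, and the `sparsity` parameter (an assumed L1 penalty added to the denominators) is illustrative of how sparseness can be imposed on the factor matrices.

```python
import numpy as np

def sparse_nonneg_tucker(X, ranks, n_iter=200, sparsity=0.0, eps=1e-9, seed=0):
    """Sketch of a sparse nonnegative Tucker decomposition of a 3-way
    tensor X (shape I x J x K) into a nonnegative core G (R x S x T)
    and nonnegative factor matrices A (I x R), B (J x S), C (K x T),
    using multiplicative least-squares updates with an optional L1
    penalty (`sparsity`) on the factor matrices."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    R, S, T = ranks
    A = rng.random((I, R))
    B = rng.random((J, S))
    C = rng.random((K, T))
    G = rng.random((R, S, T))

    def recon():
        # Tucker reconstruction: G multiplied by A, B, C along each mode.
        return np.einsum('rst,ir,js,kt->ijk', G, A, B, C)

    for _ in range(n_iter):
        # Each factor update is (data-side term) / (model-side term);
        # the L1 penalty enters the denominator, shrinking small entries.
        Xh = recon()
        A *= np.einsum('ijk,js,kt,rst->ir', X, B, C, G) / (
             np.einsum('ijk,js,kt,rst->ir', Xh, B, C, G) + sparsity + eps)
        Xh = recon()
        B *= np.einsum('ijk,ir,kt,rst->js', X, A, C, G) / (
             np.einsum('ijk,ir,kt,rst->js', Xh, A, C, G) + sparsity + eps)
        Xh = recon()
        C *= np.einsum('ijk,ir,js,rst->kt', X, A, B, G) / (
             np.einsum('ijk,ir,js,rst->kt', Xh, A, B, G) + sparsity + eps)
        Xh = recon()
        G *= np.einsum('ijk,ir,js,kt->rst', X, A, B, C) / (
             np.einsum('ijk,ir,js,kt->rst', Xh, A, B, C) + eps)
    return G, A, B, C
```

Because the updates are multiplicative, nonnegativity of all factors is preserved automatically. Setting `sparsity > 0` drives weakly supported entries of the factor matrices toward zero, which illustrates how sparseness can "turn off" excess components when selecting the model order.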
