Summer School on

Sparsity in Image and Signal Analysis

At Hólar, Iceland, August 16 - 20, 2010 (both days included)

Allan Aasbjerg Nielsen

Sparse linear and kernel minimum noise fraction (MNF) transformation

This contribution describes work in progress on combining the established (linear) minimum noise fraction (MNF) transformation and a newly developed kernel version of MNF with sparsity or elastic net constraints. The linear MNF problem (normally formulated in R-mode, also known as the primal formulation) minimizes a Rayleigh quotient: the ratio of the noise variance, according to some noise model, to the total variance of a linear transformation of the original variables. This is equivalent to maximizing a measure of signal-to-noise ratio (SNR) and corresponds to solving a generalized eigenvalue problem. The kernel version of MNF relies on a Q-mode (also known as dual) formulation of the generalized eigenvalue problem combined with a nonlinear transformation of the originally measured variables into a higher dimensional (even infinite dimensional) feature space. This transformation is expressed implicitly via kernel substitution, also known as the kernel trick. These problems are combined with the (L2 and) L1 norm constraints known from sparse principal component analysis (PCA) and sparse Fisher's discriminant analysis (FDA) to obtain sparse versions of linear and kernel MNF analysis. This type of analysis is expected to reveal which variables (R-mode) or which observations (Q-mode) are important for obtaining maximum SNR in transformations of the original data.
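A minimal numerical sketch of the linear (R-mode) case, under assumptions not fixed by the abstract: with total covariance Sigma and noise covariance Sigma_N (estimated here from differences between neighbouring samples, one simple noise model), minimizing the Rayleigh quotient w'Sigma_N w / w'Sigma w amounts to the generalized eigenvalue problem Sigma_N w = lambda Sigma w, where the smallest eigenvalues correspond to the highest SNR. Function and variable names are illustrative only; the kernel (Q-mode) and sparse variants are not shown.

# Illustrative sketch of linear MNF via a generalized eigenvalue problem.
import numpy as np
from scipy.linalg import eigh

def linear_mnf(X, n_components=None):
    """X: (n_samples, n_bands) data matrix. Returns MNF factors,
    eigenvectors and eigenvalues sorted from highest to lowest SNR."""
    Xc = X - X.mean(axis=0)                  # centre the data
    Sigma = np.cov(Xc, rowvar=False)         # total covariance
    # crude noise estimate: first differences along the sample axis
    D = np.diff(Xc, axis=0)
    Sigma_N = np.cov(D, rowvar=False) / 2.0  # noise covariance
    # generalized eigenproblem Sigma_N w = lambda Sigma w (eigh returns
    # ascending eigenvalues, i.e. smallest noise fraction / highest SNR first)
    eigvals, W = eigh(Sigma_N, Sigma)
    mnf = Xc @ W                             # project onto MNF directions
    if n_components is not None:
        mnf, W, eigvals = mnf[:, :n_components], W[:, :n_components], eigvals[:n_components]
    return mnf, W, eigvals

In this sketch the dual (Q-mode) version would replace the covariance matrices by Gram or kernel matrices of the centred data, and the sparse versions would add L1 (and L2) penalties on the eigenvector coefficients rather than solving the eigenproblem directly.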
