Vector-Quantization using Information Theoretic Concepts



Abstract: The process of representing a large data set with a smaller number of vectors in the best possible way, also known as vector quantization, has been studied intensively in recent years. Highly efficient algorithms such as the Kohonen Self-Organizing Map (SOM) and the Linde-Buzo-Gray (LBG) algorithm have been devised. In this paper a physical approach to the problem is taken, and it is shown that, by considering the processing elements as points moving in a potential field, an algorithm as efficient as those mentioned above can be derived. Unlike SOM and LBG, this algorithm has a clear physical interpretation and relies on the minimization of a well-defined cost function.
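The potential-field idea can be illustrated with a short sketch: each processing element is attracted by the data samples and repelled by the other processing elements, with both forces derived from Gaussian kernels. The Python code below is a minimal illustration under assumed details (kernel width, learning rate, initialization), not the paper's reference implementation.

    # A minimal sketch (assumed details, not the paper's reference implementation):
    # processing elements move in a potential field induced by the data, with
    # Gaussian-kernel attraction toward the samples and repulsion between elements.
    import numpy as np

    def gaussian_forces(w, x, sigma):
        """Net Gaussian-kernel force exerted on each row of w by the points in x."""
        diff = x[None, :, :] - w[:, None, :]          # (M, N, d) displacement vectors
        sq = np.sum(diff ** 2, axis=-1)               # squared distances
        k = np.exp(-sq / (2.0 * sigma ** 2))          # Gaussian kernel values
        return np.sum(k[:, :, None] * diff, axis=1) / (sigma ** 2 * x.shape[0])

    def potential_field_vq(x, n_codes=16, sigma=0.2, lr=0.5, n_iter=500, seed=0):
        rng = np.random.default_rng(seed)
        w = x[rng.choice(len(x), n_codes, replace=False)].copy()  # init on data points
        for _ in range(n_iter):
            attract = gaussian_forces(w, x, sigma)    # pull toward the data
            repel = gaussian_forces(w, w, sigma)      # push processing elements apart
            w += lr * (attract - repel)               # gradient-descent style update
        return w

    # Example usage on synthetic 2-D data:
    rng = np.random.default_rng(1)
    data = rng.normal(size=(1000, 2)) * np.array([1.0, 0.3])
    codes = potential_field_vq(data)                  # 16 representative vectors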
It is also shown how the potential-field approach can be linked to information theory through the Parzen density estimator. In the light of information theory it becomes clear that minimizing the free energy of the system is in fact equivalent to minimizing a divergence measure between the distribution of the data and the distribution of the processing elements; hence, the algorithm can be seen as a density-matching method.
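The density-matching view can be sketched as follows: with Gaussian Parzen estimates of the data density and of the processing-element density, a divergence between the two densities can be written entirely in terms of pairwise Gaussian kernels ("information potentials"). The Cauchy-Schwarz form used below is an illustrative choice and not necessarily the exact divergence of the paper; the kernel width sigma is likewise an assumption.

    # A sketch of the density-matching view (illustrative assumptions: Gaussian
    # Parzen kernels of width sigma, and the Cauchy-Schwarz divergence as the
    # divergence measure; the paper's exact choice may differ).
    import numpy as np

    def information_potential(a, b, sigma):
        """Estimate of the integral of the product of the Parzen densities of a
        and b: the mean pairwise Gaussian kernel (width sqrt(2)*sigma, from the
        convolution of two Parzen kernels) between the rows of a and b."""
        sq = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.mean(np.exp(-sq / (4.0 * sigma ** 2)))

    def cs_divergence(x, w, sigma=0.2):
        """Cauchy-Schwarz divergence between the Parzen densities of the data x
        and the processing elements w (kernel normalization cancels in the ratio)."""
        v_xw = information_potential(x, w, sigma)   # cross information potential
        v_xx = information_potential(x, x, sigma)   # data self-potential (constant in w)
        v_ww = information_potential(w, w, sigma)   # processing-element self-potential
        return -np.log(v_xw ** 2 / (v_xx * v_ww))

Minimizing this quantity with respect to the processing elements increases the cross potential (attraction toward the data) and decreases their self-potential (mutual repulsion), mirroring the force balance in the potential-field sketch above.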
Keywords: Information particles, Information theoretic learning, Self-organizing map, Vector-Quantization
Type: Journal paper [With referee]
Journal: Natural Computing
Year: 2005    Month: January    Vol. 4    pp. 39--51
Publisher: Kluwer Academic Publishers
ISBN / ISSN: 1567-7818
Electronic version(s): [pdf]
Publication link: http://dx.doi.org/10.1007/s11047-004-9619-8
BibTeX data: [bibtex]
IMM Group(s): Intelligent Signal Processing