Variational Bayes Latent Variable Models And Mixture Extensions

Slimane Bazaou

Abstract: This thesis is concerned with applying an approximate Bayesian learning technique known as variational Bayes to different Gaussian latent variable models and their mixture extensions.

I aim to provide a smooth transition between the different models used in this thesis, starting from the single (multivariate) Gaussian model and moving to the more complex linear factor model and its mixture extension, the Mixture of Factor Analyzers, in which the hidden dimensionality, the number of components (in the case of mixtures), or both are unknown.

One of the aims of this thesis is to investigate how the Bayesian framework infers the desired quantities for a given model, e.g. the number of components in a mixture model, and how it succeeds in solving the different problems related to overfitting. I also investigate which of these models performs best for a range of tasks. Throughout the report I discuss the performance of the Bayesian techniques, mainly by comparing them to the standard maximum likelihood approach.

Each of the models discussed in the thesis is applied to one or more of the following problems: density estimation, classification, signal separation, and image compression. Both synthetic and real data are used.
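The component-inference behaviour described above can be illustrated with a minimal sketch (assuming scikit-learn is available; the data, prior value, and threshold below are illustrative choices, not the thesis's actual experiments): a variational Bayes Gaussian mixture is fitted with a deliberately generous number of components, and the variational posterior prunes the surplus ones by driving their mixing weights toward zero, whereas standard maximum likelihood would happily use all of them.

```python
# Sketch: variational Bayes pruning surplus mixture components.
# Assumes scikit-learn; the cluster layout and prior are illustrative.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 2-D Gaussian clusters.
X = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(200, 2))
    for c in ([-3.0, 0.0], [0.0, 3.0], [3.0, 0.0])
])

# Ask for up to 10 components; a small weight-concentration prior
# encourages the variational posterior to switch off unused ones.
vb = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-2,
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that retain non-negligible posterior weight.
effective = int(np.sum(vb.weights_ > 0.01))
print(f"effective components: {effective}")
```

The pruning threshold (0.01) is arbitrary; the point is that the inferred weights of redundant components collapse toward zero, so the effective model order is recovered from the data rather than fixed in advance.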
Keywords: Graphical Models, Linear Latent Variable Models, Mixture Models, Maximum Likelihood, Bayesian Inference, Variational Bayes
Type: Master's thesis [Academic thesis]
Publisher: Informatics and Mathematical Modelling, Technical University of Denmark, DTU
Address: Richard Petersens Plads, Building 321, DK-2800 Kgs. Lyngby
Note: Supervised by Assoc. Prof. Ole Winther and Prof. Lars Kai Hansen
IMM Group(s): Intelligent Signal Processing
