Variational Bayes Latent Variable Models and Mixture Extensions
Slimane Bazaou

Abstract

This thesis addresses the application of an approximate Bayesian learning technique, variational Bayes, to several Gaussian latent variable models and their mixture extensions.
I aim to give a smooth transition between the models used in this thesis, starting from the single (multivariate) Gaussian model and moving to the more complex linear factor model and its mixture extension, the Mixture of Factor Analyzers, in which the hidden dimensionality, the number of mixture components, or both are unknown.
One aim of this thesis is to investigate how, given a model, the Bayesian framework infers the desired quantities, e.g. the number of components in a mixture model, and how it succeeds in solving the different problems related to overfitting. I also investigate which of these models performs best over a range of tasks. Throughout the report I discuss the performance of the Bayesian techniques, mainly by comparing them to the standard maximum likelihood approach.
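The contrast described above can be illustrated with a small, self-contained sketch (not from the thesis): scikit-learn's `BayesianGaussianMixture` performs variational Bayesian inference with a Dirichlet prior over mixture weights, so superfluous components are driven toward zero weight, whereas a maximum likelihood mixture fit by EM uses every component it is given. The data, the choice of 10 candidate components, and the 0.01 weight threshold are all illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture, BayesianGaussianMixture

# Synthetic data: two well-separated Gaussian clusters in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(-4.0, 1.0, size=(200, 2)),
    rng.normal(+4.0, 1.0, size=(200, 2)),
])

# Maximum likelihood (EM): happily uses all 10 components it is given,
# splitting the two true clusters among them (a form of overfitting).
ml = GaussianMixture(n_components=10, random_state=0).fit(X)

# Variational Bayes: a sparse Dirichlet prior on the mixing weights
# lets the weights of unneeded components shrink toward zero.
vb = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-3,  # small value encourages sparsity
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that retain non-negligible weight.
effective = int(np.sum(vb.weights_ > 0.01))
print(effective)  # typically close to the 2 true clusters
```

The effective number of components is thus inferred from the data rather than fixed in advance, which is the behaviour the abstract attributes to the Bayesian framework.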
Each of the models discussed in the thesis is applied to one or more of the following problems: density estimation, classification, signal separation, and image compression. Both synthetic and real data are used.

Keywords
Graphical models, linear latent variable models, mixture models, maximum likelihood, Bayesian inference, variational Bayes

Type
Master's thesis [Academic thesis]

Year
2004

Publisher
Informatics and Mathematical Modelling, Technical University of Denmark, DTU

Address
Richard Petersens Plads, Building 321, DK-2800 Kgs. Lyngby

Series
IMM-Thesis-2004-48

Note
Supervised by Assoc. Prof. Ole Winther and Prof. Lars Kai Hansen

Electronic version(s)
[pdf] [ps]

BibTeX data
[bibtex]

IMM Group(s)
Intelligent Signal Processing
