An introduction to Variational calculus in Machine Learning
Anders Meng

Abstract: The intention of this note is not to give a full account of the calculus of variations, since the area is simply too big; rather, the note is meant as an appetizer. Classical variational methods concern the problem of finding the extremum of an integral that depends on an unknown function and its derivatives. Methods such as the finite element method, widely used in software packages for solving partial differential equations, rely on a variational approach, as does e.g. maximum entropy estimation [6]. Another intuitive example, derived in many textbooks on calculus of variations, is the following: consider a line integral in Euclidean space between two points a and b. Minimizing this line integral (a functional) with respect to the function describing the path, one finds that a linear function is the minimizer. This is no surprise, since we are working in a Euclidean space; however, when the integral is not as easy to interpret, calculus of variations comes in handy in the more general case. This note will mainly concentrate on a specific example, namely the variational EM algorithm for incomplete data.

Keywords: Variational methods, Machine Learning, EM
Type: Misc [Other]
Year: 2004, Month: February
Publisher: Intelligent Signal Processing
Note: Any comments and suggestions for improvement will be appreciated.
Electronic version(s): [pdf]
BibTeX data: [bibtex]
IMM Group(s): Intelligent Signal Processing
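As a quick illustration of the shortest-path example mentioned in the abstract, the following is a minimal sketch (not taken from the note itself) assuming the standard arc-length functional and the Euler-Lagrange equation:

% Sketch: why a straight line minimizes the line integral (arc length)
% between two points in Euclidean space. Standard textbook derivation,
% not reproduced from the note.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The arc length of a path $y(x)$ between the points $(a, y_a)$ and $(b, y_b)$ is
\[
  J[y] = \int_a^b \sqrt{1 + y'(x)^2}\,\mathrm{d}x ,
  \qquad L(y, y', x) = \sqrt{1 + y'^2}.
\]
A stationary path must satisfy the Euler-Lagrange equation
\[
  \frac{\mathrm{d}}{\mathrm{d}x}\,\frac{\partial L}{\partial y'}
  - \frac{\partial L}{\partial y} = 0 .
\]
Since $\partial L / \partial y = 0$, the quantity
$\partial L / \partial y' = y' / \sqrt{1 + y'^2}$ is constant along the path,
so $y'$ is constant and the minimizing path is the straight line
$y(x) = c_1 x + c_2$, with $c_1, c_2$ fixed by the endpoints.
\end{document}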