@MISC{IMM2004-03314,
  author    = "A. Meng",
  title     = "An introduction to Variational calculus in Machine Learning",
  year      = "2004",
  month     = "feb",
  keywords  = "Variational methods, Machine Learning, {EM}",
  publisher = "Intelligent Signal Processing",
  note      = "Any comments and suggestions for improvement will be appreciated.",
  url       = "http://www2.compute.dtu.dk/pubdb/pubs/3314-full.html",
  abstract  = "The intention of this note is not to give a full understanding of the calculus of variations, since this area is simply too big; rather, the note is meant as an appetizer. Classical variational methods concern the problem of finding the extremum of an integral depending on an unknown function and its derivatives. Methods such as the finite element method, used widely in many software packages for solving partial differential equations, rely on a variational approach, as does e.g. maximum entropy estimation [6]. Another intuitive example, derived in many textbooks on the calculus of variations: consider a line integral in Euclidean space between two points a and b. Minimizing the line integral (a functional) with respect to the functions describing the path, one finds that a linear function minimizes it. This is no surprise, since we are working in a Euclidean space; however, when the integral is not as easy to interpret, calculus of variations comes in handy in the more general case. This little note will mainly concentrate on a specific example, namely the Variational {EM} algorithm for incomplete data."
}