Incremental Gaussian Processes 
Joaquín Quiñonero-Candela, Ole Winther

Abstract  In this paper, we consider Tipping's relevance vector machine (RVM) and formalize an incremental training strategy as a variant of the expectation-maximization (EM) algorithm that we call subspace EM. Working with a subset of active basis functions, the sparsity of the RVM solution ensures that the number of basis functions, and thereby the computational complexity, is kept low. We also introduce a mean field approach to the intractable classification model that is expected to give a very good approximation to exact Bayesian inference and contains the Laplace approximation as a special case. We test the algorithms on two large data sets with 10^3–10^4 examples. The results indicate that Bayesian learning of large data sets, e.g. the MNIST database, is realistic. 
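The active-subset idea described in the abstract can be illustrated, for the regression case, by the standard RVM hyperparameter re-estimation with pruning of basis functions whose weight precisions diverge. This is a minimal sketch of that general mechanism, not the paper's subspace EM algorithm itself; the function name, defaults, and pruning threshold are illustrative assumptions.

```python
import numpy as np

def rvm_prune_sketch(Phi, t, n_iter=50, alpha_init=1.0, beta=100.0,
                     prune_thresh=1e6):
    """Sparse Bayesian (RVM-style) regression on an active basis subset.

    Each weight w_i has a Gaussian prior with precision alpha_i, re-estimated
    as alpha_i = gamma_i / mu_i^2 (gamma_i = 1 - alpha_i * Sigma_ii).  Basis
    functions whose alpha exceeds prune_thresh are dropped from the active
    set, so the per-iteration cost scales with the surviving basis count.
    All defaults here are illustrative, not taken from the paper.
    """
    N, M = Phi.shape
    active = np.arange(M)            # indices of currently active basis functions
    alpha = np.full(M, alpha_init)   # prior precision of each weight
    for _ in range(n_iter):
        Phi_a = Phi[:, active]
        # Posterior over active weights: Sigma = (A + beta Phi^T Phi)^{-1}
        A = np.diag(alpha[active])
        Sigma = np.linalg.inv(A + beta * Phi_a.T @ Phi_a)
        mu = beta * Sigma @ Phi_a.T @ t
        # Re-estimate alpha; gamma_i measures how "well determined" w_i is
        gamma = 1.0 - alpha[active] * np.diag(Sigma)
        alpha[active] = gamma / np.maximum(mu ** 2, 1e-12)
        # Prune basis functions whose weights are driven to zero
        active = active[alpha[active] < prune_thresh]
    # Final posterior mean over the surviving basis functions
    Phi_a = Phi[:, active]
    Sigma = np.linalg.inv(np.diag(alpha[active]) + beta * Phi_a.T @ Phi_a)
    mu = beta * Sigma @ Phi_a.T @ t
    return active, mu
```

On data generated from a few basis functions, the irrelevant columns are typically pruned within the first few iterations, which is what keeps the effective model size, and hence the cost of the matrix operations, small.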
Keywords  Gaussian Processes, Incremental Methods, Bayesian Kernel Methods, Mean Field Classification, Computational Complexity 
Type  Conference paper [With referee] 
Conference  Advances in Neural Information Processing Systems 
Year  2002 
Electronic version(s)  [ps] 
BibTeX data  [bibtex] 
IMM Group(s)  Intelligent Signal Processing 