TR-H-0282 :2000.1.13

Masa-aki SATO

On-Line Model Selection Based on the Variational Bayes

Abstract: The Bayesian framework provides a principled approach to model selection. It estimates a probability distribution over an ensemble of models, and prediction is performed by averaging over this ensemble. Accordingly, the uncertainty of the models is taken into account, and complex models with more degrees of freedom are penalized. However, the integration over model parameters is often intractable, and some approximation scheme is needed. Recently, a powerful approximation scheme called the Variational Bayes (VB) method was proposed by Attias (1999). This approach defines a free energy for a trial probability distribution that approximates the joint posterior distribution over model parameters and hidden variables. Exact maximization of the free energy yields the true posterior distribution. The VB method uses factorized trial distributions, so the integration over model parameters can be done analytically, and an iterative EM-like algorithm with guaranteed convergence is derived. In this paper, we derive an on-line version of the VB algorithm and prove its convergence by showing that it is a stochastic approximation for finding the maximum of the free energy. Combined with the split-and-merge algorithm proposed by Ueda et al. (1999), the on-line VB algorithm provides a fully on-line learning method with a model selection mechanism. In preliminary experiments on synthetic data, the on-line VB method converged faster and performed better than the batch VB method.
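To make the on-line idea concrete, the following is a minimal sketch (not the paper's exact update rule) of an EM-like stochastic approximation for a toy model: a 1-D Gaussian mixture with unit component variances, a Dirichlet prior on the mixing weights, and Gaussian priors on the means. All names, the step-size schedule `1/(t+10)`, and the plug-in point estimates in the E-step are illustrative assumptions; the paper's algorithm works with the full variational posterior and its free energy.

```python
import numpy as np

def online_vb_mixture(data, K=2, alpha0=1.0, beta0=1.0, m0=0.0):
    """Illustrative on-line, EM-like stochastic approximation for a
    K-component 1-D Gaussian mixture with unit component variances.
    Conjugate priors (assumed for this sketch): Dirichlet(alpha0) on the
    mixing weights and N(m0, 1/beta0) on each component mean.
    Expected sufficient statistics are tracked as running averages with a
    decaying (Robbins-Monro) step size, so the objective is maximized
    stochastically, one sample at a time, instead of in a batch M-step."""
    T = len(data)
    s_r = np.full(K, 1.0 / K)          # running average of <r_k>
    s_rx = np.linspace(-1.0, 1.0, K)   # running average of <r_k * x>
    for t, x in enumerate(data, start=1):
        # Point estimates implied by the current posterior hyperparameters.
        alpha = alpha0 + T * s_r               # Dirichlet counts
        beta = beta0 + T * s_r                 # precision of q(mean_k)
        m = (beta0 * m0 + T * s_rx) / beta     # posterior mean of mean_k
        w = alpha / alpha.sum()
        # E-step: responsibilities of each component for the new sample.
        log_r = np.log(w) - 0.5 * (x - m) ** 2
        r = np.exp(log_r - log_r.max())
        r /= r.sum()
        # Stochastic-approximation update of the sufficient statistics.
        eta = 1.0 / (t + 10.0)                 # decaying step size
        s_r = (1.0 - eta) * s_r + eta * r
        s_rx = (1.0 - eta) * s_rx + eta * r * x
    alpha = alpha0 + T * s_r
    beta = beta0 + T * s_r
    m = (beta0 * m0 + T * s_rx) / beta
    return m, alpha / alpha.sum()

# Demo on synthetic data: a single pass over two well-separated clusters.
rng = np.random.default_rng(0)
data = rng.permutation(np.concatenate([rng.normal(-3.0, 1.0, 1000),
                                       rng.normal(3.0, 1.0, 1000)]))
means, weights = online_vb_mixture(data)
```

In this sketch the 1/t-type averaging makes the tracked statistics converge to their empirical expectations, which is the stochastic-approximation argument the abstract alludes to; the paper's convergence proof is for the free-energy maximization itself.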