TR-H-0281 : 2000.1.13 (Internal Use)

Masa-aki SATO

Fast Learning of On-Line EM Algorithm

Abstract: In this article, an on-line EM algorithm is derived for general Exponential Family models with Hidden variables (EFH models). It is proven that the on-line EM algorithm is equivalent to a stochastic gradient method with the inverse of the Fisher information matrix as the coefficient matrix. Consequently, stochastic approximation theory guarantees convergence to a local maximum of the likelihood function. The performance of the on-line EM algorithm is examined using the mixture-of-Gaussians model, which is a special case of the EFH model. The simulation results show that the on-line EM algorithm is much faster than both the batch EM algorithm and the on-line gradient ascent algorithm. The fast learning speed is achieved by a systematic design of the learning rate schedule. Moreover, it is shown that the on-line EM algorithm can escape from a local maximum of the likelihood function in the early training phase, even when the batch EM algorithm is trapped in a local maximum solution.
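To make the idea concrete, the following is a minimal sketch of an on-line EM update for a one-dimensional mixture of Gaussians, the special case mentioned above. Weighted sufficient statistics are updated per sample with a decaying learning rate; the schedule eta_t = 1/(a*t + b), the quantile-based initialization, and the parameter names `a`, `b` are illustrative assumptions, not the report's exact design.

```python
import numpy as np

def online_em_gmm(data, K, a=1.0, b=10.0):
    """On-line EM sketch for a 1-D Gaussian mixture (assumed form).

    Per-sample E-step computes posterior responsibilities; the weighted
    sufficient statistics <1>_k, <x>_k, <x^2>_k are then updated with a
    decaying learning rate eta_t, and parameters are re-estimated from
    the running statistics (the M-step).
    """
    # Deterministic spread-out initialization (an illustrative choice).
    mu = np.quantile(data, (np.arange(K) + 0.5) / K)
    var = np.ones(K)
    w = np.full(K, 1.0 / K)
    # Running weighted sufficient statistics consistent with the init.
    s0 = np.full(K, 1.0 / K)
    s1 = s0 * mu
    s2 = s0 * (var + mu**2)
    for t, x in enumerate(data, start=1):
        # E-step for one sample: posterior responsibilities p(k | x).
        logp = -0.5 * ((x - mu)**2 / var + np.log(2 * np.pi * var)) + np.log(w)
        p = np.exp(logp - logp.max())
        p /= p.sum()
        # Stochastic update of the sufficient statistics.
        eta = 1.0 / (a * t + b)  # illustrative learning rate schedule
        s0 = (1 - eta) * s0 + eta * p
        s1 = (1 - eta) * s1 + eta * p * x
        s2 = (1 - eta) * s2 + eta * p * x**2
        # M-step: parameters from the running statistics.
        w = s0 / s0.sum()
        mu = s1 / s0
        var = np.maximum(s2 / s0 - mu**2, 1e-6)
    return w, mu, var
```

Because each sample triggers one cheap statistics update followed by a closed-form M-step, the method behaves like a stochastic gradient step whose effective step size is set by the learning rate schedule, which is what allows the faster convergence the abstract describes.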

Keywords: On-line algorithm, EM algorithm, Convergence, Mixture models, Exponential family models