A new EM algorithm for flexibly tied GMMs with large number of components

H Asheri, R Hosseini, BN Araabi - Pattern recognition, 2021 - Elsevier
Abstract
Gaussian mixture models (GMMs) are a family of generative models used extensively in many machine learning applications. The modeling power of a GMM is directly linked to its number of components, but memory, computational load, and the lack of sufficient data hinder the use of GMMs with a large number of components. To tackle this problem, GMMs with a tying scheme that we call flexibly tied GMMs were proposed in the speech recognition literature, where a coordinate-descent EM algorithm was also proposed for estimating their parameters.
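The abstract does not spell out the tying scheme itself. One plausible reading, modeled on the semi-tied covariance scheme of Gales from the speech recognition literature, constrains the component precision matrices to share a single full matrix and keep only a diagonal term per component:

\[
\Sigma_k^{-1} = U \,\mathrm{diag}(\lambda_k)\, U^{\top}, \qquad U \in \mathbb{R}^{d \times d} \text{ shared across components}, \quad \lambda_k \in \mathbb{R}^{d}_{>0} \text{ per component},
\]

so each of the K components adds only d scalars beyond the shared U, rather than a full d-by-d covariance matrix. The paper's exact parameterization may differ; this form is assumed here purely for illustration.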
In this paper, we aim to reintroduce flexibly tied GMMs to the pattern recognition community. We rigorously investigate various optimization methods and find that none of the out-of-the-box methods can solve the parameter estimation problem, owing to the complexity of the cost function. We therefore develop a fast Newton EM algorithm that, combined with the coordinate-descent EM algorithm, significantly outperforms pure coordinate-descent EM and all other optimization algorithms. Furthermore, we propose a computation factorization technique that increases the speed and decreases the memory requirements of both the Newton and coordinate-descent EM algorithms when the number of components is large. Experimental results on many datasets verify the efficacy of the proposed algorithm. They also show that the flexibly tied GMM outperforms both the basic GMM and other types of tied GMMs in terms of log-likelihood. We also evaluate the flexibly tied GMM on a clustering problem and show that it can outperform the basic GMM and the k-means algorithm.
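To make the EM structure concrete, below is a minimal NumPy sketch under the assumed tying Sigma_k^{-1} = U diag(lam_k) U^T. The E-step and the updates for the weights, means, and scales lam_k have closed forms; the shared matrix U does not, which is exactly where the paper's coordinate-descent and Newton M-steps come in. Here a few plain gradient-ascent steps stand in for them, so this illustrates the problem structure and is not the authors' algorithm; em_flex_tied and all of its parameter names are hypothetical.

import numpy as np

def logsumexp(a, axis):
    m = a.max(axis=axis, keepdims=True)
    return m.squeeze(axis) + np.log(np.exp(a - m).sum(axis=axis))

def em_flex_tied(X, K, n_iter=50, u_steps=5, u_lr=1e-3, seed=0):
    """EM sketch for a GMM with assumed tying Sigma_k^{-1} = U diag(lam_k) U^T."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    pi = np.full(K, 1.0 / K)                  # mixing weights
    mu = X[rng.choice(N, K, replace=False)]   # means initialized at data points
    lam = np.ones((K, d))                     # per-component diagonal scales
    U = np.eye(d)                             # shared tying matrix
    for _ in range(n_iter):
        # E-step: responsibilities; the quadratic form uses projected residuals (x - mu_k)^T U.
        Z = (X[:, None, :] - mu[None, :, :]) @ U                        # (N, K, d)
        logdet = 2.0 * np.linalg.slogdet(U)[1] + np.log(lam).sum(1)     # log det Sigma_k^{-1}
        logp = (np.log(pi) + 0.5 * logdet - 0.5 * d * np.log(2 * np.pi)
                - 0.5 * (Z ** 2 * lam[None]).sum(2))                    # (N, K)
        r = np.exp(logp - logsumexp(logp, axis=1)[:, None])             # responsibilities
        # M-step: closed forms for pi, mu, and lam given U.
        Nk = r.sum(0) + 1e-12
        pi = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        S = np.stack([((r[:, k:k+1] * (X - mu[k])).T @ (X - mu[k])) / Nk[k]
                      for k in range(K)])                               # weighted scatter matrices
        lam = 1.0 / np.maximum(np.einsum('aj,kab,bj->kj', U, S, U), 1e-12)
        # No closed form for the shared U: take a few gradient-ascent steps on the
        # expected complete log-likelihood (the paper uses coordinate-descent/Newton here).
        for _ in range(u_steps):
            M = sum(Nk[k] * S[k] @ U * lam[k][None, :] for k in range(K))
            U = U + u_lr * (N * np.linalg.inv(U).T - M) / N
    return pi, mu, lam, U

On a toy two-cluster dataset this recovers sensible components, e.g.

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (200, 5)), rng.normal(4.0, 1.0, (200, 5))])
pi, mu, lam, U = em_flex_tied(X, K=2)

In this sketch the (N, K, d) array Z dominates memory and computation when K is large, which illustrates why the computation factorization technique mentioned in the abstract matters; how the paper actually factorizes this computation is not reproduced here.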