Authors
O Arandjelovic, R Cipolla
Publication date
2006/5/15
Journal
SME Technical Papers
Issue
TP06PUB22
Publisher
Society of Manufacturing Engineers
Description
In this paper we address the problem of learning Gaussian Mixture Models (GMMs) incrementally. Unlike previous approaches, which universally assume that new data arrives in blocks representable by GMMs that are then merged with the current model estimate, our method works for the case when novel data points arrive one by one, while requiring little additional memory. We keep only two GMMs in memory and no historical data. The current fit is updated under the assumption that the number of components is fixed; this number is increased (or reduced) when enough evidence for a new component is seen. This evidence is deduced from the change relative to the oldest fit of the same complexity, termed the Historical GMM, a concept central to our method. The performance of the proposed method is demonstrated qualitatively and quantitatively on several synthetic data sets and on video sequences of faces acquired in realistic imaging conditions.
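To make the single-point update setting concrete, below is a minimal sketch of an online, fixed-K GMM update in which each new observation adjusts the weights, means, and covariances via an incremental EM-style rule with a small learning rate. The class name `OnlineGMM`, the `lr` step size, and the update rule are illustrative assumptions; the paper's Historical GMM mechanism for adding or removing components is not reproduced here.

```python
import numpy as np


class OnlineGMM:
    """Illustrative fixed-K GMM updated one observation at a time (not the paper's full method)."""

    def __init__(self, means, covs, weights, lr=0.05):
        self.means = np.asarray(means, dtype=float)      # (K, D) component means
        self.covs = np.asarray(covs, dtype=float)        # (K, D, D) component covariances
        self.weights = np.asarray(weights, dtype=float)  # (K,) mixing weights
        self.lr = lr                                     # learning rate for incremental updates

    def _component_pdf(self, x, k):
        # Gaussian density of component k evaluated at x.
        d = self.means.shape[1]
        diff = x - self.means[k]
        inv = np.linalg.inv(self.covs[k])
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(self.covs[k]))
        return float(np.exp(-0.5 * diff @ inv @ diff) / norm)

    def update(self, x):
        """Incorporate a single new data point x of shape (D,)."""
        x = np.asarray(x, dtype=float)
        # E-step for one point: responsibility of each component for x.
        resp = np.array([self.weights[k] * self._component_pdf(x, k)
                         for k in range(len(self.weights))])
        resp /= resp.sum()
        # Incremental M-step: blend old parameters with the new point.
        for k in range(len(self.weights)):
            eta = self.lr * resp[k]
            self.weights[k] = (1 - self.lr) * self.weights[k] + self.lr * resp[k]
            diff = x - self.means[k]
            self.means[k] = self.means[k] + eta * diff
            self.covs[k] = (1 - eta) * self.covs[k] + eta * np.outer(diff, diff)
        self.weights /= self.weights.sum()


# Example usage: stream 2D points into a two-component model.
gmm = OnlineGMM(means=[[0.0, 0.0], [3.0, 3.0]],
                covs=[np.eye(2), np.eye(2)],
                weights=[0.5, 0.5])
for point in np.random.randn(100, 2):
    gmm.update(point)
```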
Total citations
[Per-year citation chart, 2005–2024]