Authors
Penghang Yin, Minh Pham, Adam Oberman, Stanley Osher
Publication date
2018/11
Journal
Journal of Scientific Computing
Volume
77
Pages
1133-1146
Publisher
Springer US
Description
In this paper, we propose an implicit gradient descent algorithm for the classic k-means problem. The implicit gradient step, or backward Euler, is solved via stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. It is the average of the fixed-point trajectory that is carried over to the next gradient step. We draw connections between the proposed stochastic backward Euler and the recent entropy stochastic gradient descent for improving the training of deep neural networks. Numerical experiments on various synthetic and real datasets show that the proposed algorithm provides better clustering results than standard k-means algorithms, in the sense that it decreases the objective function (the cluster energy) and is much more robust to initialization.
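The abstract's scheme can be sketched in a few lines: the implicit (backward Euler) step solves C = C_prev − γ∇f(C) by fixed-point iteration with mini-batch gradients, and the average of the fixed-point trajectory becomes the next iterate. The sketch below is a minimal illustration under assumed choices (objective normalized by the number of points; all step sizes, batch sizes, and iteration counts are hypothetical), not the authors' reference implementation.

```python
import numpy as np

def kmeans_grad(X, C):
    """Gradient of the k-means objective f(C) = (1/2N) * sum_i min_j ||x_i - c_j||^2
    on the (mini-)batch X, with respect to the centroid matrix C."""
    d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    assign = d.argmin(1)                      # nearest-centroid assignment
    G = np.zeros_like(C)
    for j in range(C.shape[0]):
        pts = X[assign == j]
        if len(pts):
            G[j] = (C[j] - pts.mean(0)) * len(pts) / len(X)
    return G

def stochastic_backward_euler(X, C0, step=1.0, outer=50, inner=20, batch=64, seed=None):
    """Implicit gradient descent for k-means: each outer step solves the
    backward Euler equation C = C_prev - step * grad(C) by a stochastic
    fixed-point iteration, and carries the trajectory average forward."""
    rng = np.random.default_rng(seed)
    C = C0.copy()
    for _ in range(outer):
        Y, Ysum = C.copy(), np.zeros_like(C)
        for _ in range(inner):
            mb = X[rng.choice(len(X), batch, replace=False)]   # mini-batch sample
            Y = C - step * kmeans_grad(mb, Y)                  # fixed-point update
            Ysum += Y
        C = Ysum / inner                                       # trajectory average
    return C
```

Compared with plain mini-batch k-means, the inner fixed-point loop repeatedly re-solves the implicit equation at a fixed anchor C, and the averaging damps the mini-batch noise, which is what the abstract credits for the robustness to initialization.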
Total citations
(per-year citation histogram for 2019–2023; chart data not recoverable)