Authors
Sungjin Ahn, Anoop Korattikara, Nathan Liu, Suju Rajan, Max Welling
Publication date
2015/8/10
Book
Proceedings of the 21th ACM SIGKDD international conference on knowledge discovery and data mining
Pages
9-18
Description
Despite having various attractive qualities such as high prediction accuracy and the ability to quantify uncertainty and avoid over-fitting, Bayesian Matrix Factorization has not been widely adopted because of the prohibitive cost of inference. In this paper, we propose a scalable distributed Bayesian matrix factorization algorithm using stochastic gradient MCMC. Our algorithm, based on Distributed Stochastic Gradient Langevin Dynamics, can not only match the prediction accuracy of standard MCMC methods like Gibbs sampling, but at the same time is as fast and simple as stochastic gradient descent. In our experiments, we show that our algorithm can achieve the same level of prediction accuracy as Gibbs sampling an order of magnitude faster. We also show that our method reduces the prediction error as fast as distributed stochastic gradient descent, achieving a 4.1% improvement in RMSE for the Netflix dataset …
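The abstract above describes sampling the posterior of a matrix factorization model with stochastic gradient Langevin dynamics (SGLD): each update takes a minibatch gradient of the log-posterior plus Gaussian noise scaled to the step size. A minimal single-machine sketch of that idea is below; the paper's actual algorithm (DSGLD) additionally distributes blocks of the rating matrix across workers, and all names, hyperparameters, and the toy data here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed, not from the paper): low-rank ratings r_ij ~ u_i . v_j.
n_users, n_items, rank = 8, 10, 3
U = 0.1 * rng.standard_normal((n_users, rank))   # user factors
V = 0.1 * rng.standard_normal((n_items, rank))   # item factors

# Observed (user, item, rating) triples; exactly rank-3, so fittable.
obs = [(i, j, float(i % 3 + j % 2 + 1))
       for i in range(n_users) for j in range(n_items)]
N = len(obs)

tau = 4.0     # likelihood precision (assumed known here)
lam = 0.1     # Gaussian prior precision on the factors
eps = 1e-3    # SGLD step size
batch = 16

def sgld_step(U, V):
    """One SGLD update on a random minibatch of ratings."""
    idx = rng.choice(N, size=batch, replace=False)
    gU, gV = np.zeros_like(U), np.zeros_like(V)
    for k in idx:
        i, j, r = obs[k]
        err = r - U[i] @ V[j]
        # Minibatch log-likelihood gradient, rescaled by N / batch.
        gU[i] += (N / batch) * tau * err * V[j]
        gV[j] += (N / batch) * tau * err * U[i]
    # Add the log-prior gradient.
    gU -= lam * U
    gV -= lam * V
    # Langevin step: half-step gradient drift plus N(0, eps) injected noise.
    U = U + 0.5 * eps * gU + np.sqrt(eps) * rng.standard_normal(U.shape)
    V = V + 0.5 * eps * gV + np.sqrt(eps) * rng.standard_normal(V.shape)
    return U, V

def rmse(U, V):
    return np.sqrt(np.mean([(r - U[i] @ V[j]) ** 2 for i, j, r in obs]))

before = rmse(U, V)
for _ in range(2000):
    U, V = sgld_step(U, V)
after = rmse(U, V)
print(f"training RMSE before / after: {before:.2f} / {after:.2f}")
```

Because the injected noise matches the step size, the iterates wander around the posterior rather than collapsing to a point estimate, which is what lets the method quantify uncertainty while remaining as cheap per update as SGD.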
Total citations
(per-year cited-by chart for 2015–2024; counts not recoverable from the extraction)
Scholar articles
S Ahn, A Korattikara, N Liu, S Rajan, M Welling - Proceedings of the 21th ACM SIGKDD international …, 2015