Authors
John C Duchi, Alekh Agarwal, Mikael Johansson, Michael I Jordan
Publication date
2012
Journal
SIAM Journal on Optimization
Volume
22
Issue
4
Pages
1549-1578
Publisher
Society for Industrial and Applied Mathematics
Description
We generalize stochastic subgradient descent methods to situations in which we do not receive independent samples from the distribution over which we optimize, instead receiving samples coupled over time. We show that as long as the source of randomness is suitably ergodic---it converges quickly enough to a stationary distribution---the method enjoys strong convergence guarantees, both in expectation and with high probability. This result has implications for stochastic optimization in high-dimensional spaces, peer-to-peer distributed optimization schemes, decision problems with dependent data, and stochastic optimization problems over combinatorial spaces.
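The abstract concerns stochastic subgradient methods whose samples arrive from an ergodic process (e.g., a Markov chain) rather than i.i.d. draws. The sketch below is a minimal illustration of that setting, not the paper's algorithm or analysis: the toy objective, transition matrix, projection radius, and step-size schedule are all assumptions made for the example.

```python
import numpy as np

# Minimal sketch (illustrative only): projected stochastic subgradient descent
# where the sample index follows an ergodic Markov chain instead of being
# drawn i.i.d. from the stationary distribution.

rng = np.random.default_rng(0)

# Toy objective: f(x) = E_pi[ |a_s^T x - b_s| ], a piecewise-linear loss
# indexed by the states s of a small Markov chain (all values are synthetic).
A = rng.normal(size=(3, 5))          # one row a_s per state
b = rng.normal(size=3)               # one target b_s per state
P = np.array([[0.8, 0.1, 0.1],       # ergodic transition matrix (assumed)
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

def subgradient(x, s):
    """Subgradient of |a_s^T x - b_s| with respect to x."""
    return np.sign(A[s] @ x - b[s]) * A[s]

def project(x, radius=10.0):
    """Euclidean projection onto a ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

x = np.zeros(5)
x_avg = np.zeros(5)
s = 0                                 # current state of the Markov chain
T = 10_000
for t in range(1, T + 1):
    g = subgradient(x, s)             # subgradient at the current (dependent) sample
    x = project(x - g / np.sqrt(t))   # step size proportional to 1/sqrt(t)
    x_avg += (x - x_avg) / t          # running average of the iterates
    s = rng.choice(3, p=P[s])         # next sample depends on the current state

print("averaged iterate:", x_avg)
```

Because the chain above mixes quickly to its stationary distribution, the averaged iterate behaves much like it would under i.i.d. sampling, which is the intuition the abstract's ergodicity condition captures.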
Total citations
Citations by year, 2012–2024 (histogram)
Scholar articles
JC Duchi, A Agarwal, M Johansson, MI Jordan - SIAM Journal on Optimization, 2012