We study the scaling limits of stochastic gradient descent (SGD) with constant step-size in the high-dimensional regime. We prove limit theorems for the trajectories of summary …
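As a rough illustration of the setting described in this entry (not the paper's actual construction), the sketch below runs online SGD with a constant step size on a toy high-dimensional teacher-student regression problem and records a one-dimensional summary statistic, the overlap with the planted direction. The model, dimensions, and step-size scaling are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 2000        # ambient dimension
steps = 4000    # online SGD steps, one fresh sample per step
delta = 0.5     # constant step size, used below as delta / d

w_star = np.ones(d) / np.sqrt(d)        # planted direction (unit norm)
w = rng.normal(size=d) / np.sqrt(d)     # random init: O(1/sqrt(d)) overlap

overlap = []    # summary statistic m_t = <w_t, w_star>
for t in range(steps):
    x = rng.normal(size=d)                   # fresh Gaussian sample
    y = x @ w_star + 0.1 * rng.normal()      # noisy linear teacher
    grad = (x @ w - y) * x                   # per-sample squared-loss gradient
    w = w - (delta / d) * grad               # constant step size, 1/d scaling
    overlap.append(w @ w_star)

print("initial overlap %.3f -> final overlap %.3f" % (overlap[0], overlap[-1]))
```

In this kind of scaling, the recorded overlap trajectory concentrates around a deterministic curve as the dimension grows, which is the type of limit object such results describe.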
These notes survey and explore an emerging method, which we call the low-degree method, for understanding statistical-versus-computational tradeoffs in high-dimensional …
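For concreteness, here is a minimal numerical sketch of the central quantity in this method, the norm of the low-degree likelihood ratio. For a Gaussian additive model Y = lam*X + Z (Z standard Gaussian, X drawn from a prior), this norm can be written as a sum over degrees of lam^(2d) * E[<X, X'>^d] / d! with X, X' independent draws from the prior; the snippet estimates that sum by Monte Carlo for a sparse Rademacher prior. The prior, dimension, and parameter values are illustrative assumptions, and the heuristic reads a bounded value of this norm as evidence that degree-D tests fail to distinguish the planted and null distributions.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)

def ld_norm_sq(n, lam, rho, D, trials=10000):
    """Monte Carlo estimate of the squared low-degree likelihood-ratio norm
    for Y = lam*X + Z, with X sparse Rademacher: each coordinate is
    +-1/sqrt(rho*n) w.p. rho and 0 otherwise (so E||X||^2 = 1)."""
    total = 0.0
    for _ in range(trials):
        x1 = (rng.random(n) < rho) * rng.choice([-1.0, 1.0], size=n) / np.sqrt(rho * n)
        x2 = (rng.random(n) < rho) * rng.choice([-1.0, 1.0], size=n) / np.sqrt(rho * n)
        s = x1 @ x2   # overlap <X, X'> of two independent draws from the prior
        total += sum((lam ** (2 * d)) * s ** d / factorial(d) for d in range(D + 1))
    return total / trials

for lam in [0.5, 1.0, 2.0]:
    print(lam, ld_norm_sq(n=200, lam=lam, rho=0.1, D=8))
```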
Many high-dimensional statistical inference problems are believed to possess inherent computational hardness. Various frameworks have been proposed to give rigorous …
M Brennan, G Bresler - Conference on Learning Theory, 2020 - proceedings.mlr.press
Inference problems with conjectured statistical-computational gaps are ubiquitous throughout modern statistics, computer science, statistical physics and discrete probability …
Despite the non-convex optimization landscape, over-parametrized shallow networks are able to achieve global convergence under gradient descent. The picture can be radically …
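A minimal numpy sketch of the phenomenon this entry refers to (not the paper's analysis): a wide two-layer ReLU network trained by full-batch gradient descent on a small random regression task drives the training loss toward zero despite the non-convex landscape. Width, learning rate, data, and the choice to train only the hidden layer (a lazy-training-style simplification) are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, width = 20, 5, 1000                       # few samples, heavily over-parametrized
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)  # arbitrary noisy target

# Two-layer ReLU network: f(x) = a^T relu(W x) / sqrt(width)
W = rng.normal(size=(width, d)) / np.sqrt(d)
a = rng.choice([-1.0, 1.0], size=width)         # outer layer fixed, only W is trained

lr = 0.5
for step in range(3000):
    pre = X @ W.T                       # (n, width) pre-activations
    act = np.maximum(pre, 0.0)          # ReLU
    pred = act @ a / np.sqrt(width)
    resid = pred - y
    loss = 0.5 * np.mean(resid ** 2)
    # gradient of the mean squared loss w.r.t. the hidden weights W
    dpred = resid[:, None] * (pre > 0) * a[None, :] / np.sqrt(width)   # (n, width)
    gradW = dpred.T @ X / n             # (width, d)
    W -= lr * gradW
    if step % 500 == 0:
        print(step, loss)
print("final loss", loss)
```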
T Liang, P Sur - The Annals of Statistics, 2022 - projecteuclid.org
A precise high-dimensional asymptotic theory for boosting and minimum-l1-norm interpolated classifiers. Vol. 50, No. 3, pp. 1669–1695.
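As a concrete reference point for the object studied in this entry, the sketch below computes a minimum-l1-norm interpolating classifier on separable synthetic data by solving min ||w||_1 subject to y_i <x_i, w> >= 1 as a linear program, via the standard split w = u - v with u, v >= 0. The data model and dimensions are assumptions for illustration; scipy's linprog does the optimization.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, p = 40, 200                        # more features than samples: interpolation regime
w_true = np.zeros(p); w_true[:5] = 1.0
X = rng.normal(size=(n, p))
y = np.sign(X @ w_true + 0.5 * rng.normal(size=n))

# min ||w||_1  s.t.  y_i <x_i, w> >= 1, with w = u - v and u, v >= 0
A = -(y[:, None] * X)                 # constraint coefficients for w
A_ub = np.hstack([A, -A])             # columns for u, then for v
b_ub = -np.ones(n)
c = np.ones(2 * p)                    # objective: sum(u) + sum(v) = ||w||_1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * p), method="highs")
w_hat = res.x[:p] - res.x[p:]

margins = y * (X @ w_hat)
print("LP status:", res.message)
print("min margin: %.3f (should be >= 1)" % margins.min())
print("||w||_1 = %.3f, nonzeros = %d" % (np.abs(w_hat).sum(), (np.abs(w_hat) > 1e-8).sum()))
```

With p much larger than n the synthetic data is linearly separable with high probability, so the program is feasible and the solution interpolates the labels with margin at least one.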
Researchers currently use a number of approaches to predict and substantiate information- computation gaps in high-dimensional statistical estimation problems. A prominent …