Towards faster non-asymptotic convergence for diffusion-based generative models

G Li, Y Wei, Y Chen, Y Chi - arXiv preprint arXiv:2306.09251, 2023 - arxiv.org
Diffusion models, which convert noise into new data instances by learning to reverse a
Markov diffusion process, have become a cornerstone in contemporary generative …

The Lasso with general Gaussian designs with applications to hypothesis testing

M Celentano, A Montanari, Y Wei - The Annals of Statistics, 2023 - projecteuclid.org
The Annals of Statistics, 2023, Vol. 51, No. 5, 2194–2220. https://doi.org/10.1214/23-AOS2327

Universality of approximate message passing algorithms and tensor networks

T Wang, X Zhong, Z Fan - The Annals of Applied Probability, 2024 - projecteuclid.org
The supplementary appendix contains additional details about AMP algorithms for
rectangular matrices and the rectangular generalized invariant universality class of …

Approximate message passing from random initialization with applications to Z2 synchronization

G Li, W Fan, Y Wei - … of the National Academy of Sciences, 2023 - National Acad Sciences
This paper is concerned with the problem of reconstructing an unknown rank-one matrix with
prior structural information from noisy observations. While computing the Bayes optimal …

A sharp convergence theory for the probability flow ODEs of diffusion models

G Li, Y Wei, Y Chi, Y Chen - arXiv preprint arXiv:2408.02320, 2024 - arxiv.org
Diffusion models, which convert noise into new data instances by learning to reverse a
diffusion process, have become a cornerstone in contemporary generative modeling. In this …

Local convexity of the TAP free energy and AMP convergence for Z2-synchronization

M Celentano, Z Fan, S Mei - The Annals of Statistics, 2023 - projecteuclid.org
The Annals of Statistics, 2023, Vol. 51, No. 2, 519–546. https://doi.org/10.1214/23-AOS2257

Semidefinite programs simulate approximate message passing robustly

M Ivkov, T Schramm - Proceedings of the 56th Annual ACM Symposium …, 2024 - dl.acm.org
Approximate message passing (AMP) is a family of iterative algorithms that generalize
matrix power iteration. AMP algorithms are known to optimally solve many average-case …

Spectrum-aware adjustment: A new debiasing framework with applications to principal components regression

Y Li, P Sur - arXiv preprint arXiv:2309.07810, 2023 - arxiv.org
We introduce a new debiasing framework for high-dimensional linear regression that
bypasses the restrictions on covariate distributions imposed by modern debiasing …

Mean-field variational inference with the TAP free energy: Geometric and statistical properties in linear models

M Celentano, Z Fan, L Lin, S Mei - arXiv preprint arXiv:2311.08442, 2023 - arxiv.org
We study mean-field variational inference in a Bayesian linear model when the sample size
n is comparable to the dimension p. In high dimensions, the common approach of …

A non-asymptotic analysis of generalized approximate message passing algorithms with right rotationally invariant designs

C Cademartori, C Rush - arXiv preprint arXiv:2302.00088, 2023 - arxiv.org
Approximate Message Passing (AMP) algorithms are a class of iterative procedures for
computationally efficient estimation in high-dimensional inference and estimation tasks. Due …