Sharp analysis of stochastic optimization under global Kurdyka-Lojasiewicz inequality

I Fatkhullin, J Etesami, N He… - Advances in Neural …, 2022 - proceedings.neurips.cc
We study the complexity of finding the global solution to stochastic nonconvex optimization
when the objective function satisfies the global Kurdyka-Łojasiewicz (KL) inequality and the …
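For context, one classical form of the Łojasiewicz gradient inequality underlying the KL condition is shown below; the exact parameterization and exponent range used in the paper may differ:

```latex
% Lojasiewicz gradient inequality (one common global form): for all x,
% with minimum value f^\ast, constant c > 0, and exponent \theta \in [1/2, 1),
\[
  \|\nabla f(x)\| \;\ge\; c\,\bigl(f(x) - f^\ast\bigr)^{\theta}.
\]
% The case \theta = 1/2, squared, gives the Polyak-Lojasiewicz inequality
\[
  \|\nabla f(x)\|^{2} \;\ge\; c^{2}\,\bigl(f(x) - f^\ast\bigr),
\]
% under which gradient methods converge to the global minimum despite nonconvexity.
```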

From gradient flow on population loss to learning with stochastic gradient descent

CM De Sa, S Kale, JD Lee… - Advances in Neural …, 2022 - proceedings.neurips.cc
Stochastic Gradient Descent (SGD) has been the method of choice for learning
large-scale non-convex models. While a general analysis of when SGD works has been …

Nonconvex matrix factorization is geodesically convex: Global landscape analysis for fixed-rank matrix optimization from a Riemannian perspective

Y Luo, NG Trillos - arXiv preprint arXiv:2209.15130, 2022 - arxiv.org
We study a general matrix optimization problem with a fixed-rank positive semidefinite (PSD)
constraint. We perform the Burer-Monteiro factorization and consider a particular …

How over-parameterization slows down gradient descent in matrix sensing: The curses of symmetry and initialization

N Xiong, L Ding, SS Du - arXiv preprint arXiv:2310.01769, 2023 - arxiv.org
This paper rigorously shows how over-parameterization changes the convergence
behaviors of gradient descent (GD) for the matrix sensing problem, where the goal is to …
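As a toy illustration of the setting (not the paper's exact construction), gradient descent on a factored, over-parameterized symmetric matrix sensing loss can be sketched as follows; all problem sizes, the step size, and the initialization scale are arbitrary choices for this sketch:

```python
import numpy as np

# Toy symmetric matrix sensing: recover a rank-r PSD matrix M* = U U^T from
# linear measurements b_i = <A_i, M*> via gradient descent on the factored loss
#   f(X) = (1/(4m)) * sum_i (<A_i, X X^T> - b_i)^2,
# where X has k >= r columns (k > r means the factor is over-parameterized).
rng = np.random.default_rng(0)
n, r, k, m = 8, 2, 4, 200

U = rng.standard_normal((n, r))
M_star = U @ U.T
A = rng.standard_normal((m, n, n))
A = 0.5 * (A + A.transpose(0, 2, 1))           # symmetric sensing matrices
b = np.einsum('mij,ij->m', A, M_star)          # exact measurements of M*

X = 0.01 * rng.standard_normal((n, k))         # small random initialization
lr = 0.002
for _ in range(4000):
    R = np.einsum('mij,ij->m', A, X @ X.T) - b  # per-measurement residuals
    G = np.einsum('m,mij->ij', R, A) @ X / m    # gradient of f at X
    X -= lr * G

err = np.linalg.norm(X @ X.T - M_star) / np.linalg.norm(M_star)
print(err)  # small relative recovery error
```

With exact measurements and a small initialization, the iterates fit M* to low relative error; shrinking the initialization scale or increasing k is the regime where the over-parameterization effects studied in the paper appear.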

Inductive matrix completion: No bad local minima and a fast algorithm

P Zilber, B Nadler - International Conference on Machine …, 2022 - proceedings.mlr.press
The inductive matrix completion (IMC) problem is to recover a low-rank matrix from few
observed entries while incorporating prior knowledge about its row and column subspaces …

Factorization approach for low-complexity matrix completion problems: Exponential number of spurious solutions and failure of gradient methods

B Yalçın, H Zhang, J Lavaei… - … Conference on Artificial …, 2022 - proceedings.mlr.press
The Burer-Monteiro (BM) factorization approach can efficiently solve low-rank matrix optimization
problems under the Restricted Isometry Property (RIP) condition. It is natural to ask whether …

From Optimization to Sampling via Lyapunov Potentials

AY Chen, K Sridharan - arXiv preprint arXiv:2410.02979, 2024 - arxiv.org
We study the problem of sampling from high-dimensional distributions using Langevin
Dynamics, a natural and popular variant of Gradient Descent where at each step …
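The Langevin Dynamics variant alluded to here augments each gradient descent step with injected Gaussian noise; a minimal sketch, with the target density and step size chosen arbitrarily for illustration:

```python
import numpy as np

def langevin_step(x, grad_f, h, rng):
    """One unadjusted Langevin iteration: a gradient descent step on f plus
    sqrt(2h)-scaled Gaussian noise, so for small h the iterates approximately
    sample from the density proportional to exp(-f)."""
    return x - h * grad_f(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)

# Sample from a standard 2-D Gaussian: f(x) = ||x||^2 / 2, so grad f(x) = x.
rng = np.random.default_rng(0)
x = np.zeros(2)
samples = []
for t in range(30000):
    x = langevin_step(x, lambda v: v, 0.05, rng)
    if t >= 5000:                      # discard burn-in
        samples.append(x.copy())
samples = np.array(samples)
print(samples.mean(axis=0), samples.var(axis=0))  # mean near 0, variance near 1
```

Dropping the noise term recovers plain gradient descent, which is the optimization-to-sampling connection the paper develops.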

Noisy low-rank matrix optimization: Geometry of local minima and convergence rate

Z Ma, S Sojoudi - International Conference on Artificial …, 2023 - proceedings.mlr.press
This paper is concerned with low-rank matrix optimization, which has found a wide range of
applications in machine learning. This problem in the special case of matrix sensing has …

Absence of spurious solutions far from ground truth: A low-rank analysis with high-order losses

Z Ma, Y Chen, J Lavaei… - … Conference on Artificial …, 2024 - proceedings.mlr.press
Matrix sensing problems exhibit pervasive non-convexity, plaguing optimization with a
proliferation of suboptimal spurious solutions. Avoiding convergence to these critical points …

A new complexity metric for nonconvex rank-one generalized matrix completion

H Zhang, B Yalcin, J Lavaei, S Sojoudi - Mathematical Programming, 2024 - Springer
In this work, we develop a new complexity metric for an important class of low-rank matrix
optimization problems in both symmetric and asymmetric cases, where the metric aims to …