Randomly pivoted Cholesky: Practical approximation of a kernel matrix with few entry evaluations

Y Chen, EN Epperly, JA Tropp… - … on Pure and Applied …, 2023 - Wiley Online Library
The randomly pivoted Cholesky algorithm (RPCholesky) computes a factorized rank-k
approximation of an N × N positive-semidefinite (psd) matrix. RPCholesky requires only …
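
For concreteness, a minimal sketch of an RPCholesky-style iteration (pivot sampled in proportion to the residual diagonal, one kernel column evaluated per step) might look as follows; the function and parameter names are illustrative, not the authors' reference implementation.

```python
import numpy as np

def rpcholesky(entry, n, k, rng=None):
    """Sketch of randomly pivoted Cholesky: entry(i, j) returns A[i, j]
    for an n x n psd matrix A, and only O(k * n) entries are evaluated.
    Returns F with A approximately equal to F @ F.T (rank <= k).
    Hypothetical helper names, not the paper's reference code."""
    rng = np.random.default_rng(rng)
    F = np.zeros((n, k))
    # Residual diagonal, used as the pivot-sampling distribution.
    d = np.array([entry(i, i) for i in range(n)], dtype=float)
    for t in range(k):
        p = np.clip(d, 0.0, None)
        if p.sum() <= 0:
            return F[:, :t]
        # Sample a pivot proportionally to the residual diagonal.
        s = rng.choice(n, p=p / p.sum())
        # Evaluate one column of A and subtract the current approximation.
        col = np.array([entry(i, s) for i in range(n)], dtype=float)
        g = col - F[:, :t] @ F[s, :t]
        F[:, t] = g / np.sqrt(g[s])
        d -= F[:, t] ** 2
    return F
```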

Fast and stable randomized low-rank matrix approximation

Y Nakatsukasa - arXiv preprint arXiv:2009.11392, 2020 - arxiv.org
Randomized SVD has become an extremely successful approach for efficiently computing a
low-rank approximation of matrices. In particular, the paper by Halko, Martinsson, and Tropp …
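
As context for the snippet, a bare-bones version of the Halko-Martinsson-Tropp randomized SVD it refers to could be sketched as below; the oversampling and power-iteration counts are illustrative defaults, and this is not the stabilized method of the cited preprint.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, power_iters=1, rng=None):
    """Minimal sketch of randomized SVD: returns U, s, Vt with
    A approximately U @ np.diag(s) @ Vt of rank k."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # Range finder: sample the column space with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k + oversample))
    Y = A @ Omega
    # Optional power iterations sharpen the subspace when the spectrum decays slowly.
    for _ in range(power_iters):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)
    # Project onto the small subspace and take an exact SVD there.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :k], s[:k], Vt[:k]
```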

Optimal sketching for kronecker product regression and low rank approximation

H Diao, R Jayaram, Z Song, W Sun… - Advances in neural …, 2019 - proceedings.neurips.cc
We study the Kronecker product regression problem, in which the design matrix is a
Kronecker product of two or more matrices. Formally, given $A_i \in \mathbb{R}^{n_i \times d_i}$ for …
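
A structure-exploiting baseline for Kronecker product regression, which the cited paper accelerates with sketching, can be written without ever forming $A_1 \otimes A_2$; the helper below is a hypothetical illustration of that identity, not the paper's algorithm.

```python
import numpy as np

def kron_lstsq(A1, A2, b):
    """Solve min_x ||(A1 kron A2) x - b||_2 without forming the Kronecker
    product, via the identity (A1 kron A2) vec(X) = vec(A2 @ X @ A1.T)
    with column-major vec. Reference baseline only, not a sketching method."""
    n1, d1 = A1.shape
    n2, d2 = A2.shape
    # Reshape the right-hand side to match the column-major vec convention.
    B = b.reshape(n2, n1, order="F")
    # Min-norm least-squares solution X = pinv(A2) @ B @ pinv(A1).T.
    X = np.linalg.pinv(A2) @ B @ np.linalg.pinv(A1).T
    return X.flatten(order="F")
```

For small sizes this agrees with `np.linalg.lstsq(np.kron(A1, A2), b)`, while only touching the factors.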

Generalized leverage score sampling for neural networks

JD Lee, R Shen, Z Song… - Advances in Neural …, 2020 - proceedings.neurips.cc
Leverage score sampling is a powerful technique that originates from theoretical computer
science and can be used to speed up a large number of fundamental problems, e.g., linear …
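
The classical form of leverage score sampling that the snippet alludes to can be sketched as follows for overdetermined least squares; the sample size m and helper names are illustrative, and the paper's generalization to neural networks is not reproduced here.

```python
import numpy as np

def leverage_scores(A):
    """Row leverage scores of A via a thin QR; l_i = ||Q[i, :]||^2."""
    Q, _ = np.linalg.qr(A, mode="reduced")
    return np.sum(Q ** 2, axis=1)

def leverage_sample_lstsq(A, b, m, rng=None):
    """Sample m rows with probability proportional to their leverage
    scores, reweight them, and solve the smaller least-squares problem."""
    rng = np.random.default_rng(rng)
    p = leverage_scores(A)
    p = p / p.sum()
    idx = rng.choice(A.shape[0], size=m, replace=True, p=p)
    # Rescale sampled rows so the sketched objective is unbiased.
    w = 1.0 / np.sqrt(m * p[idx])
    x, *_ = np.linalg.lstsq(w[:, None] * A[idx], w * b[idx], rcond=None)
    return x
```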

Fixed-rank approximation of a positive-semidefinite matrix from streaming data

JA Tropp, A Yurtsever, M Udell… - Advances in Neural …, 2017 - proceedings.neurips.cc
Several important applications, such as streaming PCA and semidefinite programming,
involve a large-scale positive-semidefinite (psd) matrix that is presented as a sequence of …
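
A simplified reading of the streaming setting is sketched below: only a sketch Y = A @ Omega is stored, each linear update to A is folded into Y, and a fixed-rank psd approximation is recovered at the end. The shift and truncation details are a plausible rendering of a Nystrom-style recovery, not the paper's reference code.

```python
import numpy as np

class StreamingPSDSketch:
    """Minimal sketch of a Nystrom-style streaming approximation of an
    N x N psd matrix presented as a sequence of linear updates."""

    def __init__(self, n, sketch_size, rng=None):
        rng = np.random.default_rng(rng)
        self.Omega = rng.standard_normal((n, sketch_size))
        self.Y = np.zeros((n, sketch_size))

    def update(self, H):
        # Fold one linear update A <- A + H from the stream into the sketch.
        self.Y = self.Y + H @ self.Omega

    def fixed_rank_approx(self, r):
        # Small shift for numerical stability of the Cholesky step.
        nu = np.finfo(float).eps * np.linalg.norm(self.Y)
        Ynu = self.Y + nu * self.Omega
        B = self.Omega.T @ Ynu
        L = np.linalg.cholesky((B + B.T) / 2.0)
        E = np.linalg.solve(L, Ynu.T).T          # E @ E.T is the Nystrom approximation
        U, s, _ = np.linalg.svd(E, full_matrices=False)
        lam = np.maximum(s[:r] ** 2 - nu, 0.0)   # undo the shift, keep psd
        return U[:, :r], lam                     # A ~ U @ np.diag(lam) @ U.T
```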

Low-rank approximation with 1/ε^{1/3} matrix-vector products

A Bakshi, KL Clarkson, DP Woodruff - … of the 54th Annual ACM SIGACT …, 2022 - dl.acm.org
We study iterative methods based on Krylov subspaces for low-rank approximation under
any Schatten-p norm. Here, given access to a matrix A through matrix-vector products, an …
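
A plain block Krylov iteration in the matrix-vector product model, the access model studied here, might be sketched as follows; the depth q is an illustrative choice rather than the paper's tuned iteration count.

```python
import numpy as np

def block_krylov_subspace(matvec, rmatvec, n, k, q=4, rng=None):
    """Block Krylov iteration touching A only through matvec (x -> A x)
    and rmatvec (y -> A.T y), using k * (2 * q - 1) products. Returns an
    orthonormal basis Q; truncating the SVD of Q.T @ A (computed with
    k * q extra rmatvecs) to rank k then gives the low-rank approximation."""
    rng = np.random.default_rng(rng)
    X = rng.standard_normal((n, k))
    Y = np.column_stack([matvec(X[:, j]) for j in range(k)])       # A X
    blocks = [Y]
    for _ in range(q - 1):
        Z = np.column_stack([rmatvec(Y[:, j]) for j in range(k)])  # A.T Y
        Y = np.column_stack([matvec(Z[:, j]) for j in range(k)])   # A Z
        blocks.append(Y)                       # (A A.T)^i A X
    K = np.column_stack(blocks)
    Q, _ = np.linalg.qr(K)                     # orthonormal Krylov basis
    return Q
```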

Quantum-inspired algorithms from randomized numerical linear algebra

N Chepurko, K Clarkson, L Horesh… - International …, 2022 - proceedings.mlr.press
We create classical (non-quantum) dynamic data structures supporting queries for
recommender systems and least-squares regression that are comparable to their quantum …
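
The core classical primitive in this line of work is sampling rows with probability proportional to their squared norms, maintained under updates; a hypothetical dynamic version is sketched below and is not the paper's exact data structure.

```python
import numpy as np

class RowNormSampler:
    """Maintain squared row norms in a Fenwick tree so rows can be updated
    and sampled with probability ||a_i||^2 / sum_j ||a_j||^2 in
    polylogarithmic time. Simplified illustration only."""

    def __init__(self, n):
        self.n = n
        self.tree = np.zeros(n + 1)      # Fenwick tree over squared norms
        self.weights = np.zeros(n)

    def update(self, i, row):
        # Replace row i; adjust the tree by the change in its squared norm.
        w = float(np.dot(row, row))
        delta = w - self.weights[i]
        self.weights[i] = w
        j = i + 1
        while j <= self.n:
            self.tree[j] += delta
            j += j & (-j)

    def _prefix(self, i):
        # Sum of squared norms of rows 0 .. i-1.
        s, j = 0.0, i
        while j > 0:
            s += self.tree[j]
            j -= j & (-j)
        return s

    def sample(self, rng=None):
        # Draw a row index with probability proportional to its squared norm.
        rng = np.random.default_rng(rng)
        target = rng.uniform(0.0, self._prefix(self.n))
        lo, hi = 1, self.n               # smallest i with prefix(i) > target
        while lo < hi:
            mid = (lo + hi) // 2
            if self._prefix(mid) > target:
                hi = mid
            else:
                lo = mid + 1
        return lo - 1
```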

Krylov methods are (nearly) optimal for low-rank approximation

A Bakshi, S Narayanan - 2023 IEEE 64th Annual Symposium …, 2023 - ieeexplore.ieee.org
We consider the problem of rank-1 low-rank approximation (LRA) in the matrix-vector
product model under various Schatten norms: $\min_{\|u\|_2 = 1} \bigl\| A (I - uu^\top) \bigr\|$ …

Recent and upcoming developments in randomized numerical linear algebra for machine learning

M Dereziński, MW Mahoney - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
Large matrices arise in many machine learning and data analysis applications, including as
representations of datasets, graphs, model weights, and first- and second-order derivatives …

Hardness of low rank approximation of entrywise transformed matrix products

T Sarlos, X Song, D Woodruff… - Advances in Neural …, 2024 - proceedings.neurips.cc
Inspired by fast algorithms in natural language processing, we study low rank approximation
in the entrywise transformed setting where we want to find a good rank $ k $ approximation …