Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions

N Halko, PG Martinsson, JA Tropp - SIAM Review, 2011 - SIAM
Low-rank matrix approximations, such as the truncated singular value decomposition and
the rank-revealing QR decomposition, play a central role in data analysis and scientific …
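
A minimal NumPy sketch of the randomized range-finder idea behind this line of work; the function name, oversampling parameter, and Gaussian test matrix are illustrative choices, not the paper's reference implementation.

    import numpy as np

    def randomized_svd(A, rank, oversample=10, seed=0):
        """Approximate truncated SVD via a randomized range finder."""
        rng = np.random.default_rng(seed)
        # Sample the range of A with a Gaussian test matrix, then orthonormalize.
        Omega = rng.standard_normal((A.shape[1], rank + oversample))
        Q, _ = np.linalg.qr(A @ Omega)
        # Project A onto the captured range and factor the small matrix.
        B = Q.T @ A
        Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
        return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank, :]

    rng = np.random.default_rng(1)
    A = rng.standard_normal((500, 80)) @ rng.standard_normal((80, 300))  # rank <= 80
    U, s, Vt = randomized_svd(A, rank=80)
    print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))  # near machine precision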

Literature survey on low rank approximation of matrices

N Kishore Kumar, J Schneider - Linear and Multilinear Algebra, 2017 - Taylor & Francis
Low-rank approximation of matrices has been well studied in the literature. Singular value
decomposition, QR decomposition with column pivoting, rank-revealing QR factorization …
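
The classical tools this survey starts from can be exercised directly with NumPy/SciPy; the rank-5 test matrix below is an illustrative example, not taken from the survey.

    import numpy as np
    from scipy.linalg import qr

    rng = np.random.default_rng(0)
    # Matrix of numerical rank 5, plus a small perturbation.
    A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
    A += 1e-8 * rng.standard_normal(A.shape)

    # QR with column pivoting: the diagonal of R drops sharply after the numerical rank.
    Q, R, piv = qr(A, mode='economic', pivoting=True)
    print(np.abs(np.diag(R))[:8])

    # Truncated SVD: the best rank-5 approximation in the Frobenius norm.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A5 = (U[:, :5] * s[:5]) @ Vt[:5, :]
    print(np.linalg.norm(A - A5))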

Randomized numerical linear algebra: Foundations and algorithms

PG Martinsson, JA Tropp - Acta Numerica, 2020 - cambridge.org
This survey describes probabilistic algorithms for linear algebraic computations, such as
factorizing matrices and solving linear systems. It focuses on techniques that have a proven …
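
A hedged sketch-and-solve example for overdetermined least squares, one of the computations such probabilistic algorithms target; the Gaussian sketch and the sketch size are illustrative choices rather than the survey's recommendations.

    import numpy as np

    def sketch_and_solve(A, b, sketch_rows, seed=0):
        """Approximately solve min ||Ax - b||_2 via the sketched problem min ||SAx - Sb||_2."""
        rng = np.random.default_rng(seed)
        S = rng.standard_normal((sketch_rows, A.shape[0])) / np.sqrt(sketch_rows)
        x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
        return x

    rng = np.random.default_rng(1)
    A = rng.standard_normal((20000, 50))
    b = rng.standard_normal(20000)
    x_sk = sketch_and_solve(A, b, sketch_rows=500)
    x_ex, *_ = np.linalg.lstsq(A, b, rcond=None)
    # The sketched residual is within a small factor of the optimal residual.
    print(np.linalg.norm(A @ x_sk - b) / np.linalg.norm(A @ x_ex - b))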

Sketching as a tool for numerical linear algebra

DP Woodruff - … and Trends® in Theoretical Computer Science, 2014 - nowpublishers.com
This survey highlights the recent advances in algorithms for numerical linear algebra that
have come from the technique of linear sketching, whereby given a matrix, one first …
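
A toy illustration of linear sketching, assuming a dense Gaussian sketch: the tall matrix A is compressed once to S @ A, and later work only touches the sketch. The matrix sizes are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20000, 20))             # tall data matrix
    k = 200                                          # sketch size (illustrative)
    S = rng.standard_normal((k, A.shape[0])) / np.sqrt(k)
    SA = S @ A                                       # all further work touches only SA

    # Column norms of the sketch are close to those of the original matrix.
    ratios = np.linalg.norm(SA, axis=0) / np.linalg.norm(A, axis=0)
    print(ratios.min(), ratios.max())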

[BOOK][B] Machine learning: a Bayesian and optimization perspective

S Theodoridis - 2015 - books.google.com
This tutorial text gives a unifying perspective on machine learning by covering both
probabilistic and deterministic approaches, which are based on optimization techniques …

Low-rank approximation and regression in input sparsity time

KL Clarkson, DP Woodruff - Journal of the ACM (JACM), 2017 - dl.acm.org
We design a new distribution over m × n matrices S so that, for any fixed n × d matrix A of rank
r, with probability at least 9/10, ∥SAx∥₂ = (1 ± ε)∥Ax∥₂ simultaneously for all x ∈ R^d …
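
A small NumPy sketch of a sparse embedding in the CountSketch style, which can be applied in time proportional to the number of nonzeros of A; the sketch size and the empirical check below are illustrative, not the paper's stated bounds.

    import numpy as np

    def countsketch(A, sketch_rows, seed=0):
        """Apply a sparse embedding matrix S to A: each row of A is hashed to one
        row of the sketch and added with a random sign."""
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        rows = rng.integers(0, sketch_rows, size=n)   # hash bucket for each input row
        signs = rng.choice([-1.0, 1.0], size=n)
        SA = np.zeros((sketch_rows, A.shape[1]))
        np.add.at(SA, rows, signs[:, None] * A)       # scatter-add signed rows into buckets
        return SA

    rng = np.random.default_rng(1)
    A = rng.standard_normal((50000, 10))              # n x d, full column rank
    SA = countsketch(A, sketch_rows=2000)

    # Empirical check of ||SAx|| ≈ (1 ± ε)||Ax|| on random directions x.
    X = rng.standard_normal((10, 100))
    ratios = np.linalg.norm(SA @ X, axis=0) / np.linalg.norm(A @ X, axis=0)
    print(ratios.min(), ratios.max())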

Randomized algorithms for matrices and data

MW Mahoney - Foundations and Trends® in Machine …, 2011 - nowpublishers.com
Randomized algorithms for very large matrix problems have received a great deal of
attention in recent years. Much of this work was motivated by problems in large-scale data …

Turning Big Data Into Tiny Data: Constant-Size Coresets for k-Means, PCA, and Projective Clustering

D Feldman, M Schmidt, C Sohler - SIAM Journal on Computing, 2020 - SIAM
We develop and analyze a method to reduce the size of a very large set of data points in a
high-dimensional Euclidean space R^d to a small set of weighted points such that the result …
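
As a rough illustration of the weighted-points idea (and not the constant-size construction analyzed in the paper), the following importance-sampling sketch draws a small weighted subset whose k-means cost tracks the full cost for a fixed candidate solution; the sampling rule, names, and sizes are all illustrative.

    import numpy as np

    def weighted_sample(X, m, seed=0):
        """Sample m weighted points by importance sampling on distance to the mean."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        d2 = np.sum((X - X.mean(axis=0)) ** 2, axis=1)
        q = 0.5 / n + 0.5 * d2 / d2.sum()             # mix of uniform and distance-based
        idx = rng.choice(n, size=m, replace=True, p=q)
        return X[idx], 1.0 / (m * q[idx])             # unbiased importance weights

    def kmeans_cost(X, centers, w=None):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).min(axis=1)
        return d2.sum() if w is None else (w * d2).sum()

    rng = np.random.default_rng(1)
    X = rng.standard_normal((100000, 5)) + rng.integers(0, 3, size=(100000, 1)) * 4.0
    C, w = weighted_sample(X, m=1000)
    centers = rng.standard_normal((3, 5))             # any fixed candidate solution
    print(kmeans_cost(X, centers), kmeans_cost(C, centers, w))  # the two costs are close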

Dimensionality reduction for k-means clustering and low rank approximation

MB Cohen, S Elder, C Musco, C Musco… - Proceedings of the forty …, 2015 - dl.acm.org
We show how to approximate a data matrix A with a much smaller sketch Ã that can be
used to solve a general class of constrained k-rank approximation problems to within (1 + ε) …
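
An illustrative NumPy experiment in the same spirit, assuming a simple Gaussian feature-space projection rather than the paper's specific sketches: the k-means cost of any fixed partition computed on the reduced matrix stays close to the cost on the original.

    import numpy as np

    def partition_cost(X, labels, k):
        """k-means cost of a partition: squared distances to the cluster centroids."""
        cost = 0.0
        for j in range(k):
            P = X[labels == j]
            if len(P):
                cost += ((P - P.mean(axis=0)) ** 2).sum()
        return cost

    rng = np.random.default_rng(0)
    k, n, d = 5, 10000, 500
    A = np.repeat(rng.standard_normal((k, d)) * 3, n // k, axis=0) + rng.standard_normal((n, d))

    m = 50                                           # reduced dimension (illustrative choice)
    G = rng.standard_normal((d, m)) / np.sqrt(m)     # random projection of the feature space
    A_sketch = A @ G                                 # the much smaller n x m sketch

    labels = rng.integers(0, k, size=n)              # any candidate clustering
    print(partition_cost(A_sketch, labels, k) / partition_cost(A, labels, k))  # close to 1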

Hutch++: Optimal stochastic trace estimation

RA Meyer, C Musco, C Musco, DP Woodruff - Symposium on Simplicity in …, 2021 - SIAM
We study the problem of estimating the trace of a matrix A that can only be accessed through
matrix-vector multiplication. We introduce a new randomized algorithm, Hutch++, which …
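
A compact NumPy sketch of the deflate-then-Hutchinson idea, using only matrix-vector products with A; the split of the query budget and the helper names are illustrative rather than the paper's exact prescription.

    import numpy as np

    def hutchpp(matvec, n, num_queries, seed=0):
        """Trace estimate: capture the dominant eigendirections with a sketched basis Q,
        compute that part exactly, and run plain Hutchinson on the deflated remainder."""
        rng = np.random.default_rng(seed)
        k = num_queries // 3
        S = rng.choice([-1.0, 1.0], size=(n, k))          # Rademacher sketch
        Q, _ = np.linalg.qr(matvec(S))                    # basis for the dominant range
        G = rng.choice([-1.0, 1.0], size=(n, k))
        G = G - Q @ (Q.T @ G)                             # project test vectors away from range(Q)
        t_top = np.trace(Q.T @ matvec(Q))                 # exact trace on the captured subspace
        t_rest = np.trace(G.T @ matvec(G)) / k            # Hutchinson on the remainder
        return t_top + t_rest

    # Example: a matrix with fast-decaying eigenvalues, accessed only through matvecs.
    rng = np.random.default_rng(1)
    n = 1000
    U, _ = np.linalg.qr(rng.standard_normal((n, n)))
    eigs = 1.0 / np.arange(1, n + 1) ** 2
    A = (U * eigs) @ U.T
    print(np.trace(A), hutchpp(lambda X: A @ X, n, num_queries=60))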