C Musco, C Musco - arXiv preprint arXiv:2004.08434, 2020 - arxiv.org
In this note we illustrate how common matrix approximation methods, such as random projection and random sampling, yield projection-cost-preserving sketches, as introduced in …
In this paper we consider the problem of recovering a low-rank Tucker approximation to a massive tensor based solely on structured random compressive measurements. Crucially …
Tensor representations of data hold great promise: as data grows in both dimensionality and number of modes, it becomes increasingly advantageous to employ …
Matrices are ubiquitous mathematical structures that arise throughout computer science. We study fast algorithms for several central problems involving matrices, including eigenvalue …
Recently, in statistics and machine learning, Randomization in Numerical Linear Algebra (RandNLA) has emerged not only as a vital new tool for designing fast and …
Let us restate the theorem. Recapping notation, let A ∈ ℝ^{n×d} be an input matrix. Let a_1, …, a_n denote its rows. Let k be the target low rank, and let ε > 0 be an error parameter. Let S_0 ∈ …
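To make the setup concrete, here is a minimal illustrative sketch in NumPy, not the theorem's exact construction: a Gaussian random projection is applied to the rows of A ∈ ℝ^{n×d}, producing a much smaller matrix whose best rank-k approximation cost roughly tracks that of A. The dimensions, the sketch size m, and the helper `rank_k_cost` are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 50, 5  # n rows a_1, ..., a_n in R^d; target rank k

# Illustrative input: a rank-k matrix plus small noise.
A = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))
A += 0.01 * rng.standard_normal((n, d))

# Sketch size m grows with k and 1/eps; m = O(k/eps) is a typical scaling.
eps = 0.5
m = int(4 * k / eps)

# S = G A / sqrt(m): each row of S is a random combination of the rows of A.
G = rng.standard_normal((m, n))
S = (G @ A) / np.sqrt(m)


def rank_k_cost(M, k):
    # Squared Frobenius-norm error of the best rank-k approximation of M,
    # i.e. the sum of its squared singular values beyond the k-th.
    s = np.linalg.svd(M, compute_uv=False)
    return float(np.sum(s[k:] ** 2))


cost_A = rank_k_cost(A, k)
cost_S = rank_k_cost(S, k)
print(S.shape, cost_A, cost_S)
```

Here S has only m rows instead of n, yet its residual cost beyond rank k stays on the same order as A's, which is the kind of cost-preservation property the surrounding text is concerned with.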