KoPA: Automated Kronecker product approximation

C Cai, R Chen, H Xiao - Journal of Machine Learning Research, 2022 - jmlr.org
We consider the problem of matrix approximation and denoising induced by the Kronecker
product decomposition. Specifically, we propose to approximate a given matrix by the sum of …
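The classical building block behind such sum-of-Kronecker-products approximations is the nearest Kronecker product problem: a rearrangement of the target matrix turns the Kronecker factors into a rank-one problem solvable by SVD (Van Loan's trick). A minimal NumPy sketch of that building block, assuming known factor shapes; this is not the paper's KoPA procedure, which automates the configuration choice:

```python
import numpy as np

def nearest_kronecker(M, shape_A, shape_B):
    """Best single-term Kronecker approximation M ≈ A ⊗ B via rearrangement.

    shape_A = (p, q), shape_B = (r, s), with M of shape (p*r, q*s).
    """
    p, q = shape_A
    r, s = shape_B
    # Rearrange M so that A ⊗ B becomes the rank-one matrix vec(A) vec(B)^T.
    R = M.reshape(p, r, q, s).transpose(0, 2, 1, 3).reshape(p * q, r * s)
    U, S, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(S[0]) * U[:, 0].reshape(p, q)
    B = np.sqrt(S[0]) * Vt[0].reshape(r, s)
    return A, B

# An exact Kronecker product is recovered (up to scaling of the factors).
A0 = np.array([[1.0, 2.0], [3.0, 4.0]])
B0 = np.array([[0.5, 1.0], [1.5, 2.0]])
M = np.kron(A0, B0)
A, B = nearest_kronecker(M, A0.shape, B0.shape)
```

Keeping further SVD components of the rearranged matrix yields the sum-of-Kronecker-products approximations the abstract refers to.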

Matrix norm estimation from a few entries

A Khetan, S Oh - Advances in Neural Information …, 2017 - proceedings.neurips.cc
Singular values of data in matrix form provide insights into the structure of the data, its
effective dimensionality, and the choice of hyper-parameters for higher-level data analysis …
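As a point of reference for what this line of work improves on, the naive plug-in approach rescales the observed entries by the inverse sampling probability and reads norms off the rescaled matrix. The paper's estimator is considerably more refined; the snippet below is only an illustrative baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-3 ground truth; each entry is observed only with probability p.
n, k, p = 200, 3, 0.5
M = rng.standard_normal((n, k)) @ rng.standard_normal((k, n))
mask = rng.random((n, n)) < p
M_obs = np.where(mask, M / p, 0.0)   # unbiased: E[M_obs] = M

# Plug-in estimate of the spectral norm from the rescaled sample.
est = np.linalg.norm(M_obs, 2)
true = np.linalg.norm(M, 2)
rel_err = abs(est - true) / true
```

For incoherent low-rank matrices the sampling noise is spread out spectrally, so the plug-in spectral-norm estimate is reasonable; estimating general Schatten norms this way degrades quickly, which motivates the entry-efficient estimators studied in the paper.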

Learning-based low-rank approximations

P Indyk, A Vakilian, Y Yuan - Advances in Neural …, 2019 - proceedings.neurips.cc
We introduce a “learning-based” algorithm for the low-rank decomposition problem: given
an $n \times d$ matrix $A$ and a parameter $k$, compute a rank-$k$ matrix $A'$ that …
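The classical pipeline this work learns over: compress $A$ with a sketch matrix $S$, then take the top right singular directions of $SA$. In the sketch below $S$ is plain Gaussian; the paper's contribution is training $S$ on data, which this snippet does not attempt:

```python
import numpy as np

rng = np.random.default_rng(1)

def sketched_low_rank(A, k, m):
    """Rank-k approximation of A computed from an m-row random sketch S @ A.

    The learned variant replaces the random S with a trained sketch matrix.
    """
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)
    SA = S @ A                       # small m x d sketch of A
    _, _, Vt = np.linalg.svd(SA, full_matrices=False)
    V = Vt[:k].T                     # top-k right singular directions of the sketch
    return (A @ V) @ V.T             # project A onto that subspace

n, d, k = 300, 60, 5
A = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))  # exactly rank k
A_k = sketched_low_rank(A, k, m=4 * k)
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
```

On an exactly rank-$k$ input the sketch's row space captures the whole row space of $A$, so the projection recovers $A$ to machine precision; on noisy inputs the quality depends on the sketch, which is where learning enters.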

Deterministic column sampling for low-rank matrix approximation: Nyström vs. incomplete Cholesky decomposition

R Patel, T Goldstein, E Dyer, A Mirhoseini… - Proceedings of the 2016 …, 2016 - SIAM
Kernel matrices that encode the distance (or similarity) between data points are widely used
throughout the computational sciences for classification, clustering, and dimensionality …
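One concrete deterministic rule in this comparison is diagonal pivoting, the selection rule behind pivoted incomplete Cholesky: at each step, pick the column whose residual diagonal is largest. A minimal NumPy sketch (the helper name is ours, not the paper's); the resulting partial factor coincides with the Nyström approximation built on the chosen pivot columns:

```python
import numpy as np

rng = np.random.default_rng(2)

def greedy_landmarks(K, m):
    """Pick m columns of a PSD matrix K by pivoting on the largest residual
    diagonal, as in pivoted incomplete Cholesky. Returns the pivot indices
    and a partial factor G with K ≈ G @ G.T."""
    n = K.shape[0]
    d = np.diag(K).astype(float)        # residual diagonal (Schur complement)
    G = np.zeros((n, m))
    idx = []
    for j in range(m):
        i = int(np.argmax(d))
        idx.append(i)
        G[:, j] = (K[:, i] - G[:, :j] @ G[i, :j]) / np.sqrt(d[i])
        d = np.maximum(d - G[:, j] ** 2, 0.0)
    return idx, G

# RBF kernel matrix on 100 one-dimensional points.
x = np.sort(rng.standard_normal(100))
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
idx, G = greedy_landmarks(K, 15)
rel_err = np.linalg.norm(K - G @ G.T) / np.linalg.norm(K)
```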

Improving CUR matrix decomposition and the Nyström approximation via adaptive sampling

S Wang, Z Zhang - The Journal of Machine Learning Research, 2013 - jmlr.org
The CUR matrix decomposition and the Nyström approximation are two important low-rank
matrix approximation techniques. The Nyström method approximates a symmetric positive …
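For concreteness, a bare-bones CUR with squared-norm column/row sampling and the least-squares-optimal middle factor; the paper's adaptive sampling refines exactly the sampling step that is idealized here:

```python
import numpy as np

rng = np.random.default_rng(3)

def cur(A, c, r):
    """Basic CUR decomposition: sample c columns and r rows with probability
    proportional to their squared norms, then fit the middle factor U."""
    pc = (A ** 2).sum(0); pc = pc / pc.sum()
    pr = (A ** 2).sum(1); pr = pr / pr.sum()
    cols = rng.choice(A.shape[1], size=c, replace=False, p=pc)
    rows = rng.choice(A.shape[0], size=r, replace=False, p=pr)
    C, R = A[:, cols], A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)   # optimal U given C and R
    return C, U, R

# On an exactly rank-4 matrix, 10 sampled rows/columns almost surely span the
# row and column spaces, so C @ U @ R reproduces A.
A = rng.standard_normal((80, 4)) @ rng.standard_normal((4, 60))
C, U, R = cur(A, c=10, r=10)
rel_err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```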

Optimal sketching for Kronecker product regression and low rank approximation

H Diao, R Jayaram, Z Song, W Sun… - Advances in neural …, 2019 - proceedings.neurips.cc
We study the Kronecker product regression problem, in which the design matrix is a
Kronecker product of two or more matrices. Formally, given $A_i \in \mathbb{R}^{n_i \times d_i}$ for …
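The structure being exploited: the Kronecker product never needs to be materialized, since $(A_1 \otimes A_2)^+ = A_1^+ \otimes A_2^+$ and $(A_1 \otimes A_2)\,\mathrm{vec}(X) = \mathrm{vec}(A_2 X A_1^\top)$ (column-major vec). The paper adds sketching speedups on top of this; the sketch below shows only the baseline identity:

```python
import numpy as np

rng = np.random.default_rng(4)

def kron_lstsq(A1, A2, b):
    """Solve min_x ||(A1 ⊗ A2) x - b|| without forming the Kronecker product,
    using (A1 ⊗ A2)^+ = A1^+ ⊗ A2^+ and the vec identity (column-major vec)."""
    n1, d1 = A1.shape
    n2, d2 = A2.shape
    B = b.reshape(n1, n2).T                 # un-vec: B is n2 x n1
    X = np.linalg.pinv(A2) @ B @ np.linalg.pinv(A1).T
    return X.T.reshape(d1 * d2)             # re-vec the d2 x d1 solution

A1 = rng.standard_normal((8, 3))
A2 = rng.standard_normal((7, 4))
b = rng.standard_normal(8 * 7)
x_fast = kron_lstsq(A1, A2, b)
x_direct, *_ = np.linalg.lstsq(np.kron(A1, A2), b, rcond=None)
```

The direct solve costs time polynomial in $n_1 n_2$, while the factored solve works with the small $A_i$ only, which is the gap the paper's sketching results then widen further.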

Fundamental limits for rank-one matrix estimation with groupwise heteroskedasticity

JK Behne, G Reeves - International Conference on Artificial …, 2022 - proceedings.mlr.press
Low-rank matrix recovery problems involving high-dimensional and heterogeneous data
appear in applications throughout statistics and machine learning. The contribution of this …
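The observation model in question is a rank-one spike seen through noise whose variance differs across groups of rows. A toy instance with a vanilla SVD estimate; the paper characterizes the fundamental limits of such estimation, which this snippet only illustrates empirically:

```python
import numpy as np

rng = np.random.default_rng(5)

n, snr = 400, 8.0
u = rng.standard_normal(n); u /= np.linalg.norm(u)
v = rng.standard_normal(n); v /= np.linalg.norm(v)

# Groupwise heteroskedasticity: two row groups with different noise levels.
sigma = np.where(np.arange(n) < n // 2, 0.5, 1.5)[:, None]
Y = snr * np.outer(u, v) + sigma * rng.standard_normal((n, n)) / np.sqrt(n)

# Naive estimate of the spike direction: leading left singular vector of Y.
U, S, Vt = np.linalg.svd(Y, full_matrices=False)
overlap = abs(U[:, 0] @ u)   # alignment of the estimate with the truth
```

Above the detection threshold the overlap is close to one; the paper quantifies how the groupwise variance profile shifts that threshold and the attainable overlap.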

Simultaneous pursuit of sparseness and rank structures for matrix decomposition

Q Yan, J Ye, X Shen - The Journal of Machine Learning Research, 2015 - jmlr.org
In multi-response regression, pursuit of two different types of structures is essential to battle
the curse of dimensionality. In this paper, we seek a sparsest decomposition representation …
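One simple way to pursue both structures at once, in the spirit of (though much cruder than) the paper's formulation: alternate power iterations with soft-thresholding so the rank-one factors come out sparse. A toy sketch on a planted sparse rank-one coefficient matrix:

```python
import numpy as np

rng = np.random.default_rng(6)

def soft(x, t):
    """Soft-thresholding: shrinks entries toward zero, inducing sparsity."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_rank1(C, lam, iters=50):
    """Rank-one factors of C with soft-thresholded (hence sparse) directions."""
    _, _, Vt = np.linalg.svd(C, full_matrices=False)
    v = Vt[0]                                       # warm start from plain SVD
    u = np.zeros(C.shape[0])
    for _ in range(iters):
        u = soft(C @ v, lam);   u /= np.linalg.norm(u) + 1e-12
        v = soft(C.T @ u, lam); v /= np.linalg.norm(v) + 1e-12
    return u, v

# Planted sparse rank-one signal plus noise.
u0 = np.zeros(50); u0[:5] = 1.0; u0 /= np.linalg.norm(u0)
v0 = np.zeros(40); v0[:5] = 1.0; v0 /= np.linalg.norm(v0)
C = 5.0 * np.outer(u0, v0) + 0.1 * rng.standard_normal((50, 40))
u, v = sparse_rank1(C, lam=0.1)
```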

Multi-scale Nyström method

W Lim, R Du, B Dai, K Jung, L Song… - International …, 2018 - proceedings.mlr.press
Kernel methods are powerful tools for modeling nonlinear data. However, the amount of
computation and memory required for kernel methods becomes the bottleneck when dealing …
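The bottleneck the snippet refers to, made concrete: an explicit Nyström feature map stores an $n \times m$ factor $\Phi$ with $K \approx \Phi\Phi^\top$ instead of the full $n \times n$ kernel. A single-scale sketch (the paper's method combines such approximations across scales, which is not shown here):

```python
import numpy as np

rng = np.random.default_rng(8)

def rbf(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

x = rng.standard_normal(500)
idx = rng.choice(500, size=40, replace=False)   # m = 40 landmark points

C = rbf(x, x[idx])                  # n x m cross-kernel
W = rbf(x[idx], x[idx])             # m x m landmark kernel
evals, evecs = np.linalg.eigh(W)
keep = evals > 1e-10                # drop numerically null directions
Phi = C @ (evecs[:, keep] / np.sqrt(evals[keep]))   # n x m' feature map

# Full kernel built here only to measure the approximation error.
K = rbf(x, x)
rel_err = np.linalg.norm(K - Phi @ Phi.T) / np.linalg.norm(K)
```

Downstream kernel algorithms can then operate on $\Phi$ directly, reducing memory from $O(n^2)$ to $O(nm)$ and matrix-vector costs accordingly.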

Regularization of the kernel matrix via covariance matrix shrinkage estimation

T Lancewicki - arXiv preprint arXiv:1707.06156, 2017 - arxiv.org
The kernel trick concept, formulated as an inner product in a feature space, facilitates
powerful extensions to many well-known algorithms. While the kernel matrix involves inner …
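A generic instance of the idea is shrinking the kernel matrix toward a scaled identity, the standard Ledoit–Wolf-style target from covariance shrinkage. The paper derives the shrinkage intensity from covariance estimation rather than fixing it by hand, so treat `alpha` below as a placeholder:

```python
import numpy as np

rng = np.random.default_rng(7)

def shrink_kernel(K, alpha):
    """Shrink a kernel matrix toward mu*I with mu = tr(K)/n.

    For alpha in (0, 1] the result is strictly better conditioned than K,
    which regularizes downstream solves involving the kernel matrix.
    """
    n = K.shape[0]
    mu = np.trace(K) / n
    return (1 - alpha) * K + alpha * mu * np.eye(n)

# RBF kernel on 30 two-dimensional points; such matrices are often
# numerically ill-conditioned.
x = rng.standard_normal((30, 2))
K = np.exp(-0.5 * ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
K_reg = shrink_kernel(K, alpha=0.1)
```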