A Khetan, S Oh - Advances in Neural Information …, 2017 - proceedings.neurips.cc
Singular values of data in matrix form provide insight into the structure of the data, its effective dimensionality, and the choice of hyper-parameters for higher-level data analysis …
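A minimal numpy sketch of the idea in this snippet: the singular-value spectrum of a data matrix reveals its effective dimensionality. The 90%-energy cutoff and the synthetic rank-5 matrix below are illustrative assumptions, not details from the cited paper.

```python
import numpy as np

# Illustrative only: inspect the singular-value spectrum of a matrix to
# gauge its effective dimensionality. The 90%-energy cutoff is an
# arbitrary demonstration choice, not a rule from the cited paper.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 50))  # rank-5 signal
A += 0.01 * rng.standard_normal((500, 50))                        # small noise

s = np.linalg.svd(A, compute_uv=False)        # singular values, descending
energy = np.cumsum(s**2) / np.sum(s**2)       # cumulative spectral energy
effective_rank = int(np.searchsorted(energy, 0.90)) + 1
print(effective_rank)                         # close to 5 for this construction
```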
We introduce a “learning-based” algorithm for the low-rank decomposition problem: given an $n \times d$ matrix $A$ and a parameter $k$, compute a rank-$k$ matrix $A'$ that …
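For context, a short sketch of the classical baseline for the problem this snippet states: the best rank-$k$ approximation of $A$ in Frobenius norm via truncated SVD (Eckart–Young). This is not the paper's learning-based algorithm; shapes and data are illustrative.

```python
import numpy as np

# Classical reference solution for the rank-k decomposition task:
# truncated SVD gives the Frobenius-optimal rank-k approximation.
def rank_k_approx(A: np.ndarray, k: int) -> np.ndarray:
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

A = np.random.default_rng(1).standard_normal((100, 40))
A2 = rank_k_approx(A, k=10)
print(np.linalg.matrix_rank(A2))  # 10
```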
Kernel matrices that encode the distance (or similarity) between data points are widely used throughout the computational sciences for classification, clustering, and dimensionality …
S Wang, Z Zhang - The Journal of Machine Learning Research, 2013 - jmlr.org
The CUR matrix decomposition and the Nyström approximation are two important low-rank matrix approximation techniques. The Nyström method approximates a symmetric positive …
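A minimal sketch of the Nyström approximation mentioned here, assuming uniform column sampling and an RBF kernel (both illustrative choices, not the paper's specific sampling scheme): sample $c$ landmark columns $C = K_{:,S}$ and the core block $W = K_{S,S}$, then approximate $K \approx C W^{+} C^{\top}$.

```python
import numpy as np

# Nyström sketch for a symmetric positive semidefinite kernel matrix K:
# approximate K by C @ pinv(W) @ C.T using c uniformly sampled columns.
rng = np.random.default_rng(2)
X = rng.standard_normal((300, 10))
K = np.exp(-0.5 * np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1))  # RBF kernel

c = 50                                           # number of sampled columns
S = rng.choice(K.shape[0], size=c, replace=False)
C = K[:, S]                                      # sampled columns
W = K[np.ix_(S, S)]                              # core block
K_nys = C @ np.linalg.pinv(W) @ C.T

print(np.linalg.norm(K - K_nys) / np.linalg.norm(K))  # relative error
```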
We study the Kronecker product regression problem, in which the design matrix is a Kronecker product of two or more matrices. Formally, given $A_i \in \mathbb{R}^{n_i \times d_i}$ for …
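A sketch of why the Kronecker structure helps, under illustrative shapes: the pseudoinverse of a Kronecker product factors as $(A_1 \otimes A_2)^{+} = A_1^{+} \otimes A_2^{+}$, so the least-squares solution never requires materializing the full design matrix. This shows the structural identity only, not the sketching algorithm studied in the snippet's paper.

```python
import numpy as np

# Kronecker regression: min_x ||(A1 kron A2) x - b||_2.
rng = np.random.default_rng(3)
n1, d1, n2, d2 = 30, 5, 20, 4
A1 = rng.standard_normal((n1, d1))
A2 = rng.standard_normal((n2, d2))
b = rng.standard_normal(n1 * n2)

# Naive: materialize the (n1*n2) x (d1*d2) design matrix.
x_naive, *_ = np.linalg.lstsq(np.kron(A1, A2), b, rcond=None)

# Factored: with b = vec(B) (column-major), the solution is
# X = pinv(A2) @ B @ pinv(A1).T, using (A1 kron A2) vec(X) = vec(A2 X A1^T).
B = b.reshape(n2, n1, order="F")
X = np.linalg.pinv(A2) @ B @ np.linalg.pinv(A1).T
x_fast = X.flatten(order="F")

print(np.allclose(x_naive, x_fast))  # True
```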
JK Behne, G Reeves - International Conference on Artificial …, 2022 - proceedings.mlr.press
Low-rank matrix recovery problems involving high-dimensional and heterogeneous data appear in applications throughout statistics and machine learning. The contribution of this …
Q Yan, J Ye, X Shen - The Journal of Machine Learning Research, 2015 - jmlr.org
In multi-response regression, the pursuit of two different types of structure is essential to battle the curse of dimensionality. In this paper, we seek the sparsest decomposition representation …
Kernel methods are powerful tools for modeling nonlinear data. However, the amount of computation and memory required for kernel methods becomes the bottleneck when dealing …
T Lancewicki - arXiv preprint arXiv:1707.06156, 2017 - arxiv.org
The kernel trick, formulated as an inner product in a feature space, facilitates powerful extensions to many well-known algorithms. While the kernel matrix involves inner …
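A minimal illustration of that formulation, using the degree-2 polynomial kernel as an assumed example: the kernel evaluation $(x^\top y)^2$ equals the inner product of an explicit quadratic feature map that never needs to be formed.

```python
import numpy as np

# Kernel trick: for k(x, y) = (x @ y)**2, the Gram matrix equals inner
# products of the explicit feature map phi(x) = vec(outer(x, x)).
rng = np.random.default_rng(4)
X = rng.standard_normal((8, 3))

K_trick = (X @ X.T) ** 2                              # kernel evaluations only
Phi = np.stack([np.outer(x, x).ravel() for x in X])   # explicit 9-dim features
K_explicit = Phi @ Phi.T

print(np.allclose(K_trick, K_explicit))  # True
```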