Kernel methods are a class of learning machines for the fast recognition of nonlinear patterns in any data set. In this paper, the applications of kernel methods for feature …
Y Li, M Yang, Z Zhang - IEEE transactions on knowledge and …, 2018 - ieeexplore.ieee.org
Recently, multi-view representation learning has become a rapidly growing direction in machine learning and data mining areas. This paper introduces two categories for multi …
We consider learning representations (features) in the setting in which we have access to multiple unlabeled views of the data for representation learning while only one view is …
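The single-view-at-test-time setting described in this snippet is classically approached with CCA-style objectives. A minimal linear-CCA sketch in NumPy; the function name, regularization constant, and synthetic two-view data below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def cca(X, Y, k=1, reg=1e-6):
    """Linear CCA: find projections of two views X (n x dx) and Y (n x dy)
    whose images are maximally correlated. Returns (Wx, Wy)."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten each view, then take the top singular directions of the
    # cross-covariance in the whitened coordinates.
    Ex = np.linalg.inv(np.linalg.cholesky(Cxx))
    Ey = np.linalg.inv(np.linalg.cholesky(Cyy))
    U, s, Vt = np.linalg.svd(Ex @ Cxy @ Ey.T)
    return Ex.T @ U[:, :k], Ey.T @ Vt[:k].T

# Two noisy views of the same latent signal (synthetic illustration)
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = z @ rng.normal(size=(1, 5)) + 0.1 * rng.normal(size=(500, 5))
Y = z @ rng.normal(size=(1, 4)) + 0.1 * rng.normal(size=(500, 4))
Wx, Wy = cca(X, Y)
corr = np.corrcoef((X @ Wx).ravel(), (Y @ Wy).ravel())[0, 1]
```

On training data, the correlation of the first projected pair equals the top singular value of the whitened cross-covariance, so it is close to 1 when the views share a strong latent signal.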
F Liu, X Huang, Y Chen… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
The class of random features is one of the most popular techniques to speed up kernel methods in large-scale problems. Related works have been recognized by the NeurIPS Test …
The KeOps library provides a fast and memory-efficient GPU support for tensors whose entries are given by a mathematical formula, such as kernel and distance matrices. KeOps …
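The core KeOps operation is a reduction over a formula-defined matrix that is never stored. A plain-NumPy sketch of that reduction, a chunked Gaussian kernel matrix–vector product (chunk size and data are illustrative; the library itself fuses this computation into a single GPU kernel via its LazyTensor API):

```python
import numpy as np

def gaussian_matvec(x, y, b, chunk=64):
    """Compute a = K b with K_ij = exp(-||x_i - y_j||^2) without ever
    materialising the full n x m kernel matrix -- the kind of reduction
    KeOps evaluates from a symbolic formula."""
    out = np.empty((x.shape[0],) + b.shape[1:])
    for s in range(0, x.shape[0], chunk):
        # Only a (chunk x m) slice of the kernel matrix exists at a time
        d2 = ((x[s:s + chunk, None, :] - y[None, :, :]) ** 2).sum(-1)
        out[s:s + chunk] = np.exp(-d2) @ b
    return out

rng = np.random.default_rng(0)
x, y = rng.normal(size=(300, 3)), rng.normal(size=(200, 3))
b = rng.normal(size=(200, 1))
a = gaussian_matvec(x, y, b)
```

The memory saving is the point: the full 300 × 200 matrix is never allocated, only chunk-sized slices of it.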
Consider the classical supervised learning problem: we are given data (y_i, x_i), i ≤ n, with y_i a response and x_i ∈ X a covariate vector, and try to learn a model f̂: X → R to predict future …
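One standard way to fit such a model f̂ is kernel ridge regression; a minimal sketch, assuming a Gaussian kernel and illustrative hyperparameters:

```python
import numpy as np

def krr_fit(X, y, lam, gamma):
    """Kernel ridge regression: f_hat(x) = sum_i alpha_i k(x_i, x), with
    alpha = (K + n*lam*I)^{-1} y and a Gaussian kernel k."""
    K = np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
    n = len(y)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return alpha, K

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=100)
alpha, K = krr_fit(X, y, lam=1e-4, gamma=10.0)
train_mse = np.mean((K @ alpha - y) ** 2)
```

The linear solve against the full n × n kernel matrix costs O(n³), which is exactly the bottleneck that the random-features and Nyström results below address.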
A Rudi, L Rosasco - Advances in neural information …, 2017 - proceedings.neurips.cc
We study the generalization properties of ridge regression with random features in the statistical learning framework. We show for the first time that O(1/√n) learning …
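Ridge regression with random features, the estimator analyzed in this line of work, can be sketched as: map the data through D random Fourier features, then solve a D-dimensional ridge problem, replacing the O(n³) kernel solve with an O(nD²) one. All hyperparameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 500, 2, 300
X = rng.uniform(-1, 1, size=(n, d))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1]) + 0.05 * rng.normal(size=n)

# Random Fourier features for the Gaussian kernel exp(-||x - y||^2)
W = rng.normal(scale=np.sqrt(2.0), size=(d, D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Ridge regression in the D-dimensional feature space: the solve is
# D x D instead of n x n
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)
train_mse = np.mean((Z @ w - y) ** 2)
```

The theoretical question the snippet refers to is how small D can be while this estimator still attains the O(1/√n) statistical rate of exact kernel ridge regression.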
A Rudi, R Camoriano… - Advances in neural …, 2015 - proceedings.neurips.cc
We study Nyström-type subsampling approaches to large-scale kernel methods, and prove learning bounds in the statistical learning setting, where random sampling and high …
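A minimal sketch of Nyström subsampling, assuming uniform landmark sampling and a Gaussian kernel (the landmark count and jitter constant are illustrative choices):

```python
import numpy as np

def nystrom(X, m, gamma, rng):
    """Rank-m Nystrom approximation K ~ K_nm K_mm^+ K_nm^T built from
    m uniformly subsampled landmark points."""
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]
    K_nm = np.exp(-gamma * ((X[:, None] - L[None]) ** 2).sum(-1))
    K_mm = K_nm[idx]                      # kernel among the landmarks
    # Small jitter keeps the pseudo-inverse numerically stable
    return K_nm, np.linalg.pinv(K_mm + 1e-8 * np.eye(m))

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
gamma = 0.5
K = np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
K_nm, K_mm_inv = nystrom(X, m=100, gamma=gamma, rng=rng)
K_hat = K_nm @ K_mm_inv @ K_nm.T
rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

Downstream solvers then work with the n × m factor K_nm instead of the full n × n matrix; the subsampling level m plays the regularization role that the snippet's learning bounds quantify.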
N Pham, R Pagh - Proceedings of the 19th ACM SIGKDD international …, 2013 - dl.acm.org
Approximation of non-linear kernels using random feature mapping has been successfully employed in large-scale data analysis applications, accelerating the training of kernel …
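The explicit feature maps in question rest on the identity ⟨φ(x), φ(y)⟩ = (x·y)^p for the degree-p polynomial kernel. A deterministic degree-2 sketch of that identity (Pham and Pagh's TensorSketch replaces this exact O(d²) map with a randomized low-dimensional one built from count sketches and FFTs):

```python
import numpy as np

def poly2_features(x):
    """Exact explicit feature map for the degree-2 polynomial kernel:
    phi(x) = vec(x x^T), so that <phi(x), phi(y)> = (x . y)^2."""
    return np.outer(x, x).ravel()

x = np.array([1.0, 2.0, 3.0])
y = np.array([0.5, -1.0, 2.0])
lhs = poly2_features(x) @ poly2_features(y)  # inner product of feature maps
rhs = (x @ y) ** 2                           # kernel evaluated directly
```

For d-dimensional inputs this exact map has d² coordinates (d^p in general), which is why the randomized compression studied in the paper matters at scale.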