Distributed learning with regularized least squares

SB Lin, X Guo, DX Zhou - Journal of Machine Learning Research, 2017 - jmlr.org
We study distributed learning with the least squares regularization scheme in a reproducing
kernel Hilbert space (RKHS). By a divide-and-conquer approach, the algorithm partitions a …
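
A minimal numpy sketch of the divide-and-conquer idea behind this line of work: fit kernel ridge regression on each data partition and average the local predictors. The Gaussian kernel, the regularization parameter lam, and the equal-size split are illustrative assumptions, not the paper's exact setup.

    import numpy as np

    def gaussian_kernel(A, B, width=1.0):
        # Pairwise Gaussian kernel matrix between rows of A and rows of B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    def local_krr(X, y, lam=0.1):
        # Kernel ridge regression on one partition: alpha = (K + n*lam*I)^{-1} y.
        n = X.shape[0]
        K = gaussian_kernel(X, X)
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
        return lambda Xt: gaussian_kernel(Xt, X) @ alpha

    def distributed_krr(X, y, n_machines=4, lam=0.1):
        # Divide-and-conquer: fit KRR on each subset, average the local predictors.
        parts = np.array_split(np.arange(X.shape[0]), n_machines)
        local_fits = [local_krr(X[idx], y[idx], lam) for idx in parts]
        return lambda Xt: np.mean([f(Xt) for f in local_fits], axis=0)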

Learning theory of distributed spectral algorithms

ZC Guo, SB Lin, DX Zhou - Inverse Problems, 2017 - iopscience.iop.org
Spectral algorithms have been widely used and studied in learning theory and inverse
problems. This paper is concerned with distributed spectral algorithms, for handling big data …
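
Spectral algorithms replace the Tikhonov filter of ridge regression by other filter functions applied to the empirical kernel matrix. A rough single-machine sketch under that reading (the two filters, the normalization K/n, and the parameter lam are illustrative choices; the distributed variants studied in the paper would additionally average such estimators over partitions):

    import numpy as np

    def spectral_regression(K, y, lam=0.1, filt="tikhonov"):
        # K: n x n kernel matrix, y: targets. Apply a spectral filter g_lam to K/n.
        n = K.shape[0]
        evals, evecs = np.linalg.eigh(K / n)
        if filt == "tikhonov":          # recovers kernel ridge regression
            g = 1.0 / (evals + lam)
        elif filt == "cutoff":          # spectral cut-off / truncated eigendecomposition
            g = np.where(evals >= lam, 1.0 / np.maximum(evals, 1e-12), 0.0)
        else:
            raise ValueError(filt)
        # Coefficients alpha such that f(x) = sum_i alpha_i K(x, x_i).
        return evecs @ (g * (evecs.T @ (y / n)))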

Distributed semi-supervised learning with kernel ridge regression

X Chang, SB Lin, DX Zhou - Journal of Machine Learning Research, 2017 - jmlr.org
This paper provides error analysis for distributed semi-supervised learning with kernel ridge
regression (DSKRR) based on a divide-and-conquer strategy. DSKRR applies kernel ridge …

Rates of approximation by ReLU shallow neural networks

T Mao, DX Zhou - Journal of Complexity, 2023 - Elsevier
Neural networks activated by the rectified linear unit (ReLU) play a central role in the recent
development of deep learning. The topic of approximating functions from Hölder spaces by …
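
For context, a shallow ReLU network of the kind analyzed here is a one-hidden-layer model f(x) = sum_k c_k ReLU(w_k · x + b_k). The random-parameter example below only illustrates the function class; it is not the constructive approximant of the paper.

    import numpy as np

    def shallow_relu_net(x, W, b, c):
        # f(x) = sum_k c_k * ReLU(w_k . x + b_k), a one-hidden-layer ReLU network.
        return np.maximum(x @ W.T + b, 0.0) @ c

    # Example: a width-64 network on 3-dimensional inputs with random parameters.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(64, 3))   # hidden-layer weights w_k
    b = rng.normal(size=64)        # hidden-layer biases b_k
    c = rng.normal(size=64)        # outer coefficients c_k
    X = rng.normal(size=(10, 3))
    print(shallow_relu_net(X, W, b, c).shape)   # (10,)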

Generalization analysis of deep CNNs under maximum correntropy criterion

Y Zhang, Z Fang, J Fan - Neural Networks, 2024 - Elsevier
Convolutional neural networks (CNNs) have gained immense popularity in recent years,
finding their utility in diverse fields such as image recognition, natural language processing …

Simple stochastic and online gradient descent algorithms for pairwise learning

Z Yang, Y Lei, P Wang, T Yang… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Pairwise learning refers to learning tasks where the loss function depends on a pair of
instances. It instantiates many important machine learning tasks such as bipartite ranking …
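
As a concrete instance of a pairwise objective, the sketch below runs SGD on a least-squares pairwise loss ((y_i - y_j) - w · (x_i - x_j))^2, sampling one pair of instances per step. The specific surrogate, step size, and sampling scheme are illustrative assumptions, not the algorithms analyzed in the paper.

    import numpy as np

    def pairwise_sgd(X, y, n_iters=1000, lr=0.01, seed=0):
        # SGD for the pairwise least-squares loss
        # l(w; (x, y), (x', y')) = ((y - y') - w.(x - x'))^2.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(n_iters):
            i, j = rng.integers(n, size=2)           # sample a pair of instances
            diff_x, diff_y = X[i] - X[j], y[i] - y[j]
            grad = -2.0 * (diff_y - w @ diff_x) * diff_x
            w -= lr * grad
        return w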

Generalization guarantee of SGD for pairwise learning

Y Lei, M Liu, Y Ying - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Recently, there has been growing interest in studying pairwise learning since it includes many
important machine learning tasks as specific examples, e.g., metric learning, AUC …

Learning with the maximum correntropy criterion induced losses for regression

Y Feng, X Huang, L Shi, Y Yang, JAK Suykens - Journal of Machine Learning Research, 2015 - jmlr.org
Within the statistical learning framework, this paper studies the regression model associated
with the correntropy induced losses. The correntropy, as a similarity measure, has been …
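
The correntropy-induced loss studied here (and used as the maximum correntropy criterion in the CNN entry above) is, up to scaling conventions, l_sigma(r) = sigma^2 (1 - exp(-r^2 / sigma^2)) for a residual r. A small sketch, with one common normalization that may differ from the papers':

    import numpy as np

    def correntropy_loss(residual, sigma=1.0):
        # l_sigma(r) = sigma^2 * (1 - exp(-r^2 / sigma^2)): behaves like the squared
        # loss for small residuals and saturates for large ones (robust to outliers).
        r = np.asarray(residual, dtype=float)
        return sigma ** 2 * (1.0 - np.exp(-(r ** 2) / sigma ** 2))

    print(correntropy_loss([0.1, 1.0, 10.0], sigma=1.0))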

Distributed kernel-based gradient descent algorithms

SB Lin, DX Zhou - Constructive Approximation, 2018 - Springer
We study the generalization ability of distributed learning equipped with a divide-and-
conquer approach and gradient descent algorithm in a reproducing kernel Hilbert space …
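
A rough sketch of the combination described here, assuming the plain divide-and-conquer reading: run gradient descent on the empirical least-squares risk in the RKHS on each data partition, then average the local predictors. The step size, iteration count, and kernel choice are illustrative.

    import numpy as np

    def local_kernel_gd(K, y, step=0.5, n_iters=200):
        # Gradient descent on the empirical least-squares risk in the RKHS:
        # alpha_{t+1} = alpha_t - (step / n) * (K alpha_t - y).
        n = K.shape[0]
        alpha = np.zeros(n)
        for _ in range(n_iters):
            alpha -= (step / n) * (K @ alpha - y)
        return alpha

    def distributed_kernel_gd(kernel, X, y, n_machines=4, **gd_kwargs):
        # kernel(A, B) must return the cross-kernel matrix between rows of A and B.
        # Divide-and-conquer: run kernel gradient descent on each subset of the
        # data, then average the local predictors at test time.
        parts = np.array_split(np.arange(X.shape[0]), n_machines)
        fitted = [(X[idx], local_kernel_gd(kernel(X[idx], X[idx]), y[idx], **gd_kwargs))
                  for idx in parts]
        return lambda Xt: np.mean([kernel(Xt, Xj) @ aj for Xj, aj in fitted], axis=0)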

Sharper generalization bounds for pairwise learning

Y Lei, A Ledent, M Kloft - Advances in Neural Information …, 2020 - proceedings.neurips.cc
Pairwise learning refers to learning tasks with loss functions depending on a pair of training
examples, which includes ranking and metric learning as specific examples. Recently, there …