Accelerated attributed network embedding

X Huang, J Li, X Hu - Proceedings of the 2017 SIAM international conference …, 2017 - SIAM
Network embedding learns low-dimensional vector representations for nodes in a
network. It has been shown to be effective in a variety of tasks such as node classification and link …
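
As a rough illustration of the task itself (not this paper's accelerated attributed-embedding algorithm), node representations can be read off a truncated SVD of the adjacency matrix; the sketch below is a generic baseline with illustrative names.

```python
import numpy as np

def spectral_node_embedding(adj, dim=16):
    """Toy network embedding: truncated SVD of the adjacency matrix.

    adj is an (n, n) symmetric adjacency matrix; returns an (n, dim) matrix
    whose rows serve as node representations for downstream tasks such as
    node classification or link prediction.
    """
    u, s, _ = np.linalg.svd(adj, full_matrices=False)
    return u[:, :dim] * np.sqrt(s[:dim])

# Example: a small ring graph embedded in 2-D.
n = 8
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
print(spectral_node_embedding(adj, dim=2).shape)  # (8, 2)
```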

Low-rank tensor networks for dimensionality reduction and large-scale optimization problems: Perspectives and challenges part 1

A Cichocki, N Lee, IV Oseledets, AH Phan… - arXiv preprint arXiv …, 2016 - arxiv.org
Machine learning and data mining algorithms are becoming increasingly important in
analyzing large-volume, multi-relational and multi-modal datasets, which are often …
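
A standard building block in this line of work is the tensor-train (TT) format; below is a plain TT-SVD pass with a hard rank cap, a generic construction rather than any specific algorithm from the survey.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Tensor-train decomposition of a dense tensor via sequential truncated SVDs.

    Returns a list of TT cores; core k has shape (r_{k-1}, n_k, r_k), with r_0 = r_d = 1.
    """
    dims = tensor.shape
    cores, r_prev = [], 1
    unfolding = np.asarray(tensor)
    for n_k in dims[:-1]:
        unfolding = unfolding.reshape(r_prev * n_k, -1)
        u, s, vt = np.linalg.svd(unfolding, full_matrices=False)
        r_k = min(max_rank, len(s))                  # hard rank truncation
        cores.append(u[:, :r_k].reshape(r_prev, n_k, r_k))
        unfolding = np.diag(s[:r_k]) @ vt[:r_k]      # carry the remainder forward
        r_prev = r_k
    cores.append(unfolding.reshape(r_prev, dims[-1], 1))
    return cores

# Example: decompose a random 4 x 5 x 6 tensor with TT-ranks capped at 3.
T = np.random.default_rng(0).normal(size=(4, 5, 6))
cores = tt_svd(T, max_rank=3)
print([c.shape for c in cores])  # [(1, 4, 3), (3, 5, 3), (3, 6, 1)]
```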

Optimal block-wise asymmetric graph construction for graph-based semi-supervised learning

Z Song, Y Zhang, I King - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Graph-based semi-supervised learning (GSSL) serves as a powerful tool to model the
underlying manifold structures of samples in high-dimensional spaces. It involves two …
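
For context, a generic GSSL pipeline has the two stages alluded to here: build an affinity graph, then infer labels over it. The sketch below uses a symmetric kNN graph and Zhou-style label spreading, not the block-wise asymmetric construction proposed in this paper.

```python
import numpy as np

def knn_graph(X, k=5, sigma=1.0):
    """Symmetric kNN affinity graph with Gaussian edge weights (a common GSSL default)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nbrs = np.argsort(d2[i])[1:k + 1]               # nearest neighbors, skipping self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma ** 2))
    return np.maximum(W, W.T)                            # symmetrize

def spread_labels(W, y, labeled_mask, alpha=0.99, iters=200):
    """Iterative label spreading on the graph; y entries outside labeled_mask are ignored."""
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]    # normalized affinity D^-1/2 W D^-1/2
    Y = np.zeros((len(y), int(y[labeled_mask].max()) + 1))
    Y[labeled_mask, y[labeled_mask]] = 1.0
    F = Y.copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F.argmax(axis=1)
```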

Minimum-distortion embedding

A Agrawal, A Ali, S Boyd - Foundations and Trends® in …, 2021 - nowpublishers.com
We consider the vector embedding problem. We are given a finite set of items, with the goal
of assigning a representative vector to each one, possibly under some constraints (such as …
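
A concrete special case of this problem, quadratic penalties on "similar" pairs under a standardization constraint, has a closed-form minimizer given by Laplacian eigenvectors; a minimal sketch of that case (names mine, not the authors' solver):

```python
import numpy as np

def quadratic_mde(n_items, edges, dim=2):
    """Minimum-distortion embedding with quadratic penalties on 'similar' pairs and a
    standardization constraint: the minimizer is given by the bottom nonzero
    eigenvectors of the graph Laplacian (a spectral embedding)."""
    L = np.zeros((n_items, n_items))
    for i, j in edges:
        L[i, i] += 1
        L[j, j] += 1
        L[i, j] -= 1
        L[j, i] -= 1
    vals, vecs = np.linalg.eigh(L)
    # Skip the constant eigenvector (eigenvalue ~0); scale so that (1/n) X^T X = I.
    return vecs[:, 1:dim + 1] * np.sqrt(n_items)

# Example: embed a 6-node path graph in 2-D.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
print(quadratic_mde(6, edges).shape)  # (6, 2)
```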

Poisson learning: Graph based semi-supervised learning at very low label rates

J Calder, B Cook, M Thorpe… - … Conference on Machine …, 2020 - proceedings.mlr.press
We propose a new framework, called Poisson learning, for graph based semi-supervised
learning at very low label rates. Poisson learning is motivated by the need to address the …
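
The central computation is a Poisson equation on the graph: point sources carrying the (centered) one-hot labels sit at the labeled nodes, and the solution is classified by argmax. A minimal direct-solve sketch, omitting the paper's iterative solver and class-balancing step:

```python
import numpy as np

def poisson_learning(W, labels, labeled_idx, n_classes):
    """Sketch of Poisson learning on a weighted graph.

    W: (n, n) symmetric weight matrix; labels: class ids aligned with labeled_idx.
    Solves L U = B, where B places centered one-hot sources at the labeled nodes,
    then predicts argmax over classes for every node.
    """
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W                  # graph Laplacian
    Y = np.zeros((len(labeled_idx), n_classes))
    Y[np.arange(len(labeled_idx)), labels] = 1.0
    B = np.zeros((n, n_classes))
    B[labeled_idx] = Y - Y.mean(axis=0)             # centered point sources
    U, *_ = np.linalg.lstsq(L, B, rcond=None)       # least squares handles the singular L
    return U.argmax(axis=1)
```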

Sobolev GAN

Y Mroueh, CL Li, T Sercu, A Raj, Y Cheng - arXiv preprint arXiv …, 2017 - arxiv.org
We propose a new Integral Probability Metric (IPM) between distributions: the Sobolev IPM.
The Sobolev IPM compares the mean discrepancy of two distributions for functions (critic) …
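
For reference, the Sobolev IPM constrains the critic to a gradient-energy ball taken under a dominant measure μ (e.g. a mixture of the two distributions); the normalization below is stated from memory and should be checked against the paper:

```latex
% Sobolev IPM between P and Q with critic f in a Sobolev ball defined by \mu:
\mathcal{S}_{\mu}(P, Q)
  \;=\; \sup_{\,f \,:\; \mathbb{E}_{x \sim \mu}\!\left[\|\nabla_x f(x)\|_2^{2}\right] \,\le\, 1}
        \;\mathbb{E}_{x \sim P}\!\left[f(x)\right] \;-\; \mathbb{E}_{x \sim Q}\!\left[f(x)\right]
```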

Analysis of p-Laplacian Regularization in Semisupervised Learning

D Slepcev, M Thorpe - SIAM Journal on Mathematical Analysis, 2019 - SIAM
We investigate a family of regression problems in a semisupervised setting. The task is to
assign real-valued labels to a set of n sample points provided a small training subset of N …
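
The underlying variational problem is graph p-Laplacian (p-Dirichlet) regularization with hard constraints on the labeled points; in compact form (notation mine):

```latex
% Given edge weights w_{ij} on n sample points and labels y_k on a training
% subset of size N, the p-Laplacian regularized estimate solves
\min_{u \in \mathbb{R}^{n}} \;\sum_{i,j} w_{ij}\,\lvert u_i - u_j\rvert^{p}
\quad \text{subject to} \quad u(x_k) = y_k \ \text{for each labeled point } x_k .
```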

The game theoretic p-Laplacian and semi-supervised learning with few labels

J Calder - Nonlinearity, 2018 - iopscience.iop.org
We study the game theoretic p-Laplacian for semi-supervised learning on graphs, and show
that it is well-posed in the limit of finite labeled data and infinite unlabeled data. In particular …

Global linear and local superlinear convergence of IRLS for non-smooth robust regression

L Peng, C Kümmerle, R Vidal - Advances in neural …, 2022 - proceedings.neurips.cc
We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in
(0, 1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the …
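
For orientation, the basic IRLS scheme these variants build on solves a reweighted least-squares problem at each step; the smoothing and iteration choices below are generic defaults, not the paper's:

```python
import numpy as np

def irls_lp_regression(A, b, p=1.0, iters=50, eps=1e-6):
    """Basic IRLS for min_x sum_i |a_i^T x - b_i|^p with p in (0, 1].

    Each iteration solves a weighted least-squares problem with weights
    (r_i^2 + eps^2)^((p-2)/2), a smoothed version of |r_i|^(p-2).
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]              # ordinary LS warm start
    for _ in range(iters):
        r = A @ x - b
        w = (r ** 2 + eps ** 2) ** ((p - 2) / 2)          # downweight large residuals
        WA = A * w[:, None]
        x = np.linalg.solve(A.T @ WA, WA.T @ b)           # weighted normal equations
    return x

# Example: robust line fit in the presence of a few gross outliers.
rng = np.random.default_rng(0)
A = np.column_stack([np.ones(50), np.linspace(0, 1, 50)])
b = A @ np.array([1.0, 2.0]) + 0.01 * rng.normal(size=50)
b[:5] += 5.0                                              # corrupt a few responses
print(irls_lp_regression(A, b, p=1.0))                    # roughly [1.0, 2.0]
```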

Consistency of Lipschitz learning with infinite unlabeled data and finite labeled data

J Calder - SIAM Journal on Mathematics of Data Science, 2019 - SIAM
We study the consistency of Lipschitz learning on graphs in the limit of infinite unlabeled
data and finite labeled data. Previous work has conjectured that Lipschitz learning is well …
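
Lipschitz learning seeks a graph infinity-harmonic extension of the labels; on an unweighted graph it can be approximated by the simple fixed-point sweep below (an illustrative view, not the weighted setting analyzed in the paper):

```python
import numpy as np

def lipschitz_learning(adj_lists, labels, iters=2000):
    """Approximate Lipschitz learning (graph infinity-Laplacian) on an unweighted graph.

    adj_lists: list of neighbor-index lists; labels: dict node -> value for labeled nodes.
    Each unlabeled value is repeatedly replaced by the midpoint of its neighbors' max
    and min, the unweighted graph infinity-harmonic condition.
    """
    n = len(adj_lists)
    u = np.zeros(n)
    for i, v in labels.items():
        u[i] = v
    unlabeled = [i for i in range(n) if i not in labels]
    for _ in range(iters):
        for i in unlabeled:
            vals = u[adj_lists[i]]
            u[i] = 0.5 * (vals.max() + vals.min())
    return u

# Example: a path graph labeled 0 and 1 at its endpoints; the solution
# interpolates linearly along the path.
adj = [[1], [0, 2], [1, 3], [2, 4], [3]]
print(lipschitz_learning(adj, {0: 0.0, 4: 1.0}).round(3))  # [0. 0.25 0.5 0.75 1.]
```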