Efficient sampling for Gaussian graphical models via spectral sparsification

D Cheng, Y Cheng, Y Liu, R Peng… - … on Learning Theory, 2015 - proceedings.mlr.press
Motivated by a sampling problem basic to computational statistical inference, we develop a
toolset based on spectral sparsification for a family of fundamental problems involving …
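The sampling problem motivating this line of work is drawing vectors from a multivariate Gaussian specified by a (sparse) precision matrix. As a point of reference only, the sketch below is the standard dense Cholesky-based sampler, which is the baseline that sparsification-based toolsets aim to beat on large sparse precision matrices; it is not the paper's algorithm, and the chain-structured precision used in the demo is an illustrative assumption.

```python
import numpy as np

def sample_ggm(precision, n_samples, rng=None):
    """Draw samples from N(0, precision^{-1}) via a dense Cholesky factorization.

    Baseline sampler only: the cubic-time factorization is the step that
    sparsification-based methods are designed to avoid on large inputs.
    """
    rng = np.random.default_rng(rng)
    L = np.linalg.cholesky(precision)            # precision = L @ L.T
    z = rng.standard_normal((precision.shape[0], n_samples))
    # If L.T @ x = z, then cov(x) = (L @ L.T)^{-1} = precision^{-1}
    return np.linalg.solve(L.T, z).T

# Demo: chain-structured (tridiagonal) precision matrix, a simple GGM
n = 5
Lam = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
X = sample_ggm(Lam, n_samples=10_000, rng=0)
print(np.allclose(np.cov(X, rowvar=False), np.linalg.inv(Lam), atol=0.1))
```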

Sparse Gaussian Graphical Models with Discrete Optimization: Computational and Statistical Perspectives

K Behdin, W Chen, R Mazumder - arXiv preprint arXiv:2307.09366, 2023 - arxiv.org
We consider the problem of learning a sparse graph underlying an undirected Gaussian
graphical model, a key problem in statistical machine learning. Given $n$ samples from a …
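For orientation, the estimation problem here is to recover a sparse precision (inverse covariance) matrix from samples. The paper approaches it through discrete optimization; the sketch below instead uses the classical convex surrogate, the graphical lasso as implemented in scikit-learn, purely to make the problem concrete. It is not the authors' method, and the chain-graph ground truth is an illustrative assumption.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground-truth sparse precision matrix: a chain graph on p variables
p = 8
theta = 2.0 * np.eye(p) - 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(theta), size=500)

# Convex baseline (not the paper's discrete-optimization approach):
# minimize -log det(Theta) + tr(S Theta) + alpha * ||Theta||_1 over Theta > 0
model = GraphicalLasso(alpha=0.05).fit(X)
support = np.abs(model.precision_) > 1e-3   # recovered edge pattern
print(support.astype(int))
```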

Graph sparsification approaches for Laplacian smoothing

V Sadhanala, YX Wang… - Artificial Intelligence and …, 2016 - proceedings.mlr.press
Given a statistical estimation problem where regularization is performed according to the
structure of a large, dense graph G, we consider fitting the statistical estimate using a \emph{…}
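Laplacian smoothing here refers to the estimator $\hat\theta = \arg\min_\theta \|y-\theta\|_2^2 + \lambda\,\theta^\top L\theta = (I+\lambda L)^{-1}y$, and the idea in this line of work is to solve it with the Laplacian of a sparsified graph in place of the dense one. The sketch below uses a crude uniform edge subsampler as a stand-in for a proper spectral sparsifier, just to show the substitution; the graph, signal, and parameters are illustrative assumptions.

```python
import numpy as np

def laplacian(W):
    """Graph Laplacian L = D - W from a symmetric weight matrix W."""
    return np.diag(W.sum(axis=1)) - W

def laplacian_smooth(y, L, lam):
    """Closed-form Laplacian smoothing: argmin_t ||y - t||^2 + lam * t' L t."""
    return np.linalg.solve(np.eye(len(y)) + lam * L, y)

rng = np.random.default_rng(1)
n = 50
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)   # dense graph
y = np.linspace(0, 1, n) + 0.3 * rng.standard_normal(n)             # noisy signal

# Crude stand-in for a spectral sparsifier: keep each edge w.p. 1/2, reweight by 2
keep = np.triu(rng.random((n, n)) < 0.5, 1)
W_sparse = 2.0 * W * (keep + keep.T)

theta_dense  = laplacian_smooth(y, laplacian(W), lam=0.1)
theta_sparse = laplacian_smooth(y, laplacian(W_sparse), lam=0.1)
print(np.linalg.norm(theta_dense - theta_sparse) / np.linalg.norm(theta_dense))
```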

Spectral sparsification of random-walk matrix polynomials

D Cheng, Y Cheng, Y Liu, R Peng, SH Teng - arXiv preprint arXiv …, 2015 - arxiv.org
We consider a fundamental algorithmic question in spectral graph theory: Compute a
spectral sparsifier of the random-walk matrix polynomial $L_\alpha(G) = D - \sum_{r=1}$ …
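The displayed polynomial is cut off; a commonly cited form of it (treat the exact definition as an assumption here) is $L_\alpha(G) = D - \sum_{r=1}^{d} \alpha_r D (D^{-1}A)^r$ with nonnegative coefficients $\alpha_r$ summing to one, where $A$ is the adjacency matrix and $D$ the degree matrix. The sketch below only materializes this matrix densely so the object being sparsified is concrete; the cited result is precisely about producing a sparsifier without forming the matrix powers.

```python
import numpy as np

def random_walk_matrix_polynomial(A, alphas):
    """Densely materialize L_alpha(G) = D - sum_r alphas[r-1] * D @ (D^{-1} A)^r.

    Reference construction only; the cited work sparsifies this matrix in
    nearly linear time without forming the powers of the walk matrix.
    """
    deg = A.sum(axis=1)
    D = np.diag(deg)
    P = A / deg[:, None]                 # random-walk transition matrix D^{-1} A
    acc = np.zeros_like(A, dtype=float)
    P_r = np.eye(len(A))
    for alpha_r in alphas:               # accumulate sum_r alpha_r * P^r
        P_r = P_r @ P
        acc += alpha_r * P_r
    return D - D @ acc

# Demo: 6-cycle; with coefficients summing to 1 the result is a graph Laplacian
A = np.roll(np.eye(6), 1, axis=1) + np.roll(np.eye(6), -1, axis=1)
L2 = random_walk_matrix_polynomial(A, alphas=[0.5, 0.5])
print(np.allclose(L2, L2.T), np.allclose(L2.sum(axis=1), 0))
```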

A unifying framework for spectrum-preserving graph sparsification and coarsening

G Bravo Hermsdorff… - Advances in Neural …, 2019 - proceedings.neurips.cc
How might one "reduce" a graph? That is, generate a smaller graph that preserves
the global structure at the expense of discarding local details? There has been extensive …
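The best-known instance of spectrum-preserving sparsification is effective-resistance (importance) sampling of edges, in the spirit of Spielman and Srivastava; the sketch below implements that generic recipe with a dense pseudoinverse for the resistances, as a small-scale illustration of what preserving the global (spectral) structure means. It is not the unified sparsification/coarsening framework of the cited paper.

```python
import numpy as np

def effective_resistance_sparsify(W, num_samples, rng=None):
    """Sample edges with probability proportional to w_e * R_eff(e) and
    reweight them so the sparsified Laplacian is unbiased for the original."""
    rng = np.random.default_rng(rng)
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W
    Lp = np.linalg.pinv(L)                       # dense pseudoinverse: demo-sized only
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if W[i, j] > 0]
    w = np.array([W[i, j] for i, j in edges])
    reff = np.array([Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for i, j in edges])
    p = w * reff
    p = p / p.sum()
    idx = rng.choice(len(edges), size=num_samples, p=p)
    W_sparse = np.zeros_like(W)
    for t in idx:
        i, j = edges[t]
        W_sparse[i, j] += w[t] / (num_samples * p[t])
    return W_sparse + W_sparse.T

# Demo: sparsify a complete graph and compare Laplacian quadratic forms
n = 30
W = np.ones((n, n)) - np.eye(n)
Ws = effective_resistance_sparsify(W, num_samples=300, rng=5)
L, Ls = np.diag(W.sum(1)) - W, np.diag(Ws.sum(1)) - Ws
x = np.random.default_rng(6).standard_normal(n)
print(x @ Ls @ x / (x @ L @ x))                  # approaches 1 as num_samples grows
```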

[Book] Randomized primitives for linear algebra and applications

A Zouzias - 2013 - library-archives.canada.ca
The present thesis focuses on the design and analysis of randomized algorithms for
accelerating several linear algebraic tasks. In particular, we develop simple, efficient …
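A representative randomized primitive of the kind studied in this thesis is approximate matrix multiplication by sampling column/row outer products with norm-proportional probabilities (the Drineas-Kannan-Mahoney recipe). The sketch below is that generic primitive, offered only as an illustration of the area, not as a summary of the thesis's specific algorithms.

```python
import numpy as np

def approx_matmul(A, B, num_samples, rng=None):
    """Approximate A @ B by sampling outer products A[:, j] B[j, :].

    Index j is drawn with probability proportional to ||A[:, j]|| * ||B[j, :]||
    and each sampled term is rescaled so the estimator is unbiased.
    """
    rng = np.random.default_rng(rng)
    p = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = p / p.sum()
    idx = rng.choice(len(p), size=num_samples, p=p)
    scale = 1.0 / (num_samples * p[idx])
    return (A[:, idx] * scale) @ B[idx, :]

rng = np.random.default_rng(2)
A, B = rng.standard_normal((100, 2000)), rng.standard_normal((2000, 100))
err = np.linalg.norm(approx_matmul(A, B, 500) - A @ B) / np.linalg.norm(A @ B)
print(err)   # relative error shrinks as num_samples grows
```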

Near-optimal entrywise sampling of numerically sparse matrices

V Braverman, R Krauthgamer… - … on Learning Theory, 2021 - proceedings.mlr.press
Many real-world data sets are sparse or almost sparse. One method to measure this for a
matrix $A \in \mathbb{R}^{n\times n}$ is the \emph{numerical sparsity}, denoted $\mathsf …
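For a single vector the usual definition of numerical sparsity is $\|x\|_1^2 / \|x\|_2^2$, which equals $k$ for a vector with $k$ equal-magnitude nonzeros and degrades gracefully when small entries are present; the matrix-level quantity and the optimized sampling distribution of the paper are not reproduced here. The sketch computes the vector quantity and performs plain $\ell_1$-proportional entrywise sampling as a generic illustration of the task.

```python
import numpy as np

def numerical_sparsity(x):
    """||x||_1^2 / ||x||_2^2: equals k for k equal-magnitude nonzeros and
    stays small when most of the mass sits on a few entries."""
    return np.linalg.norm(x, 1) ** 2 / np.linalg.norm(x) ** 2

def entrywise_sample(A, num_samples, rng=None):
    """Keep a few entries of A, chosen proportionally to |A_ij| and rescaled
    so the sparsified matrix is an unbiased estimate of A."""
    rng = np.random.default_rng(rng)
    p = np.abs(A).ravel()
    p = p / p.sum()
    idx = rng.choice(A.size, size=num_samples, p=p)
    S = np.zeros_like(A)
    np.add.at(S.ravel(), idx, A.ravel()[idx] / (num_samples * p[idx]))
    return S

rng = np.random.default_rng(3)
A = rng.standard_normal((200, 200)) * (rng.random((200, 200)) < 0.05)  # almost sparse
print("numerical sparsity of first row:", numerical_sparsity(A[0]))
S = entrywise_sample(A, num_samples=2000)
print("relative spectral error:", np.linalg.norm(S - A, 2) / np.linalg.norm(A, 2))
```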

Improved large-scale graph learning through ridge spectral sparsification

D Calandriello, A Lazaric, I Koutis… - … on Machine Learning, 2018 - proceedings.mlr.press
The representation and learning benefits of methods based on graph Laplacians, such as
Laplacian smoothing or the harmonic function solution for semi-supervised learning (SSL), are …
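The harmonic function solution mentioned here assigns to each unlabeled node the weighted average of its neighbors' values, which amounts to solving the Laplacian system $L_{UU} f_U = -L_{UL} f_L$ for the unlabeled block (Zhu et al.'s label propagation). The sketch below is that baseline on a toy path graph; the cited paper's contribution, running such solvers on a ridge spectral sparsifier of a large graph, is not implemented.

```python
import numpy as np

def harmonic_ssl(W, labeled_idx, labels):
    """Harmonic-function label propagation: solve L_UU f_U = -L_UL f_L, so
    each unlabeled node takes the weighted average of its neighbors."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    f = np.zeros(n)
    f[labeled_idx] = labels
    f[unlabeled] = np.linalg.solve(
        L[np.ix_(unlabeled, unlabeled)],
        -L[np.ix_(unlabeled, labeled_idx)] @ labels,
    )
    return f

# Demo: path graph with the two endpoints labeled 0 and 1
n = 6
W = np.eye(n, k=1) + np.eye(n, k=-1)
print(harmonic_ssl(W, labeled_idx=np.array([0, n - 1]), labels=np.array([0.0, 1.0])))
# -> values interpolate linearly along the path: 0.0, 0.2, ..., 1.0
```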

Nearly linear row sampling algorithm for quantile regression

Y Li, R Wang, L Yang, H Zhang - … Conference on Machine …, 2020 - proceedings.mlr.press
We give a row sampling algorithm for the quantile loss function with sample complexity
nearly linear in the dimensionality of the data, improving upon the previous best algorithm …
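The quantile loss at level $\tau$ is $\rho_\tau(r) = \tau\,\max(r,0) + (1-\tau)\,\max(-r,0)$, applied to the residuals $r_i = y_i - x_i^\top\beta$. The sketch below fits quantile regression on a uniformly subsampled set of rows, simply to make the row-sampling setting concrete; the paper's nearly linear sample complexity comes from a carefully designed importance-sampling distribution, not the uniform choice or the crude solver used here.

```python
import numpy as np
from scipy.optimize import minimize

def quantile_loss(beta, X, y, tau):
    """Sum of rho_tau(r) = tau * max(r, 0) + (1 - tau) * max(-r, 0) over rows."""
    r = y - X @ beta
    return np.sum(np.maximum(tau * r, (tau - 1.0) * r))

rng = np.random.default_rng(4)
n, d, tau = 5_000, 5, 0.9
X = rng.standard_normal((n, d))
y = X @ np.arange(1.0, d + 1.0) + rng.standard_normal(n)

# Uniform row sampling as a stand-in for the paper's importance sampling
idx = rng.choice(n, size=500, replace=False)

# Crude derivative-free solver, good enough for this low-dimensional demo
fit = lambda Xs, ys: minimize(quantile_loss, np.zeros(d), args=(Xs, ys, tau),
                              method="Nelder-Mead", options={"maxiter": 5000}).x
beta_full, beta_samp = fit(X, y), fit(X[idx], y[idx])
print(np.linalg.norm(beta_full - beta_samp))
```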

Learning non-Gaussian graphical models via Hessian scores and triangular transport

R Baptista, R Morrison, O Zahm, Y Marzouk - Journal of Machine Learning …, 2024 - jmlr.org
Undirected probabilistic graphical models represent the conditional dependencies, or
Markov properties, of a collection of random variables. Knowing the sparsity of such a …
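The Hessian scores in this line of work are built from mixed second derivatives of the log-density, roughly $\Omega_{ij}^2 = \mathbb{E}_\pi[(\partial_i\partial_j \log\pi(x))^2]$ (treat the exact normalization as an assumption). The sketch below evaluates this in the one case where it is trivial, a Gaussian: there $\partial_i\partial_j\log\pi = -\Lambda_{ij}$ is constant, so the score pattern coincides with the precision matrix's sparsity and hence with the Markov graph; for non-Gaussian densities the expectation would be estimated from samples, e.g. with automatic differentiation.

```python
import numpy as np

def hessian_score_gaussian(precision):
    """Hessian-based conditional-independence scores for a Gaussian density.

    With log pi(x) = -0.5 * x' Lam x + const, the mixed second derivative
    d^2 log pi / dx_i dx_j equals -Lam_ij everywhere, so the score
    E[(d_i d_j log pi)^2]^{1/2} reduces to |Lam_ij|: it is zero exactly when
    x_i and x_j are conditionally independent given the remaining variables.
    """
    return np.abs(precision)

# Chain-structured Gaussian: only adjacent variables are conditionally dependent
Lam = 2.0 * np.eye(5) - 0.8 * (np.eye(5, k=1) + np.eye(5, k=-1))
scores = hessian_score_gaussian(Lam)
print((scores > 1e-12).astype(int))   # recovers the chain's adjacency (plus diagonal)
```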