CCMI: Classifier based conditional mutual information estimation

S Mukherjee, H Asnani, et al. - Uncertainty in Artificial Intelligence, 2020 - proceedings.mlr.press
Conditional Mutual Information (CMI) is a measure of conditional dependence
between random variables X and Y, given another random variable Z. It can be used to …
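
As a point of reference for the quantity being estimated, the sketch below computes the plug-in CMI, I(X;Y|Z) = E[log p(X,Y|Z) / (p(X|Z) p(Y|Z))], exactly from empirical counts for discrete samples. It only illustrates the definition; it is not the classifier-based estimator the paper proposes, and the toy data-generating process is invented for the example.

    # Plug-in CMI for discrete 1-D samples (illustration only, not the CCMI method).
    import numpy as np
    from collections import Counter

    def plug_in_cmi(x, y, z):
        """Plug-in estimate of I(X;Y|Z) in nats from discrete samples."""
        n = len(x)
        pxyz, pxz = Counter(zip(x, y, z)), Counter(zip(x, z))
        pyz, pz = Counter(zip(y, z)), Counter(z)
        cmi = 0.0
        for (xi, yi, zi), c in pxyz.items():
            cmi += (c / n) * np.log((c / n) * (pz[zi] / n)
                                    / ((pxz[(xi, zi)] / n) * (pyz[(yi, zi)] / n)))
        return cmi

    rng = np.random.default_rng(0)
    z = rng.integers(0, 2, size=5000)
    x = (z + rng.integers(0, 2, size=5000)) % 2      # X depends on Z
    y = (x + (rng.random(5000) < 0.3)) % 2           # Y is a noisy copy of X
    print(plug_in_cmi(x, y, z))                      # > 0: dependence remains given Z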

Estimating conditional mutual information for discrete-continuous mixtures using multi-dimensional adaptive histograms

A Marx, L Yang, M van Leeuwen - Proceedings of the 2021 SIAM International Conference on Data Mining, 2021 - SIAM
Estimating conditional mutual information (CMI) is an essential yet challenging step in many
machine learning and data mining tasks. Estimating CMI from data that contains both …
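
A hedged sketch of the histogram route to CMI: discretize with equal-width bins and apply the plug-in formula. The paper's contribution is an adaptive, multi-dimensional partitioning that also handles discrete-continuous mixtures; fixed bins are used here only to keep the example short, and all names below are illustrative.

    # Fixed-width binning + plug-in CMI (the paper uses adaptive histograms).
    import numpy as np

    def binned_cmi(x, y, z, bins=8):
        """Discretize 1-D continuous x, y, z and return plug-in I(X;Y|Z) in nats."""
        def disc(v):
            edges = np.linspace(v.min(), v.max(), bins + 1)
            return np.digitize(v, edges[1:-1])            # values in 0..bins-1
        data = np.column_stack([disc(x), disc(y), disc(z)])
        joint, _ = np.histogramdd(data, bins=(bins,) * 3, range=[(0, bins)] * 3)
        p = joint / joint.sum()                           # axes: (x, y, z)
        pxz, pyz, pz = p.sum(axis=1), p.sum(axis=0), p.sum(axis=(0, 1))
        num = p * pz[np.newaxis, np.newaxis, :]
        den = pxz[:, np.newaxis, :] * pyz[np.newaxis, :, :]
        m = p > 0
        return float(np.sum(p[m] * np.log(num[m] / den[m])))

    rng = np.random.default_rng(1)
    z = rng.normal(size=10000)
    x = z + 0.5 * rng.normal(size=10000)
    y = z + 0.5 * rng.normal(size=10000)                  # X ⫫ Y | Z
    print(binned_cmi(x, y, z))                            # ≈ 0 up to discretization bias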

Beyond normal: On the evaluation of mutual information estimators

P Czyż, F Grabowski, J Vogt, et al. - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
Mutual information is a general statistical dependency measure which has found
applications in representation learning, causality, domain generalization and computational …
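
In the spirit of benchmarking against a known ground truth, the snippet below compares a kNN-based MI estimate (scikit-learn's mutual_info_regression, standing in for any estimator) with the closed-form value for a bivariate Gaussian, I(X;Y) = -0.5 ln(1 - rho^2). The paper's benchmark covers far richer distributions; this is only the simplest instance of the idea.

    # Toy benchmark: known Gaussian ground truth vs. a kNN-based MI estimate.
    import numpy as np
    from sklearn.feature_selection import mutual_info_regression

    rho = 0.8
    true_mi = -0.5 * np.log(1 - rho ** 2)                 # ≈ 0.511 nats

    rng = np.random.default_rng(2)
    cov = [[1.0, rho], [rho, 1.0]]
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=20000)

    est_mi = mutual_info_regression(xy[:, [0]], xy[:, 1],
                                    n_neighbors=3, random_state=0)[0]
    print(f"true {true_mi:.3f} nats, estimated {est_mi:.3f} nats")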

Model-powered conditional independence test

R Sen, AT Suresh, K Shanmugam, et al. - Advances in Neural Information Processing Systems, 2017 - proceedings.neurips.cc
We consider the problem of non-parametric Conditional Independence testing (CI testing)
for continuous random variables. Given i.i.d. samples from the joint distribution $f(x, y, z)$ of …
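
A hedged sketch of the classification reduction behind such model-powered tests: manufacture a second sample in which each Y is swapped with the Y of a nearest neighbor in Z (approximating f(x|z) f(y|z) f(z)), then check whether a classifier can distinguish it from the original sample; near-chance held-out accuracy is consistent with X ⫫ Y | Z. The bootstrap details, bias correction, and test threshold in the paper are omitted, and the models below are arbitrary stand-ins.

    # Skeleton of a classifier-based CI check via a nearest-neighbor Y-swap in Z.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(3)
    n = 4000
    z = rng.normal(size=(n, 1))
    x = z + 0.5 * rng.normal(size=(n, 1))
    y = z + 0.5 * rng.normal(size=(n, 1))                 # X ⫫ Y | Z in this toy data

    # Swap each Y with the Y of its nearest neighbor in Z.
    _, idx = NearestNeighbors(n_neighbors=2).fit(z).kneighbors(z)
    y_tilde = y[idx[:, 1]]

    pos = np.hstack([x, y, z])                            # samples from f(x, y, z)
    neg = np.hstack([x, y_tilde, z])                      # approx. f(x|z) f(y|z) f(z)
    data = np.vstack([pos, neg])
    labels = np.r_[np.ones(n), np.zeros(n)]

    tr_X, te_X, tr_y, te_y = train_test_split(data, labels, test_size=0.3,
                                              random_state=0)
    clf = GradientBoostingClassifier().fit(tr_X, tr_y)
    acc = accuracy_score(te_y, clf.predict(te_X))
    print(f"held-out accuracy {acc:.3f} (≈ 0.5 suggests conditional independence)")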

Estimating mutual information by local Gaussian approximation

S Gao, GV Steeg, A Galstyan - arXiv preprint arXiv:1508.00536, 2015 - arxiv.org
Estimating mutual information (MI) from samples is a fundamental problem in statistics,
machine learning, and data analysis. Recently it was shown that a popular class of non …

Neural methods for point-wise dependency estimation

YHH Tsai, H Zhao, M Yamada, et al. - Advances in Neural Information Processing Systems, 2020 - proceedings.neurips.cc
Since its inception, the neural estimation of mutual information (MI) has demonstrated the
empirical success of modeling expected dependency between high-dimensional random …
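
A hedged sketch of point-wise dependency via the density-ratio trick: a probabilistic classifier trained to separate true pairs (x, y) from shuffled pairs recovers log p(x,y)/(p(x)p(y)) point-wise through its logit. The paper studies neural objectives for this; quadratic-feature logistic regression is used below only because it suffices for the Gaussian toy data.

    # Point-wise dependency (PMI) from a classifier's log-odds.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(4)
    n = 20000
    x = rng.normal(size=(n, 1))
    y = x + 0.7 * rng.normal(size=(n, 1))                 # dependent pairs

    pos = np.hstack([x, y])                               # from p(x, y)
    neg = np.hstack([x, y[rng.permutation(n)]])           # from p(x) p(y)
    feats = PolynomialFeatures(degree=2, include_bias=False)
    data = feats.fit_transform(np.vstack([pos, neg]))
    labels = np.r_[np.ones(n), np.zeros(n)]

    clf = LogisticRegression(max_iter=1000).fit(data, labels)

    # Log-odds = estimated log density ratio, i.e. point-wise dependency.
    query = feats.transform(np.array([[0.0, 0.0], [2.0, 2.0], [2.0, -2.0]]))
    print(clf.decision_function(query))   # positive where pairs co-occur more than under independence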

Testing conditional independence in supervised learning algorithms

DS Watson, MN Wright - Machine Learning, 2021 - Springer
We propose the conditional predictive impact (CPI), a consistent and unbiased estimator of
the association between one or several features and a given outcome, conditional on a …
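
A hedged sketch of the general recipe behind such tests: score a fitted model on held-out data twice, once as-is and once with the feature of interest replaced by a copy resampled conditionally on the remaining features, then run a paired test on the per-sample losses. The CPI itself relies on knockoff samplers and carries guarantees that this toy nearest-neighbor swap does not; names below are illustrative.

    # Conditional-resampling feature test via paired per-sample losses (toy version).
    import numpy as np
    from scipy.stats import ttest_rel
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neighbors import NearestNeighbors
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n = 4000
    X = rng.normal(size=(n, 3))
    y = X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n)   # feature 2 is irrelevant

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    def cpi_like_pvalue(j):
        # Replace column j with the value held by a nearest neighbor in the other columns.
        rest = np.delete(X_te, j, axis=1)
        _, idx = NearestNeighbors(n_neighbors=2).fit(rest).kneighbors(rest)
        X_swap = X_te.copy()
        X_swap[:, j] = X_te[idx[:, 1], j]
        loss_orig = (y_te - model.predict(X_te)) ** 2
        loss_swap = (y_te - model.predict(X_swap)) ** 2
        # One-sided paired t-test: does removing the feature's information hurt?
        return ttest_rel(loss_swap, loss_orig, alternative='greater').pvalue

    print([round(cpi_like_pvalue(j), 4) for j in range(3)])  # small p for features 0 and 1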

Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information

J Runge - International Conference on Artificial Intelligence and Statistics, 2018 - proceedings.mlr.press
Conditional independence testing is a fundamental problem underlying causal discovery
and a particularly challenging task in the presence of nonlinear dependencies. Here a fully …
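
For orientation, here is a hedged sketch of the Frenzel-Pompe/KSG-style nearest-neighbor CMI estimate that such tests build on, I(X;Y|Z) ≈ ψ(k) + <ψ(n_z+1) - ψ(n_xz+1) - ψ(n_yz+1)>, with neighbor counts taken within each point's distance to its k-th joint-space neighbor under the max-norm. The local permutation scheme the paper uses to calibrate the test is omitted.

    # Nearest-neighbor (KSG-style) CMI estimate; calibration/permutation test omitted.
    import numpy as np
    from scipy.special import digamma
    from scipy.spatial import cKDTree

    def cmi_knn(x, y, z, k=5):
        x, y, z = (np.asarray(v).reshape(len(v), -1) for v in (x, y, z))
        xyz, xz, yz = np.hstack([x, y, z]), np.hstack([x, z]), np.hstack([y, z])
        # Distance to the k-th neighbor in the joint space (max-norm, self excluded).
        d, _ = cKDTree(xyz).query(xyz, k=k + 1, p=np.inf)
        eps = d[:, -1] - 1e-12                            # strict inequality, KSG convention
        count = lambda pts: cKDTree(pts).query_ball_point(
            pts, r=eps, p=np.inf, return_length=True) - 1
        n_xz, n_yz, n_z = count(xz), count(yz), count(z)
        return digamma(k) + np.mean(digamma(n_z + 1) - digamma(n_xz + 1) - digamma(n_yz + 1))

    rng = np.random.default_rng(6)
    n = 3000
    z = rng.normal(size=n)
    x = z + 0.5 * rng.normal(size=n)
    y = z + 0.5 * rng.normal(size=n)                      # X ⫫ Y | Z
    print(cmi_knn(x, y, z))                               # ≈ 0
    print(cmi_knn(x, x + 0.5 * rng.normal(size=n), z))    # clearly > 0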

Sliced mutual information: A scalable measure of statistical dependence

Z Goldfeld, K Greenewald - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Mutual information (MI) is a fundamental measure of statistical dependence, with a myriad of
applications to information theory, statistics, and machine learning. While it possesses many …
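
A hedged sketch of the sliced-MI idea: project X and Y onto random one-dimensional directions and average the scalar MI of the projections. Scikit-learn's kNN scalar-MI estimator stands in for whatever 1-D estimator one prefers; the paper concerns the population quantity, its properties, and estimation rates rather than this particular plug-in.

    # Sliced MI: average 1-D MI over random projections of X and Y.
    import numpy as np
    from sklearn.feature_selection import mutual_info_regression

    def sliced_mi(x, y, n_projections=25, seed=0):
        rng = np.random.default_rng(seed)
        vals = []
        for _ in range(n_projections):
            theta = rng.normal(size=x.shape[1]); theta /= np.linalg.norm(theta)
            phi = rng.normal(size=y.shape[1]);   phi /= np.linalg.norm(phi)
            px = (x @ theta).reshape(-1, 1)                # 1-D projection of X
            py = y @ phi                                   # 1-D projection of Y
            vals.append(mutual_info_regression(px, py, n_neighbors=3,
                                               random_state=0)[0])
        return float(np.mean(vals))

    rng = np.random.default_rng(7)
    n, d = 3000, 10
    x = rng.normal(size=(n, d))
    y = x + rng.normal(size=(n, d))                        # dependent high-dimensional pair
    print(sliced_mi(x, y))                                 # > 0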

Conditional independence testing under misspecified inductive biases

F Maia Polo, Y Sun, M Banerjee - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
Conditional independence (CI) testing is a fundamental and challenging task in modern
statistics and machine learning. Many modern methods for CI testing rely on powerful …