Estimating conditional mutual information (CMI) is an essential yet challenging step in many machine learning and data mining tasks. Estimating CMI from data that contains both …
Mutual information is a general statistical dependency measure that has found applications in representation learning, causality, domain generalization, and computational …
We consider the problem of non-parametric conditional independence (CI) testing for continuous random variables. Given iid samples from the joint distribution $f(x, y, z)$ of …
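For intuition about what such a test computes, here is a minimal sketch of a much weaker baseline: a linear-Gaussian CI test based on partial correlation with a Fisher z-transform. All names and defaults here are illustrative assumptions, and this is not the nonparametric test the snippet describes.

```python
import numpy as np
from math import erfc, log, sqrt

def partial_corr_ci_test(x, y, z):
    """Test x ⟂ y | z under a linear-Gaussian model (illustrative sketch).

    Regress z out of x and y by least squares, take the Pearson
    correlation of the residuals, and convert it to a two-sided
    p-value via the Fisher z-transform.
    """
    n = len(x)
    # Design matrix: intercept plus conditioning variables.
    Z = np.column_stack([np.ones(n), np.atleast_2d(z).reshape(n, -1)])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    r = float(np.corrcoef(rx, ry)[0, 1])
    dz = Z.shape[1] - 1  # number of conditioning variables
    stat = sqrt(n - dz - 3) * 0.5 * log((1 + r) / (1 - r))
    p_value = erfc(abs(stat) / sqrt(2))  # two-sided normal p-value
    return r, p_value

rng = np.random.default_rng(1)
n = 2000
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = z + rng.normal(size=n)      # x and y are independent given z
r_ci, p_ci = partial_corr_ci_test(x, y, z)

w = rng.normal(size=n)          # irrelevant conditioning variable
r_dep, p_dep = partial_corr_ci_test(x, x + rng.normal(size=n), w)
print(r_ci, p_ci, r_dep, p_dep)
```

Because it only removes linear effects of $z$, this baseline misses nonlinear dependencies entirely, which is precisely the gap the nonparametric tests in these snippets aim to close.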
Estimating mutual information (MI) from samples is a fundamental problem in statistics, machine learning, and data analysis. Recently, it was shown that a popular class of non …
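As a point of reference for what "estimating MI from samples" means concretely, here is a minimal plug-in (histogram-binned) estimator. It is a simple baseline with known bias, not one of the estimators discussed in these snippets, and the bin count is an arbitrary illustrative choice.

```python
import numpy as np

def mutual_information_binned(x, y, bins=16):
    """Plug-in MI estimate (in nats) from a 2-D histogram.

    Discretize (x, y) into a bins x bins grid, then apply the
    discrete MI formula to the empirical cell probabilities.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)
y_indep = rng.normal(size=n)            # independent of x: MI ~ 0
y_dep = x + 0.1 * rng.normal(size=n)    # strongly dependent on x
print(mutual_information_binned(x, y_indep))
print(mutual_information_binned(x, y_dep))
```

The estimate for the independent pair is slightly above zero because the plug-in estimator is positively biased at finite sample sizes; estimators of the kind surveyed above are designed to behave better, especially in high dimensions.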
Since its inception, neural estimation of mutual information (MI) has shown empirical success in modeling dependence between high-dimensional random …
We propose the conditional predictive impact (CPI), a consistent and unbiased estimator of the association between one or several features and a given outcome, conditional on a …
J Runge - … Conference on Artificial Intelligence and Statistics, 2018 - proceedings.mlr.press
Conditional independence testing is a fundamental problem underlying causal discovery and a particularly challenging task in the presence of nonlinear dependencies. Here a fully …
Z Goldfeld, K Greenewald - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Mutual information (MI) is a fundamental measure of statistical dependence, with a myriad of applications to information theory, statistics, and machine learning. While it possesses many …
Conditional independence (CI) testing is a fundamental and challenging task in modern statistics and machine learning. Many modern methods for CI testing rely on powerful …