We propose novel statistics which maximise the power of a two-sample test based on the Maximum Mean Discrepancy (MMD), by adapting over the set of kernels used in defining it …
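For reference, the underlying MMD statistic admits a simple unbiased estimator. The sketch below is a minimal numpy illustration with a single fixed Gaussian bandwidth (the adaptive kernel selection described above is not shown), and the function names are illustrative rather than taken from the paper.

import numpy as np

def gaussian_kernel(A, B, bandwidth):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2_unbiased(X, Y, bandwidth=1.0):
    # Unbiased estimate of the squared MMD between samples X (m x d) and Y (n x d).
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    # Remove diagonal terms so the within-sample averages are unbiased.
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

In a permutation test, this statistic would be recomputed on shuffled pooled samples to calibrate the rejection threshold.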
C Xu, X Cheng, Y Xie - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Normalizing flows are a class of deep generative models for efficient sampling and likelihood estimation, achieving attractive performance, particularly in high dimensions. The flow …
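The tractable likelihood referred to here comes from the change-of-variables identity; as a generic sketch (assuming an invertible map $f$ with a computable Jacobian determinant, not the specific architecture of this paper):

$$\log p_X(x) = \log p_Z\bigl(f(x)\bigr) + \log\bigl|\det J_f(x)\bigr|, \qquad x = f^{-1}(z), \; z \sim p_Z,$$

where $p_Z$ is a simple base density such as a standard Gaussian, so both evaluating the likelihood of $x$ and drawing samples reduce to computations with $f$ and its inverse.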
Two-sample tests are important in statistics and machine learning, both as tools for scientific discovery and as a means of detecting distribution shifts. This has led to the development of many …
A Ozier-Lafontaine, C Fourneaux, G Durif, P Arsenteva… - Genome Biology, 2024 - Springer
Single-cell technologies offer insights into molecular feature distributions, but comparing them poses challenges. We propose a kernel-testing framework for non-linear cell-wise …
We investigate properties of goodness-of-fit tests based on the Kernel Stein Discrepancy (KSD). We introduce a strategy to construct a test, called KSDAgg, which aggregates …
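As background, the KSD that such aggregated tests reuse across a collection of kernels is commonly written in terms of the score $s_p(x) = \nabla_x \log p(x)$ of the target density $p$ and a base kernel $k$; the following is the standard formulation rather than the paper's exact notation:

$$\mathrm{KSD}_k^2(q, p) = \mathbb{E}_{x, x' \sim q}\bigl[u_p(x, x')\bigr], \qquad u_p(x, x') = s_p(x)^\top s_p(x')\, k(x, x') + s_p(x)^\top \nabla_{x'} k(x, x') + s_p(x')^\top \nabla_{x} k(x, x') + \nabla_{x} \cdot \nabla_{x'} k(x, x').$$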
In nonparametric independence testing, we observe i.i.d. data $\{(X_i, Y_i)\}_{i=1}^{n}$, where $X \in \mathcal{X}$ and $Y \in \mathcal{Y}$ lie in any general spaces, and we wish to test the null that $X$ is independent of $Y$ …
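A widely used kernel statistic for this problem is the Hilbert-Schmidt Independence Criterion (HSIC); the sketch below is a minimal biased estimator with fixed Gaussian bandwidths and illustrative function names, offered as generic background rather than the specific test constructed in this work.

import numpy as np

def gaussian_gram(Z, bandwidth):
    # Gaussian kernel Gram matrix of the rows of Z.
    sq = np.sum(Z**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * Z @ Z.T
    return np.exp(-sq / (2 * bandwidth**2))

def hsic_biased(X, Y, bw_x=1.0, bw_y=1.0):
    # Empirical HSIC (biased estimator) between paired samples X and Y (n rows each),
    # using the common (n - 1)^{-2} normalisation.
    n = len(X)
    K = gaussian_gram(X, bw_x)
    L = gaussian_gram(Y, bw_y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1)**2

# Under the null, permuting the rows of Y breaks any dependence, so repeatedly
# recomputing hsic_biased(X, Y[perm]) gives a reference distribution for calibration.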
X Liu, AB Duncan, A Gandy - International Conference on …, 2023 - proceedings.mlr.press
Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests. It can be applied even when the target distribution has an unknown normalising …
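The reason the unknown normalising constant is not a problem is that the KSD depends on the target only through its score function; writing the target as $p(x) = \tilde p(x)/Z$ with unnormalised density $\tilde p$ and unknown constant $Z$:

$$\nabla_x \log p(x) = \nabla_x \bigl(\log \tilde p(x) - \log Z\bigr) = \nabla_x \log \tilde p(x),$$

so every quantity entering the test statistic can be computed from $\tilde p$ alone.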