Finite sample based mutual information

K Rajab, F Kamalov - IEEE Access, 2021 - ieeexplore.ieee.org
Mutual information is a popular metric in machine learning. In the case of a discrete target
variable and a continuous feature variable, the mutual information can be calculated as a …
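For context, the textbook definition of mutual information for this mixed setting (a discrete target Y and a continuous feature X) sums over the target values and integrates over the feature; this is the general form, not the specific finite-sample formula the truncated abstract refers to:

```latex
I(X;Y) = \sum_{y} \int p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} \, dx
```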

Approximating mutual information by maximum likelihood density ratio estimation

T Suzuki, M Sugiyama, J Sese… - New challenges for …, 2008 - proceedings.mlr.press
Mutual information is useful in various data processing tasks such as feature selection or
independent component analysis. In this paper, we propose a new method of approximating …
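The density-ratio view alluded to here writes mutual information as an expectation of the log of the ratio r(x, y) = p(x, y) / (p(x) p(y)); estimating r directly (e.g., by maximum likelihood, as in this line of work) sidesteps estimating the three densities separately:

```latex
I(X;Y) = \mathbb{E}_{p(x,y)}\!\left[\log r(X, Y)\right],
\qquad r(x, y) = \frac{p(x, y)}{p(x)\, p(y)}
```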

A comparison of multivariate mutual information estimators for feature selection

G Doquire, M Verleysen - International Conference on Pattern …, 2012 - scitepress.org
Mutual Information estimation is an important task for many data mining and machine
learning applications. In particular, many feature selection algorithms make use of the …

A constructive density-ratio approach to mutual information estimation: experiments in feature selection

I Braga - Journal of Information and Data Management, 2014 - periodicos.ufmg.br
Mutual Information (MI) estimation is an important component of several data mining tasks
(e.g., feature selection). In classification settings, MI estimation essentially depends on the …

On estimating mutual information for feature selection

E Schaffernicht, R Kaltenhaeuser, SS Verma… - … Conference on Artificial …, 2010 - Springer
Mutual Information (MI) is a powerful concept from information theory used in many
application fields. For practical tasks it is often necessary to estimate the Mutual Information …

Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models

T Sakai, M Sugiyama - IEICE TRANSACTIONS on Information and …, 2014 - search.ieice.org
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence
between random variables. The sample-based SMI approximator called least-squares …
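Squared-loss mutual information, as used in this line of work, replaces the log in ordinary MI with a squared deviation: it is the Pearson (chi-squared) divergence between the joint density and the product of the marginals,

```latex
\mathrm{SMI}(X;Y) = \frac{1}{2} \iint p(x)\, p(y)
\left( \frac{p(x, y)}{p(x)\, p(y)} - 1 \right)^{2} dx\, dy
```

which is zero exactly when X and Y are independent, like ordinary MI, but is more robust to outliers and admits an analytic least-squares approximator.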

Critical values of a kernel density-based mutual information estimator

RJ May, GC Dandy, HR Maier… - The 2006 IEEE …, 2006 - ieeexplore.ieee.org
Recently, mutual information (MI) has become widely recognized as a statistical measure of
dependence that is suitable for applications where data are non-Gaussian, or where the …

Mutual information for feature selection: estimation or counting?

HB Nguyen, B Xue, P Andreae - Evolutionary Intelligence, 2016 - Springer
In classification, feature selection is an important pre-processing step to simplify the dataset
and improve the quality of the data representation, which makes classifiers better, easier …
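As a point of reference for the "counting" side of the question posed above, the plug-in (maximum-likelihood frequency) estimator of MI for paired discrete samples can be sketched as follows. This is the textbook empirical estimator, not the specific method of any paper listed here, and `plugin_mi` is an illustrative name:

```python
# Minimal plug-in ("counting") estimator of mutual information for a
# finite sample of paired discrete observations: all probabilities are
# replaced by empirical frequencies.
from collections import Counter
from math import log

def plugin_mi(xs, ys):
    """Estimate I(X;Y) in nats from paired samples of discrete variables."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # empirical joint counts
    px = Counter(xs)            # empirical marginal counts for X
    py = Counter(ys)            # empirical marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), all computed from counts
        mi += (c / n) * log(c * n / (px[x] * py[y]))
    return mi

# Example: X fully determines Y with a fair 50/50 split, so I(X;Y) = log 2.
xs = [0, 0, 1, 1]
ys = ["a", "a", "b", "b"]
print(plugin_mi(xs, ys))  # → log(2) ≈ 0.693
```

The plug-in estimator is known to be biased upward on small samples, which is precisely the finite-sample issue several of the entries above address.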

A mutual information estimator for continuous and discrete variables applied to feature selection and classification problems

F Coelho, AP Braga, M Verleysen - International Journal of …, 2016 - Taylor & Francis
Mutual Information is currently widely used in pattern recognition and
feature selection problems. It may be used as a measure of redundancy between features as …

Quadratic mutual information feature selection

D Sluga, U Lotrič - Entropy, 2017 - mdpi.com
We propose a novel feature selection method based on quadratic mutual information, which
has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method uses the direct …