Mutual information is useful in various data processing tasks such as feature selection or independent component analysis. In this paper, we propose a new method of approximating …
G Doquire, M Verleysen - International Conference on Pattern …, 2012 - scitepress.org
Mutual Information estimation is an important task for many data mining and machine learning applications. In particular, many feature selection algorithms make use of the …
I Braga - Journal of Information and Data Management, 2014 - periodicos.ufmg.br
Mutual Information (MI) estimation is an important component of several data mining tasks (e.g., feature selection). In classification settings, MI estimation essentially depends on the …
E Schaffernicht, R Kaltenhaeuser, SS Verma… - … Conference on Artificial …, 2010 - Springer
Mutual Information (MI) is a powerful concept from information theory used in many application fields. For practical tasks it is often necessary to estimate the Mutual Information …
T Sakai, M Sugiyama - IEICE TRANSACTIONS on Information and …, 2014 - search.ieice.org
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence between random variables. The sample-based SMI approximator called least-squares …
RJ May, GC Dandy, HR Maier… - The 2006 IEEE …, 2006 - ieeexplore.ieee.org
Recently, mutual information (MI) has become widely recognized as a statistical measure of dependence that is suitable for applications where data are non-Gaussian, or where the …
In classification, feature selection is an important pre-processing step to simplify the dataset and improve the quality of the data representation, which makes classifiers better and easier …
Abstract Mutual Information is currently widely used in pattern recognition and feature selection problems. It may be used as a measure of redundancy between features as …
We propose a novel feature selection method based on quadratic mutual information which has its roots in Cauchy–Schwarz divergence and Rényi entropy. The method uses the direct …
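The common thread in the abstracts above is ranking features by their estimated mutual information with the class labels. As a minimal sketch of that idea (not the method of any particular paper listed here), the following uses a simple plug-in estimator over empirical counts for discrete features; the function names and the top-k selection helper are illustrative assumptions:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in MI estimate (in bits) between two discrete sequences.

    Uses empirical joint and marginal frequencies; real MI-based
    feature selectors typically use bias-corrected or kNN estimators.
    """
    n = len(xs)
    px = Counter(xs)            # marginal counts of the feature values
    py = Counter(ys)            # marginal counts of the labels
    pxy = Counter(zip(xs, ys))  # joint counts
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * log2( p_joint / (p_x * p_y) ), with counts folded in
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

def select_top_k(feature_columns, labels, k):
    """Rank feature columns by MI with the labels; return top-k indices."""
    scores = [(mutual_information(col, labels), i)
              for i, col in enumerate(feature_columns)]
    scores.sort(reverse=True)
    return [i for _, i in scores[:k]]

# A feature identical to balanced binary labels carries exactly 1 bit,
# while an independent feature carries (in expectation) none:
labels = [0, 0, 1, 1]
informative = [0, 0, 1, 1]
irrelevant = [0, 1, 0, 1]
print(mutual_information(informative, labels))          # → 1.0
print(select_top_k([informative, irrelevant], labels, 1))  # → [0]
```

This univariate ranking ignores redundancy between features, which is precisely the gap that the quadratic-MI and conditional-MI methods in the papers above aim to address.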