A Measure of Dependence Between Discrete and Continuous Variables

28 Aug 2017  ·  Miguel A. Ré, Guillermo G. Aguirre Varela

Mutual Information (MI) is a useful tool for the recognition of mutual dependence between data sets. Different methods for the estimation of MI have been developed for the cases where both data sets are discrete or both are continuous. The estimation of MI between a discrete data set and a continuous data set has received less attention. We present here a method for the estimation of MI in this latter case, based on kernel density estimation. The calculation may be of interest in diverse contexts. Since MI is closely related to the Jensen-Shannon divergence, the method developed here is of particular interest for the problem of sequence segmentation.
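The abstract does not include an implementation. As a rough illustration of the general idea only, the sketch below estimates MI between a discrete variable X and a continuous variable Y via the decomposition I(X;Y) = H(Y) - Σ_x P(x) H(Y|X=x), using Gaussian kernel density estimates (scipy's gaussian_kde) and a resubstitution entropy estimate. This is an assumed, generic construction, not the authors' specific estimator.

```python
# Hedged sketch: MI between a discrete X and a continuous Y via kernel
# density estimation. Not the paper's reference implementation.
import numpy as np
from scipy.stats import gaussian_kde

def mi_discrete_continuous(x, y):
    """Estimate I(X;Y) in nats; x: discrete labels, y: continuous samples."""
    x = np.asarray(x)
    y = np.asarray(y, dtype=float)

    def entropy_kde(samples):
        # Resubstitution estimate: H(Y) ~ -mean(log p_hat(y_i)),
        # with p_hat a Gaussian KDE fitted on the same samples.
        kde = gaussian_kde(samples)
        return -np.mean(np.log(kde(samples)))

    h_y = entropy_kde(y)                       # marginal entropy H(Y)
    h_y_given_x = 0.0
    for value in np.unique(x):
        mask = (x == value)
        p_x = mask.mean()                      # empirical P(X = value)
        h_y_given_x += p_x * entropy_kde(y[mask])  # P(x) * H(Y | X = x)
    return h_y - h_y_given_x

# Toy example (hypothetical data): Y's mean depends on the class X,
# so the estimated MI should be clearly positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=2000)
y = rng.normal(loc=2.0 * x, scale=1.0)
print(mi_discrete_continuous(x, y))
```

In this decomposition the discrete variable needs no density estimate at all: its distribution is handled by simple frequency counts, and kernel density estimation is only applied to the continuous variable, marginally and within each discrete class.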
