

Mutual Information Definitions:

“The mutual information (MI) of 2 random variables is a measure of the mutual dependence between 2 variables.” - Wikipedia

“The mutual information between 2 random variables is the amount of information one gains about one variable by observing the value of the other.” - In layman’s terms

For discrete variables, MI is defined from the joint and marginal distributions:

I(X; Y) = Σₓ Σᵧ p(x, y) log [ p(x, y) / (p(x) p(y)) ]

Equivalently, in terms of entropy:

I(X; Y) = H(X) − H(X | Y) = H(Y) − H(Y | X)
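The definition above can be sketched in pure Python for paired discrete samples (the helper name `mutual_information` is my own, not from any library):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate I(X; Y) in nats from paired discrete samples."""
    n = len(xs)
    p_x = Counter(xs)          # marginal counts of X
    p_y = Counter(ys)          # marginal counts of Y
    p_xy = Counter(zip(xs, ys))  # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in p_xy.items():
        p_joint = c / n
        # p_joint / (p_x * p_y) with the 1/n factors folded together
        mi += p_joint * math.log(p_joint * n * n / (p_x[x] * p_y[y]))
    return mi

xs = [0, 0, 1, 1]
print(mutual_information(xs, xs))            # ≈ 0.693 (= ln 2): knowing X fully determines Y
print(mutual_information(xs, [0, 1, 0, 1]))  # 0.0: the variables are independent
```

When the two arguments are identical, the MI equals the entropy of the variable itself, which is the “maximum knowledge gain” case.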

Correlation Coefficient Definitions:

“Pearson’s correlation coefficient is a measure of linear correlation between two sets of data.” - Wikipedia

For paired samples (xᵢ, yᵢ) with means x̄ and ȳ:

r = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / √[ Σᵢ (xᵢ − x̄)² · Σᵢ (yᵢ − ȳ)² ]
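Pearson’s r is straightforward to compute directly from its definition (a minimal sketch; the helper name `pearson_r` is my own):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

xs = [1, 2, 3, 4]
print(pearson_r(xs, [2, 4, 6, 8]))  # ≈ 1.0: perfect positive linear relationship
print(pearson_r(xs, [8, 6, 4, 2]))  # ≈ -1.0: perfect negative linear relationship
```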

  • As shown by Anscombe’s quartet, a graphical check is required when using the correlation coefficient
  • Common correlation ranges:
    • Small: 0.10 to 0.29
    • Medium: 0.30 to 0.49
    • Large: 0.50 to 1.00

Comparison with Correlation Coefficient

| Metric | MI | Correlation |
| --- | --- | --- |
| Intuition | Measure of the “knowledge” or “information” gained about one variable from knowing the other | Quantitative measure of the linear relationship between 2 variables |
| Value range | 0 to +∞ (0 iff independent) | -1 to 1 |
| Value interpretation | Higher MI, stronger dependence | Closer to either extreme, stronger linear relationship |
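The key practical difference is that MI detects any dependence while correlation only detects linear dependence. A small sketch (pure Python, variable names my own) using y = x² on a symmetric range, where the relationship is perfectly deterministic but not linear:

```python
import math
from collections import Counter

xs = [-2, -1, 0, 1, 2]
ys = [x * x for x in xs]  # perfectly dependent, but not linearly

# Pearson r is zero: the relationship is symmetric around x = 0
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
sy = math.sqrt(sum((y - my) ** 2 for y in ys))
r = cov / (sx * sy)
print(r)  # 0.0

# MI is clearly positive: knowing x pins down y exactly
p_x, p_y, p_xy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
mi = sum(c / n * math.log(c * n / (p_x[x] * p_y[y]))
         for (x, y), c in p_xy.items())
print(mi)  # > 0
```

This is the same failure mode Anscombe’s quartet illustrates: a near-zero (or identical) correlation can hide a strong, structured dependence that MI still picks up.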

References

  1. Mutual Information
  2. Correlation and Mutual Information
  3. Correlation vs Mutual Information