Mutual Information Definitions:

“The mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.” - Wikipedia

“The mutual information between two random variables is the amount of information one gains about one variable by observing the value of the other.” - In layman’s terms

For discrete variables, MI is defined over the joint distribution:

I(X; Y) = Σₓ Σᵧ p(x, y) log( p(x, y) / (p(x) p(y)) )

Equivalently, in terms of entropy:

I(X; Y) = H(X) − H(X | Y) = H(Y) − H(Y | X)
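The discrete MI definition can be computed directly from a joint probability table. A minimal NumPy sketch (the `mutual_information` helper is my own, not from any library):

```python
import numpy as np

def mutual_information(joint):
    """MI in bits from a 2-D joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = joint > 0                          # skip zero cells to avoid log(0)
    return float((joint[nz] * np.log2(joint[nz] / (px * py)[nz])).sum())

# Perfectly dependent binary variables: observing X pins down Y exactly,
# so MI equals H(X) = 1 bit.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # → 1.0
```

For independent variables the joint factorizes as p(x)p(y), every log term is log(1) = 0, and the MI is 0.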

Correlation Coefficient Definitions:

“Pearson’s correlation coefficient is a measure of linear correlation between two sets of data.” - Wikipedia

ρ(X, Y) = cov(X, Y) / (σ_X σ_Y)
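Pearson’s r is available directly in NumPy. A small sketch on synthetic data with a strong positive linear relationship:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2.0 * x + rng.normal(scale=0.1, size=1000)  # nearly linear relation

r = np.corrcoef(x, y)[0, 1]   # Pearson's correlation coefficient
print(round(r, 3))            # close to +1 for a strong positive linear fit
```

Flipping the sign of the slope would drive r toward −1; pure noise would leave it near 0.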


Comparison with Correlation Coefficient

| Metric | MI | Correlation |
| --- | --- | --- |
| Intuition | Measure of “knowledge” or “information” gained about one variable from knowing the other | Quantitative measure of the linear relationship between two variables |
| Value range | 0 to +∞ (normalized variants range 0 to 1) | −1 to +1 |
| Interpretation | Higher MI, stronger dependence (e.g., higher feature importance) | Closer to either extreme, stronger linear relationship |
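The practical difference shows up with nonlinear dependence. In the classic y = x² example below, Pearson’s r is near 0 by symmetry, while a simple histogram-based MI estimate (binning is my own choice here, not a canonical method) clearly detects the dependence:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=100_000)
y = x ** 2                      # deterministic but nonlinear dependence

# Pearson's r misses it: symmetry around 0 makes the linear correlation ~0.
r = np.corrcoef(x, y)[0, 1]

# A binned MI estimate sees the dependence clearly.
counts, _, _ = np.histogram2d(x, y, bins=20)
p = counts / counts.sum()
px = p.sum(axis=1, keepdims=True)
py = p.sum(axis=0, keepdims=True)
nz = p > 0
mi = float((p[nz] * np.log2(p[nz] / (px * py)[nz])).sum())

print(round(r, 3), round(mi, 2))  # r near 0, MI well above 0 bits
```

This is why MI is often preferred over correlation for feature selection when relationships may be nonlinear.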
