“The mutual information (MI) of 2 random variables is a measure of the mutual dependence between 2 variables.” - Wikipedia
In layman's terms: the mutual information between two random variables is the amount of information one gains about one variable by observing the value of the other.
“Pearson’s correlation coefficient is a measure of linear correlation between two sets of data.” - Wikipedia
| Metric | MI | Correlation |
|---|---|---|
| Intuition | Measure of the "knowledge" or "information" gained about one variable from knowing the other | Quantitative measure of the linear relationship between two variables |
| Value Range | 0 to ∞ (unbounded; normalized variants lie in [0, 1]) | -1 to 1 |
| Value Interpretation | Higher MI means stronger dependence (and, in feature selection, higher importance) | The closer to -1 or 1, the stronger the linear relationship |
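The contrast in the table can be made concrete with a small sketch. Below, `y` depends strongly on `x` but not linearly, so Pearson correlation is near zero while MI is clearly positive. The `mutual_info_binned` helper is a hypothetical name for a simple histogram (plug-in) MI estimator written here for illustration; libraries such as scikit-learn provide more robust estimators.

```python
import numpy as np

def mutual_info_binned(x, y, bins=20):
    """Rough plug-in MI estimate (in nats) from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)        # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)        # marginal p(y)
    mask = pxy > 0                             # avoid log(0)
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 10_000)
y = x ** 2 + rng.normal(0, 0.1, 10_000)  # strong but non-linear dependence

r = np.corrcoef(x, y)[0, 1]   # near 0: no *linear* relationship
mi = mutual_info_binned(x, y) # clearly positive: x is informative about y
print(f"Pearson r ~ {r:.3f}, MI estimate ~ {mi:.3f} nats")
```

Because `y = x**2` is symmetric around zero, positive and negative deviations cancel in the covariance, driving Pearson's r toward 0 even though knowing `x` almost fully determines `y`. MI has no such blind spot.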