
Symmetric Relative Mutual Information

Definition

Symmetric Relative Mutual Information measures, as a percentage, the information gained on X and Y by observing each other. It normalizes the Mutual Information by the geometric mean of the entropies of X and Y:

\displaystyle I_{SR}(X,Y) = \frac{I(X,Y)}{\sqrt{H(X)\,H(Y)}}

This normalization has the same structure as Pearson’s Correlation Coefficient ρ:

\displaystyle \rho_{X,Y} = \frac{\operatorname{cov}(X,Y)}{\sqrt{\sigma_X^2\,\sigma_Y^2}}

where σ² denotes the variance.

In this analogy, Mutual Information plays the role of covariance, and Entropy plays the role of variance.
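To make the definition concrete, here is a minimal Python sketch (not part of BayesiaLab) that computes I_SR from a discrete joint probability table; the function names and the example table are illustrative assumptions only. Because the same logarithm base appears in both the numerator and the denominator, the result does not depend on whether entropies are measured in bits or nats.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def symmetric_relative_mutual_information(joint):
    """I_SR(X, Y) = I(X, Y) / sqrt(H(X) * H(Y)) for a joint probability
    table (rows = states of X, columns = states of Y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)   # marginal distribution of X
    py = joint.sum(axis=0)   # marginal distribution of Y

    # Mutual Information: sum over cells with non-zero probability
    outer = np.outer(px, py)
    mask = joint > 0
    mi = np.sum(joint[mask] * np.log2(joint[mask] / outer[mask]))

    # Normalize by the geometric mean of the two entropies
    return mi / np.sqrt(entropy(px) * entropy(py))

# Hypothetical example: two binary variables with a moderate dependency
joint = np.array([[0.40, 0.10],
                  [0.15, 0.35]])
print(f"I_SR = {symmetric_relative_mutual_information(joint):.3f}")
```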

Usage

For a given network, BayesiaLab can report the Symmetric Relative Mutual Information in several contexts:

  • Select Menu > Analysis > Report > Relationship Analysis.
  • The Symmetric Relative Mutual Information can also be shown by selecting Menu > Analysis > Visual > Overall > Arc > Mutual Information and then clicking the Show Arc Comments icon or selecting Menu > View > Show Arc Comments.
  • Note that the corresponding options under Preferences > Analysis > Visual Analysis > Arc's Mutual Information Analysis have to be selected first.
  • In Preferences, “Child” refers to the Relative Mutual Information from the Parent onto the Child node, i.e., in the direction of the arc, always shown in blue.
  • Conversely, “Parent” refers to the Relative Mutual Information from the Child onto the Parent node, i.e., in the opposite direction of the arc, always shown in red.
  • Symmetric metrics are shown in black.