Symmetric Relative Mutual Information


Symmetric Relative Mutual Information expresses the percentage of information gained by observing X and Y, normalizing the Mutual Information by the geometric mean of the two entropies:

$$I_{SR}(X,Y) = \frac{I(X,Y)}{\sqrt{H(X) \times H(Y)}}$$

This normalization is calculated similarly to Pearson's Correlation Coefficient $\rho$:

ฯX,Y=cov(X,Y)ฯƒX2ฯƒY2{\rho _{X,Y}} = \frac{{{\mathop{\rm cov}} (X,Y)}}{{\sqrt {\sigma _X^2\sigma _Y^2} }}

where $\sigma^2$ denotes variance.

In this analogy, Mutual Information plays the role of covariance, and Entropy is analogous to variance.
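The formula above can be computed directly from a joint probability table. The following is a minimal sketch (not BayesiaLab's implementation; the function names are illustrative), using the identity $I(X,Y) = H(X) + H(Y) - H(X,Y)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector; zero entries are ignored."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def symmetric_relative_mi(joint):
    """Symmetric Relative Mutual Information I(X,Y) / sqrt(H(X) * H(Y)).

    `joint` is a 2-D joint probability table: rows index the states of X,
    columns index the states of Y.
    """
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)          # marginal distribution of X
    py = joint.sum(axis=0)          # marginal distribution of Y
    hx, hy = entropy(px), entropy(py)
    mi = hx + hy - entropy(joint)   # I(X,Y) = H(X) + H(Y) - H(X,Y)
    return mi / np.sqrt(hx * hy)

# Two perfectly dependent binary variables: the score reaches its maximum of 1.
perfect = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
print(symmetric_relative_mi(perfect))  # 1.0

# Independent variables: the joint factorizes, so the score is 0.
independent = np.outer([0.5, 0.5], [0.3, 0.7])
print(symmetric_relative_mi(independent))  # 0.0 (up to floating-point error)
```

Like the correlation coefficient it mirrors, the score lies between 0 (independence) and 1 (one variable fully determines the other).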


For a given network, BayesiaLab can report the Symmetric Relative Mutual Information in several contexts:

  • Main Menu > Analysis > Report > Relationship Analysis:

  • The Symmetric Relative Mutual Information can also be shown by selecting Main Menu > Analysis > Visual > Overall > Arc > Mutual Information and then clicking the Show Arc Comments icon or selecting Main Menu > View > Show Arc Comments.

  • Note that the corresponding options under Preferences > Analysis > Visual Analysis > Arc's Mutual Information Analysis have to be selected first:

  • In Preferences, Child refers to the Relative Mutual Information from the Parent onto the Child node, i.e., in the direction of the arc.

  • Conversely, Parent refers to the Relative Mutual Information from the Child onto the Parent node, i.e., in the opposite direction of the arc.
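In formula terms, these direction-specific scores presumably normalize the Mutual Information by the entropy of the receiving node (this is the standard definition of relative mutual information; the notation below is illustrative, not taken from the BayesiaLab documentation):

```latex
% Relative Mutual Information onto the Child (arc direction):
I_R(X \to Y) = \frac{I(X,Y)}{H(Y)}
% Relative Mutual Information onto the Parent (opposite direction):
I_R(Y \to X) = \frac{I(X,Y)}{H(X)}
```

Because $I(X,Y)$ is symmetric, the two scores differ only when the Parent and Child have different entropies.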


Copyright © 2024 Bayesia S.A.S., Bayesia USA, LLC, and Bayesia Singapore Pte. Ltd. All Rights Reserved.