# Symmetric Relative Mutual Information

### Definition

Symmetric Relative Mutual Information expresses the Mutual Information between $X$ and $Y$ as a fraction of the geometric mean of their entropies, yielding a value between 0 and 1:

$I_{SR}(X,Y) = \frac{I(X,Y)}{\sqrt{H(X) \times H(Y)}}$

This normalization is calculated similarly to Pearson's Correlation Coefficient $\rho$:

$\rho_{X,Y} = \frac{\operatorname{cov}(X,Y)}{\sqrt{\sigma_X^2 \, \sigma_Y^2}}$

where $\sigma ^2$ denotes variance.

In this analogy, Mutual Information plays the role of covariance, and Entropy plays the role of variance.
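The definition above can be sketched directly from a joint probability table, using the identity $I(X,Y) = H(X) + H(Y) - H(X,Y)$. This is a minimal illustration, not BayesiaLab's implementation; the function names are hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector, ignoring zeros."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def symmetric_relative_mi(joint):
    """I_SR(X,Y) = I(X,Y) / sqrt(H(X) * H(Y)) from a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)            # marginal of X (rows)
    py = joint.sum(axis=0)            # marginal of Y (columns)
    hx, hy = entropy(px), entropy(py)
    hxy = entropy(joint.ravel())      # joint entropy H(X,Y)
    mi = hx + hy - hxy                # I(X,Y) = H(X) + H(Y) - H(X,Y)
    return mi / np.sqrt(hx * hy)

# Perfectly dependent binary variables: all information is shared.
joint_dep = [[0.5, 0.0], [0.0, 0.5]]
# Independent binary variables: no information is shared.
joint_ind = [[0.25, 0.25], [0.25, 0.25]]

print(symmetric_relative_mi(joint_dep))  # 1.0
print(symmetric_relative_mi(joint_ind))  # 0.0
```

As with Pearson's $\rho$, the normalized value reaches 1 for fully dependent variables and 0 for independent ones.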

### Usage

For a given network, BayesiaLab can report the Symmetric Relative Mutual Information in several contexts:

• Main Menu > Analysis > Report > Relationship Analysis.

• The Symmetric Relative Mutual Information can also be shown by selecting Main Menu > Analysis > Visual > Overall > Arc > Mutual Information and then clicking the Show Arc Comments icon or selecting Main Menu > View > Show Arc Comments.

• Note that the corresponding options under Preferences > Analysis > Visual Analysis > Arc's Mutual Information Analysis have to be selected first.

• In Preferences, Child refers to the Relative Mutual Information from the Parent onto the Child node, i.e., in the direction of the arc.

• Conversely, Parent refers to the Relative Mutual Information from the Child onto the Parent node, i.e., in the opposite direction of the arc.
