Symmetric Normalized Mutual Information


The following Venn Diagram illustrates that the Mutual Information is symmetrical for the two variables X and Y, i.e., I(X,Y) = I(Y,X).

However, the variables X and Y can each have a different number of states. Therefore, their respective entropies can be very different.

This means that the absolute value of the Mutual Information cannot be interpreted without context. In the Venn Diagram, for instance, I(X,Y) reduces H(Y) by a bigger percentage than it reduces H(X). As such, I(X,Y) would be more "important" with regard to Y than it would be with regard to X.
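The asymmetry can be made concrete with a small numerical sketch. The joint distribution below is hypothetical (a 4-state X and a 2-state Y, chosen only for illustration); it shows the same Mutual Information covering a much larger share of H(Y) than of H(X):

```python
import math

# Hypothetical joint distribution p(x, y): X has 4 states, Y has 2 states.
p_xy = [[0.20, 0.05],
        [0.20, 0.05],
        [0.05, 0.20],
        [0.05, 0.20]]

p_x = [sum(row) for row in p_xy]        # marginal distribution of X
p_y = [sum(col) for col in zip(*p_xy)]  # marginal distribution of Y

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

h_x, h_y = entropy(p_x), entropy(p_y)
h_xy = entropy([q for row in p_xy for q in row])
i_xy = h_x + h_y - h_xy  # Mutual Information I(X,Y)

# I(X,Y) is a single number, but it covers ~13.9% of H(X)
# versus ~27.8% of H(Y) in this example.
print(f"I/H(X) = {i_xy / h_x:.3f}, I/H(Y) = {i_xy / h_y:.3f}")
```

Because H(X) is larger here (X has more states), the same amount of shared information matters relatively less for X than for Y.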


The Symmetric Normalized Mutual Information measure takes the potentially different entropies of X and Y into account:

$$I_{SN}(X,Y) = 2 \times \frac{I(X,Y)}{\log_2(S_X) + \log_2(S_Y)}$$

where $S_X$ and $S_Y$ denote the number of states of X and Y, respectively.

As a result, we have an easy-to-interpret measure that relates to both X and Y together.
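The formula can be sketched directly from a joint distribution. The function name and the example numbers below are hypothetical, not part of BayesiaLab's API; the denominator uses the log of the state counts, i.e., the maximum possible entropies of X and Y:

```python
import math

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def symmetric_normalized_mi(p_xy):
    """I_SN(X,Y) = 2 * I(X,Y) / (log2(S_X) + log2(S_Y))."""
    s_x, s_y = len(p_xy), len(p_xy[0])  # state counts S_X and S_Y
    p_x = [sum(row) for row in p_xy]
    p_y = [sum(col) for col in zip(*p_xy)]
    i_xy = (entropy(p_x) + entropy(p_y)
            - entropy([q for row in p_xy for q in row]))
    return 2 * i_xy / (math.log2(s_x) + math.log2(s_y))

# Hypothetical example: a 4-state X and a 2-state Y.
p_xy = [[0.20, 0.05],
        [0.20, 0.05],
        [0.05, 0.20],
        [0.05, 0.20]]
print(round(symmetric_normalized_mi(p_xy), 4))  # prints 0.1854
```

The result is bounded by 1 (reached only when both variables are uniformly distributed and fully determine each other), which is what makes the measure easy to interpret.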


For a given network, BayesiaLab can report the Symmetric Normalized Mutual Information in several contexts:

  • Main Menu > Analysis > Report > Relationship Analysis:

  • The Symmetric Normalized Mutual Information can also be shown by selecting Main Menu > Analysis > Visual > Overall > Arc > Mutual Information and then clicking the Show Arc Comments icon or selecting Main Menu > View > Show Arc Comments.

  • Note that the corresponding options under Preferences > Analysis > Visual Analysis > Arc's Mutual Information Analysis have to be selected first:

  • In Preferences, Child refers to the Relative Mutual Information from the Parent onto the Child node, i.e., in the direction of the arc.

  • Conversely, Parent refers to the Relative Mutual Information from the Child onto the Parent node, i.e., in the opposite direction of the arc.




Copyright © 2024 Bayesia S.A.S., Bayesia USA, LLC, and Bayesia Singapore Pte. Ltd. All Rights Reserved.