Normalized Entropy
  • Normalized Entropy is a metric that takes into account the maximum possible value of Entropy and returns a normalized measure of the uncertainty associated with the variable, where $S_X$ denotes the number of states of $X$:
$$H_N(X) = \frac{H(X)}{\log_2(S_X)}$$
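The formula is straightforward to reproduce outside BayesiaLab. The following Python sketch (not part of BayesiaLab; shown purely to illustrate the definition) computes Entropy and Normalized Entropy for a discrete distribution given as a list of state probabilities:

```python
import math

def entropy(probs):
    """Shannon Entropy H(X) in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def normalized_entropy(probs):
    """Normalized Entropy H_N(X) = H(X) / log2(S_X), where S_X is the number of states."""
    return entropy(probs) / math.log2(len(probs))
```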

Example

In this example, we compare two variables, X1 and X2, each of which represents a ball's color:

  • X1 ∈ {blue, red}
  • X2 ∈ {blue, red, green, yellow, purple, orange, brown, black}

Normalized Entropy allows us to compare the degree of uncertainty of these two variables even though they have different numbers of states, i.e., two versus eight, as the sketch below illustrates.
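For instance, assuming uniform distributions over the two and eight colors (a hypothetical choice for illustration), the helper functions sketched above return the same Normalized Entropy of 1 for both variables, even though their raw Entropies differ:

```python
# Reusing entropy() and normalized_entropy() from the sketch above.
x1 = [0.5, 0.5]        # X1: uniform over {blue, red}
x2 = [0.125] * 8       # X2: uniform over the eight colors

print(entropy(x1), normalized_entropy(x1))   # 1.0 bit,  H_N = 1.0
print(entropy(x2), normalized_entropy(x2))   # 3.0 bits, H_N = 1.0

# A skewed version of X2 falls below 1 on the same [0, 1] scale:
print(normalized_entropy([0.65] + [0.05] * 7))   # ≈ 0.64
```

Both uniform variables score the maximum value of 1, while the skewed distribution scores lower, which makes uncertainty comparable across variables with different numbers of states.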

Usage

In BayesiaLab, the values of Entropy and Normalized Entropy can be accessed in a number of ways:

  • In Validation Mode, with the Information Mode activated, hovering over a Monitor with your cursor will bring up a Tooltip that includes Entropy and Normalized Entropy.
  • You can also sort the Monitors in the Monitor Panel according to their Normalized Entropy via Monitor Context Menu > Sort > Normalized Entropy.

  • The Normalized Entropy is also available as a Node Analysis metric for Size and Color in the 2D and 3D Mapping Tools.
  • In Function Nodes, Entropy and Normalized Entropy are available as Inference Functions in the Equation tab.
    • Entropy: Entropy(?X1?, False)
    • Normalized Entropy: Entropy(?X1?, True)

Demo Network

NormalizedEntropy.xbl

