
Kullback-Leibler Divergence (Arc Force)

Definition

  • In BayesiaLab, the Kullback-Leibler Divergence (or KL Divergence) is used to measure the strength of the relationship between two nodes that are directly connected by an arc.
  • We also commonly refer to the KL Divergence as the Arc Force.
  • Formally, the Kullback-Leibler Divergence $D_{KL}$ measures the difference between two distributions $P$ and $Q$:

$$D_{KL}\big(P(\mathcal{X})\,\|\,Q(\mathcal{X})\big)=\sum_{\mathcal{X}} P(\mathcal{X})\log_2\frac{P(\mathcal{X})}{Q(\mathcal{X})}$$
  • For our purposes, we consider $P$ the Bayesian network that includes the arc for which we wish to compute the Arc Force, and $Q$ the Bayesian network that does not contain that arc but is otherwise identical (see the sketch below this list).
  • We interpret this difference $D_{KL}$ as the "force of the arc" or Arc Force.
  • Note that Filtered Values are taken into account when computing the Arc Force.
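
To make the definition concrete, here is a minimal Python sketch of the computation for a two-node network X → Y. The probability tables are hypothetical, chosen only for illustration; in $P$ the arc is present, and in $Q$ the arc is removed, leaving X and Y independent but with the same marginals.

```python
import numpy as np

# A minimal sketch of the Arc Force computation for a two-node network
# X -> Y. All probability tables below are hypothetical, for illustration.

# P: joint distribution of the network WITH the arc X -> Y,
#    i.e., P(X, Y) = P(X) * P(Y | X).
p_x = np.array([0.6, 0.4])              # P(X)
p_y_given_x = np.array([[0.9, 0.1],     # P(Y | X=0)
                        [0.3, 0.7]])    # P(Y | X=1)
P = p_x[:, None] * p_y_given_x          # joint P(X, Y)

# Q: joint distribution of the otherwise-identical network WITHOUT
#    the arc, i.e., X and Y independent: Q(X, Y) = P(X) * P(Y).
p_y = P.sum(axis=0)                     # marginal P(Y)
Q = p_x[:, None] * p_y[None, :]         # joint Q(X, Y)

# D_KL(P || Q) = sum over (x, y) of P(x, y) * log2(P(x, y) / Q(x, y))
arc_force = np.sum(P * np.log2(P / Q))
print(f"Arc Force (KL Divergence, in bits): {arc_force:.4f}")
```

Because $Q$ differs from $P$ only by dropping the arc, the Arc Force in this two-node sketch coincides with the mutual information between X and Y, measured in bits thanks to the base-2 logarithm.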

Throughout this website, we use Kullback-Leibler Divergence, KL Divergence, and Arc Force interchangeably.

