Kullback-Leibler Divergence (Arc Force)
Definition
- In BayesiaLab, the Kullback-Leibler Divergence (or KL Divergence) is used to measure the strength of the relationship between two nodes that are directly connected by an arc.
- We also commonly refer to the KL Divergence as Arc Force.
- Formally, the Kullback-Leibler Divergence measures the difference between two probability distributions P and Q: D<sub>KL</sub>(P‖Q) = Σ<sub>x</sub> P(x) log( P(x) / Q(x) ).
- For our purposes, P is the joint probability distribution defined by the Bayesian network that does include the arc for which we wish to compute the Arc Force, and Q is the joint probability distribution defined by the Bayesian network that does not contain that arc but is otherwise identical.
- We interpret this difference, D<sub>KL</sub>(P‖Q), as the "force of the arc" or Arc Force.
- Note that Filtered Values are taken into account when computing the Arc Force.
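The computation described above can be sketched in a few lines of Python. This is a minimal illustration, not BayesiaLab's implementation: it assumes a hypothetical two-node network A → B with binary nodes, where P is the joint distribution with the arc and Q is the joint distribution of the otherwise identical network without the arc (so B falls back to its marginal distribution).

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum over x of P(x) * log2(P(x)/Q(x)).

    Terms with P(x) = 0 contribute 0 by convention.
    """
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

# Hypothetical network A -> B with binary nodes (illustrative parameters).
p_a = [0.3, 0.7]                          # P(A)
p_b_given_a = [[0.9, 0.1], [0.2, 0.8]]    # P(B | A)

# P: joint distribution WITH the arc, P(a, b) = P(a) * P(b | a)
P = [p_a[a] * p_b_given_a[a][b] for a in range(2) for b in range(2)]

# Q: same network WITHOUT the arc, so B uses its marginal:
# Q(a, b) = P(a) * P(b)
p_b = [sum(p_a[a] * p_b_given_a[a][b] for a in range(2)) for b in range(2)]
Q = [p_a[a] * p_b[b] for a in range(2) for b in range(2)]

# The divergence between the two joint distributions is the Arc Force
# of A -> B. It is always >= 0 and is 0 only if the arc carries no
# information, i.e., A and B are independent.
arc_force = kl_divergence(P, Q)
print(round(arc_force, 4))
```

With the illustrative parameters above the Arc Force is strictly positive, since B clearly depends on A; setting both rows of `p_b_given_a` equal would make it 0.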
Throughout this website, we use Kullback-Leibler Divergence, KL Divergence, and Arc Force interchangeably.