BayesiaLab
Kullback-Leibler Divergence & Arc Force

Kullback-Leibler Divergence (Arc Force)

Definition

  • In BayesiaLab, the Kullback-Leibler Divergence (or KL Divergence) is used to measure the strength of the relationship between two nodes that are directly connected by an arc.
  • In this context, we also commonly refer to the KL Divergence as Arc Force.
  • Formally, the Kullback-Leibler Divergence $D_{KL}$ measures the difference between two distributions $P$ and $Q$:
$$D_{KL}(P(\mathcal{X})\|Q(\mathcal{X}))=\sum_{\mathcal{X}}P(\mathcal{X})\log_2\frac{P(\mathcal{X})}{Q(\mathcal{X})}$$
  • For our purposes, $P$ is the Bayesian network that contains the arc for which we wish to compute the Arc Force, and $Q$ is the Bayesian network that is identical except that it does not contain that arc.
  • We interpret this difference $D_{KL}$ as the "force of the arc," or Arc Force.
  • Note that Filtered Values are taken into account for computing the Arc Force.

Throughout this website, we use Kullback-Leibler Divergence, KL Divergence, and Arc Force interchangeably.
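To make the definition concrete, here is a minimal sketch of the computation in Python. The two-node example is hypothetical and not drawn from BayesiaLab: `p` is the joint distribution of a network with an arc X → Y, and `q` is the joint distribution of the otherwise identical network with the arc removed (so X and Y are independent, with the same marginals).

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P||Q) in bits (log base 2).

    p and q map each joint state of the variables to its probability;
    states with p == 0 contribute nothing to the sum.
    """
    total = 0.0
    for state, p_x in p.items():
        if p_x > 0:
            total += p_x * math.log2(p_x / q[state])
    return total

# Hypothetical network P: joint P(X, Y) with a dependency between X and Y.
p = {("x0", "y0"): 0.40, ("x0", "y1"): 0.10,
     ("x1", "y0"): 0.10, ("x1", "y1"): 0.40}

# Network Q drops the arc but keeps the same marginals:
# P(X=x0) = P(Y=y0) = 0.5, so every joint state has probability 0.25.
q = {state: 0.25 for state in p}

# The KL Divergence between the two networks is the Arc Force of X -> Y.
print(round(kl_divergence(p, q), 4))
```

Because Q is obtained from P only by deleting the arc, the divergence isolates the contribution of that single arc: the stronger the dependency the arc encodes, the larger the Arc Force.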



Copyright © 2024 Bayesia S.A.S., Bayesia USA, LLC, and Bayesia Singapore Pte. Ltd. All Rights Reserved.