Structural Coefficient


  • BayesiaLab utilizes proprietary score-based learning algorithms.

  • As opposed to the constraint-based algorithms that use independence tests for adding or removing arcs between nodes, BayesiaLab employs the Minimum Description Length Score (MDL Score) to measure the quality of candidate networks with respect to the available data.

Structural Coefficient

  • In BayesiaLab, the computation of the MDL Score also includes the so-called Structural Coefficient α as a weighting factor for the structural component DL(B).

  • With that, the MDL Score is calculated using the following formula:

MDL(B, D) = α × DL(B) + DL(D|B)

  • As a result, the choice of value for the Structural Coefficient α affects the relative weighting of the two components DL(B) and DL(D|B).

  • You can set the Structural Coefficient α to any value within the range of 0 to 150.

  • α = 1, the default value, means that the components DL(B) and DL(D|B) are weighted equally.

  • α < 1 reduces the contribution of DL(B) in the MDL Score formula and, thus, allows for more "structural complexity."

  • α > 1 increases the contribution of DL(B) in the MDL Score formula, i.e., it penalizes "structural complexity," forcing a simpler model.
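
The weighting trade-off described above can be sketched in a few lines of Python. The description-length values below are made-up illustrative numbers, not output from BayesiaLab, and `mdl_score` is a hypothetical helper, not a BayesiaLab API:

```python
def mdl_score(dl_structure: float, dl_data: float, alpha: float) -> float:
    """MDL(B, D) = alpha * DL(B) + DL(D|B); lower scores are better."""
    return alpha * dl_structure + dl_data

# Two hypothetical candidate networks for the same data:
# a sparse one that fits the data less well, and a denser one that fits better.
simple_net = {"dl_structure": 50.0, "dl_data": 1200.0}    # few arcs, worse fit
complex_net = {"dl_structure": 200.0, "dl_data": 1100.0}  # many arcs, better fit

for alpha in (0.2, 1.0, 2.0):
    s = mdl_score(simple_net["dl_structure"], simple_net["dl_data"], alpha)
    c = mdl_score(complex_net["dl_structure"], complex_net["dl_data"], alpha)
    winner = "complex" if c < s else "simple"
    print(f"alpha={alpha}: simple={s:.0f}, complex={c:.0f} -> {winner} network wins")
```

With these numbers, the denser network wins only at α = 0.2; at α = 1 and above, the structural penalty is large enough that the simpler network achieves the lower score.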

  • There is another way to interpret the Structural Coefficient α, which can help in understanding its role in learning a Bayesian network.

    • Weighting DL(B) with a factor α is equivalent to changing the original number of observations N in a dataset to a new number of observations N′:

N′ = N / α
  • An α value of 0 would be the same as having an infinite number of observations N′. As a result, the MDL Score would only be based on the fit component of the score, i.e., DL(D|B), and BayesiaLab's structural learning algorithms would produce a fully connected network.

  • At the other extreme, an α value of 150 would massively favor the simplest possible network structures, as the new equivalent number of observations N′ would be only 1/150th of N.

  • It is perhaps more intuitive to consider the new number of observations N′ as weighted counts of the actual observations N. For instance, α = 0.5 is equivalent to counting all observations twice.
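
The equivalent-sample-size interpretation above can be verified with plain arithmetic. This is an illustrative sketch, not a BayesiaLab function:

```python
def equivalent_sample_size(n: int, alpha: float) -> float:
    """N' = N / alpha: the effective number of observations implied by alpha."""
    if alpha == 0:
        # alpha = 0 behaves like infinitely many observations:
        # only the fit component DL(D|B) matters.
        return float("inf")
    return n / alpha

N = 1000
print(equivalent_sample_size(N, 1.0))  # 1000.0 -> default: N' equals N
print(equivalent_sample_size(N, 0.5))  # 2000.0 -> each observation counted twice
print(equivalent_sample_size(N, 150))  # ~6.7   -> only 1/150th of N remains
print(equivalent_sample_size(N, 0))    # inf    -> fit term dominates completely
```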

  • From a practical perspective, the Structural Coefficient α can be considered a kind of "significance" threshold for structural learning.

    • The higher you set the α value, the higher the threshold for discovering probabilistic relationships. Conversely, the lower you set the α value, the lower the discovery threshold, so even weak probabilistic relationships would still be found and represented by arcs.

    • Reducing α can be helpful if you have a small dataset from which you want to learn a model. At the default value, α = 1, the learning algorithm might not find any arcs at all.

    • However, choosing too low a value might result in "overfitting," i.e., learning "insignificant" relationships and discovering patterns in what turns out to be mere noise.

    • BayesiaLab can help reduce the risk of overfitting with the Structural Coefficient Analysis feature.


Copyright © 2024 Bayesia S.A.S., Bayesia USA, LLC, and Bayesia Singapore Pte. Ltd. All Rights Reserved.