Log-Loss
The Log-Loss $LL_B(E)$ reflects the number of bits required to encode an $n$-dimensional piece of evidence (or observation) $E$ given the current Bayesian network $B$:

$$LL_B(E) = -\log_2\big(P_B(E)\big)$$

where $P_B(E)$ is the joint probability of the evidence $E$ computed by the network $B$.

As a shorthand for "the number of bits required to encode," we use the term "cost" in the sense that "more bits required" means computationally "more expensive."

In other words, the lower the probability of $E$ given the network $B$, the higher the Log-Loss $LL_B(E)$.
Note that $E$ refers to a single piece of $n$-dimensional evidence, not an entire dataset.
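To make the formula concrete, here is a minimal sketch, assuming a hypothetical two-node network $A \rightarrow B$ with made-up conditional probability tables (this is an illustration of the Log-Loss calculation itself, not BayesiaLab's internal API):

```python
# Hypothetical two-node Bayesian network A -> B; the CPT values are
# assumptions chosen for illustration only.
from math import log2

# P(A)
p_a = {"true": 0.3, "false": 0.7}
# P(B | A)
p_b_given_a = {
    "true":  {"true": 0.9, "false": 0.1},
    "false": {"true": 0.2, "false": 0.8},
}

def joint_probability(a: str, b: str) -> float:
    """P_B(E) for the 2-dimensional evidence E = {A=a, B=b}, via the chain rule."""
    return p_a[a] * p_b_given_a[a][b]

def log_loss(a: str, b: str) -> float:
    """LL_B(E) = -log2(P_B(E)): the number of bits needed to encode E given B."""
    return -log2(joint_probability(a, b))

# A likely observation costs few bits; an unlikely one costs many.
print(log_loss("false", "false"))  # P_B(E) = 0.56 -> ~0.84 bits
print(log_loss("true", "false"))   # P_B(E) = 0.03 -> ~5.06 bits
```

The two calls at the end illustrate the inverse relationship: the less probable the evidence is under the network, the higher its Log-Loss.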