Key Concepts
Context
- In BayesiaLab, nearly all learning and analysis functions are based on principles and metrics from the field of Information Theory.
- In this section, we summarize these concepts and relate them to the corresponding BayesiaLab functions.
- We also include several statistical concepts that are relevant for understanding BayesiaLab's estimates and visualizations.
Content
- Bayes Rule and Bayes Theorem
- Conditional Probability Table
- Contingency Table Fit
- Deviance
- Entropy
- Hellinger Distance
- Information Gain
- Joint Probability and Joint Probability Distribution
- Kullback-Leibler Divergence and Arc Force
- Latent Variables, Factors, and Hidden Nodes vs. Manifest Variables
- Log-Loss
- Markov Blanket
- Maximum-Likelihood Estimation
- Means and Values of Nodes
- Minimum Description Length Score (MDL Score)
- Mutual Information
- Normalized Mutual Information
- Pearson Correlation
- Structural Coefficient
- Total Effect
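As a taste of two of the concepts listed above, the following is a minimal sketch of Entropy and Mutual Information for discrete distributions. The function names and representation (a dict keyed by value pairs for the joint distribution) are our own illustration, not part of BayesiaLab's interface:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).

    `joint` maps (x, y) pairs to probabilities; the marginals
    p(x) and p(y) are computed by summing over the other variable.
    """
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))        # 1.0

# Two perfectly dependent binary variables share one bit of information.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # 1.0
```

Mutual Information is the basis of several BayesiaLab analyses (e.g., Arc Force via Kullback-Leibler Divergence is closely related), so this pair of functions covers the common starting point for the entries above.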