3-Day Introductory BayesiaLab Course in New York City
Regus, 111 W. 19th Street, 6th Floor, New York, NY 10011
October 24–26, 2017, 9 a.m. to 5 p.m. each day
Go beyond descriptive analytics and enter the realm of probabilistic and causal reasoning with Bayesian networks. Learn all about designing and machine-learning Bayesian networks with BayesiaLab (see the complete course program).
This highly acclaimed course gives you a comprehensive introduction that allows you to employ Bayesian networks for applied research across many fields, such as biostatistics, decision science, econometrics, ecology, marketing science, sensory research, and sociology, just to name a few.
The hallmark of this three-day course is that every segment on theory is immediately followed by a corresponding practice session using BayesiaLab. Thus, you have the opportunity to implement on your computer what the instructor just presented in his lecture. This includes knowledge modeling, probabilistic reasoning, causal inference, machine learning, probabilistic structural equation models, plus many more examples. Given the strictly limited class size, the instructor is always available to coach you one-on-one as you progress through the exercises.
After the course, you can continue your studies with the included 60-day license of BayesiaLab Professional. Additionally, two workbooks, plus numerous datasets and sample networks, help you experiment independently with Bayesian networks. To date, over 750 researchers from all over the world have taken this course (see testimonials). For most of them, Bayesian networks and BayesiaLab have become crucial tools in their research.
Day 1: Theoretical Introduction
- Bayesian networks for association analysis and causal analysis
- Bayesian networks for knowledge modeling and data mining
- How do Bayesian networks fit into the world of research and analytics?
- A timeline of Bayesian networks, from Bayes’ Theorem to Judea Pearl winning the Turing Award
- Timeline of Bayesia S.A.S.
Examples for Probabilistic Reasoning
- Interpreting results of medical tests
- Kahneman & Tversky’s Yellow Cab/White Cab example
- The Monty Hall Problem, solving a vexing puzzle with a Bayesian network
- The Kingborn High School
- Simpson’s Paradox: Observational Inference vs Causal Inference
- Probabilistic axioms
- Probability interpretation
- Probabilistic inference
- Particle interpretation
- Observation versus intervention
- Observational inference
- Joint probability distribution (JPD)
- JPD - particle interpretation
- Leveraging independence properties
- Product/chain rule for compact representation of JPD
- Qualitative part: structure
- Graph terminology
- Dependencies and independencies
- Information flow
- Quantitative part: parameters
- Inference in Bayesian networks
- Exact inference
- Approximate inference
- Example of probabilistic inference: alarm
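As a taste of the probabilistic reasoning covered on Day 1, here is a minimal Python sketch of Bayes' theorem applied to a hypothetical medical test (the course itself uses the BayesiaLab GUI, not code, and all numbers here are illustrative assumptions, not course data):

```python
# Bayes' theorem for a hypothetical diagnostic test.
# All probabilities below are illustrative assumptions.
prior = 0.01           # P(disease) — disease prevalence
sensitivity = 0.95     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# P(positive) via the law of total probability
p_positive = sensitivity * prior + false_positive * (1 - prior)

# P(disease | positive) via Bayes' theorem
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive test) = {posterior:.3f}")  # ≈ 0.161
```

Even with a 95%-sensitive test, the posterior probability of disease given a positive result is only about 16% — exactly the kind of counterintuitive result the medical-test and Yellow Cab/White Cab examples explore.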
Building Bayesian Networks Manually
- Brainstorming workflow
- Structural modeling
- Parametrical modeling
- Bayesia Expert Knowledge Elicitation Environment
Day 2: Machine Learning—Part 1
- Maximum Likelihood
- Introduction of Prior Knowledge
- Smooth Probability Estimation (Laplacian correction)
- Information as a measurable quantity
- Conditional Entropy
- Mutual Information
- Kullback-Leibler Divergence
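The information-theoretic quantities listed above can be sketched in a few lines of Python for a toy joint distribution (the numbers are illustrative only):

```python
import math

# Toy joint distribution P(X, Y) over two binary variables — illustrative only.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

h_x, h_y, h_xy = entropy(p_x), entropy(p_y), entropy(p_xy)

# Conditional entropy H(Y|X) = H(X,Y) - H(X)
h_y_given_x = h_xy - h_x

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = h_x + h_y - h_xy

# KL divergence between the joint and the independence model P(X)P(Y);
# this equals the mutual information.
kl = sum(p * math.log2(p / (p_x[x] * p_y[y]))
         for (x, y), p in p_xy.items() if p > 0)

print(f"H(X)={h_x:.3f}  H(Y|X)={h_y_given_x:.3f}  I(X;Y)={mi:.3f}  KL={kl:.3f}")
```

Mutual information of this kind is what BayesiaLab's learning algorithms use to measure the strength of probabilistic relationships between nodes.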
Unsupervised Structural Learning
- Minimum Description Length (MDL) score
- Structural Coefficient
- Minimum size of dataset
- Search Spaces
- Search Strategies
- Learning algorithms
- Maximum Weight Spanning Tree
- Taboo Search
- Taboo Order
- Data Perturbation
- Example: Dominick’s
- Data Import (Typing, Discretization)
- Dictionary of node comments
- Exclusion of a node
- Heuristic Search Algorithms
- Data Perturbation (Learning, Bootstrap)
- Choice of the Structural Coefficient
- Symmetric Layout
- Analysis of the model (Arc Force, Node Force, Pearson Coefficient)
- Distance Mapping
- Forbidden Arcs
- Manual Connections
- Learning Algorithms
- Augmented Naive
- Manual Augmented Naive
- Tree-Augmented Naive
- Sons & Spouses
- Markov Blanket
- Augmented Markov Blanket
- Minimal Augmented Markov Blanket
- Example: Microarray Analysis
- Data Import (Transpose, Row Identifier, Data Type, Not Distributed, Decision Tree Discretization)
- Target Node
- Heuristic Search Algorithms
- Targeted Evaluation (In-Sample, Out-of-Sample: K-Fold, Data Perturbation, Test Set)
- Smoothed Probability Estimation
- Feature Selection
- Analysis of the Model (Monitors, Mapping, Target Report, Influence Analysis, Target Sensitivity Analysis, Target Mean Analysis, Target Interpretation Tree)
- Evidence Scenario File
- Interactive Inference
- Adaptive Questionnaire
- Batch Labeling
Day 3: Machine Learning—Part 2
Semi-Supervised Learning—Variable Clustering
- Example: S&P 500 Analysis
- Variable Clustering
- Changing the number of Clusters
- Dynamic Dendrogram
- Dynamic Mapping
- Manual Modification of Clusters
- Manual Creation of Clusters
- Semi-Supervised Learning
- Search Tool
- Variable Clustering
- Synthesis of a Latent Variable
- Ordered Numerical Values
- Cluster Purity
- Cluster Mapping
- Contingency Table Fit
- Hypercube Cells Per State
- Example: Dominick’s Finer Foods
- Data Clustering (Algorithm, Numerical States)
- Quality Metrics
- Invert Node Selection
- Set a Target State
- Sort the Monitors by Target Value Correlation
- Conditional Mean Analysis (means, delta-means, radars)
- Target Dynamic Profile
- Target Optimization Tree
- Projection of the Cluster on other Variables
- Data Association
- Save Internal Dataset
Probabilistic Structural Equation Models
- PSEM Workflow
- Unsupervised Structural Learning
- Variable Clustering
- Data Clustering for each Cluster of Manifest Variables
- Connection of the Target Variable to the Factors
- Unsupervised Learning for Discovering the Path
- Example: The Perfume Market in France
- PSEM Workflow
- Displayed Classes
- Total Effects
- Direct Effects
- Direct Effect Contributions
- Multiple Clustering
- Structure Comparison Tool
- Dictionaries (Arcs, Costs)
- Fixed Arcs
- Taboo and Arc Constraints
- Export Variations
- Select Evidence Set
About the Instructor
Dr. Lionel Jouffe is co-founder and CEO of France-based Bayesia S.A.S. Lionel holds a Ph.D. in Computer Science from the University of Rennes and has been working in the field of Artificial Intelligence since the early 1990s. While working as a Professor/Researcher at ESIEA, Lionel started exploring the potential of Bayesian networks. Since co-founding Bayesia in 2001, he and his team have been working full-time on the development of BayesiaLab, which has since emerged as the leading software package for knowledge discovery, data mining, and knowledge modeling using Bayesian networks. BayesiaLab enjoys broad acceptance in academic communities as well as in business and industry.
Who should attend?
Applied researchers, statisticians, data scientists, data miners, decision scientists, biologists, ecologists, environmental scientists, epidemiologists, predictive modelers, econometricians, economists, market researchers, knowledge managers, marketing scientists, operations researchers, social scientists, students and teachers in related fields.
Prerequisites
- Basic data manipulation skills, e.g. with Excel.
- No prior knowledge of Bayesian networks is required.
- No programming skills are required. You will use the graphical user interface of BayesiaLab for all exercises.
For a general overview of this field of study, we suggest that you download a free copy of our book, Bayesian Networks & BayesiaLab. Although by no means mandatory, reading its first three chapters would be an excellent preparation for the course.
Terms & Conditions
- You may cancel your registration for a full refund of the course fees up to 30 days before the start of the course. If you cancel within 30 days of the event, your course fee will not be refunded. However, you will be able to apply 100% of the paid course fees towards future BayesiaLab courses.
- A 60-day license to the full version of BayesiaLab Professional Edition will be provided to all participants for installation on their computers prior to the event.
- Participants will be required to bring their own WiFi-enabled computer/laptop to the seminar (Windows XP, Vista, 7, 8, 10 or Mac OS X).
- The course fee includes all training materials.
- Accommodation is at the participants' own expense.
"I would absolutely recommend this course as a thorough and in-depth introduction to Bayesian Networks and the BayesiaLab package. The small class sizes also contributed to an enjoyable and engaging learning experience."—Brian Potter, Infotools (Introductory Course in Melbourne, November 2015).
"Overall, this training was outstanding. Lionel is a gifted teacher."
“A must-take course for anyone looking to leverage advanced Bayesian network techniques in virtually any domain.”—Alex Cosmas, Chief Scientist, Booz Allen Hamilton (Introductory Course in Los Angeles, June 2011).
“The BayesiaLab software is impressive in its sophistication and multi-faceted abilities as a decision support tool. I had been using it primarily as a modeling tool for deductive analysis. Taking this class opened my eyes to BayesiaLab's incredible data-mining abilities. If you are looking for something that will provide a totally new angle on business decision problems, this is it!”—Michael Ryall.
“This class can only be described as eye-opening, the tool as terrific. Some of the best instruction for the shortest period of time I’ve ever received. A seriously terrific job.”—Beau Martin, President of American Choice Modeling (Introductory Course in Chicago, July 2013).