Section outline
---------------
The module introduces learning in probabilistic models. We will discuss fundamental algorithms and concepts, including Expectation-Maximization, sampling, and variational approximations, and we will study relevant models from the three fundamental paradigms of probabilistic learning: Bayesian networks, Markov networks, and dynamic models. Models covered include Hidden Markov Models, Markov Random Fields, Boltzmann Machines, and latent topic models.
Date / Topic / References / Additional Material

Lecture 7: 04/03/2026 (16-18)
Learning with fully observed variables
Learning as inference; flavors of probabilistic learning; maximum likelihood learning with fully observed variables; Naïve Bayes
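With fully observed variables, maximum likelihood estimation for a discrete Naïve Bayes model reduces to counting: class priors and per-feature conditionals are just relative frequencies. The toy sketch below (dataset, variable names, and helper function are all illustrative, not from the course materials) shows these closed-form MLE updates.

```python
from collections import Counter, defaultdict

def fit_naive_bayes(data):
    """Maximum-likelihood estimates for a discrete Naive Bayes model.

    data: list of (features, label) pairs, features a tuple of discrete values.
    Returns class priors P(y) and per-feature conditionals P(x_i | y),
    both as plain relative frequencies (the closed-form MLE with full data).
    """
    class_counts = Counter(label for _, label in data)
    n = len(data)
    priors = {y: c / n for y, c in class_counts.items()}

    # cond[(i, y)][v] = count of feature i taking value v in class-y examples
    cond = defaultdict(Counter)
    for features, label in data:
        for i, v in enumerate(features):
            cond[(i, label)][v] += 1

    conditionals = {
        key: {v: c / sum(counts.values()) for v, c in counts.items()}
        for key, counts in cond.items()
    }
    return priors, conditionals

# Hypothetical fully observed dataset: (weather, traffic) -> arrival
data = [
    (("rain", "heavy"), "late"),
    (("rain", "light"), "late"),
    (("sun", "light"), "on_time"),
    (("sun", "heavy"), "on_time"),
    (("sun", "light"), "on_time"),
]
priors, conditionals = fit_naive_bayes(data)
print(priors["on_time"])                    # 3/5 = 0.6
print(conditionals[(0, "on_time")]["sun"])  # 3/3 = 1.0
```

Because every variable is observed, no iteration is needed; the contrast with the hidden-variable case in the next lecture is exactly that these counts are no longer available and must be replaced by expected counts.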
Lecture 8: 05/03/2026 (11-13)
Learning with hidden variables
Latent/hidden variable models; maximum likelihood learning with latent variables; the Expectation-Maximization algorithm; exact learning in mixture models
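The EM updates for mixture models can be sketched in a few lines. The example below (my own illustrative code, not the course's) fits a mixture of two unit-variance Gaussians: the E-step computes each point's responsibility under component 0, and the M-step re-estimates the means and mixing weight from those soft counts. Fixing the variances at 1 is an assumption made here to keep the updates minimal.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of a univariate Gaussian."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_two_gaussians(xs, iters=50):
    """EM for a mixture of two unit-variance Gaussians (toy sketch).

    Only the means and the mixing weight are estimated; variances are
    fixed at 1 to keep the E- and M-steps as short as possible.
    """
    mu = [min(xs), max(xs)]   # crude but effective initialization
    pi = 0.5                  # mixing weight of component 0
    for _ in range(iters):
        # E-step: responsibility of component 0 for each point
        r = []
        for x in xs:
            p0 = pi * normal_pdf(x, mu[0], 1.0)
            p1 = (1 - pi) * normal_pdf(x, mu[1], 1.0)
            r.append(p0 / (p0 + p1))
        # M-step: weighted averages maximize the expected log-likelihood
        n0 = sum(r)
        n1 = len(xs) - n0
        mu[0] = sum(ri * x for ri, x in zip(r, xs)) / n0
        mu[1] = sum((1 - ri) * x for ri, x in zip(r, xs)) / n1
        pi = n0 / len(xs)
    return mu, pi

random.seed(0)
xs = [random.gauss(-2, 1) for _ in range(200)] + [random.gauss(3, 1) for _ in range(200)]
mu, pi = em_two_gaussians(xs)
```

Each iteration is guaranteed not to decrease the data log-likelihood, which is the key property of EM that the lecture develops; with the synthetic data above, the estimated means land near the true values of -2 and 3.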