Section outline

  • The module introduces learning in probabilistic models. We will discuss fundamental algorithms and concepts, including Expectation-Maximization, sampling, and variational approximations, and we will study relevant models from the three fundamental paradigms of probabilistic learning: Bayesian networks, Markov networks, and dynamic models. Models covered include hidden Markov models, Markov random fields, Boltzmann machines, and latent topic models.

    Lecture 7 - 04/03/2026 (16-18)

    Learning with fully observed variables

    learning as inference; flavors of probabilistic learning; maximum likelihood learning with fully observed variables; Naïve Bayes

    [BRML] Sect. 9.1.1-9.1.1.3, 9.3, 10.1, 10.2

    Additional Material
    A dedicated chapter to deepen knowledge of fitting distributions by ML or MAP.
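As a concrete companion to the maximum-likelihood material, here is a minimal sketch of ML estimation for a binary Naïve Bayes classifier. It is not taken from [BRML]; the dataset, labels, and the add-one smoothing choice are all illustrative:

```python
from collections import Counter, defaultdict

# Toy dataset: two binary features per example (all values illustrative)
X = [(1, 0), (1, 1), (0, 0), (0, 1), (1, 1), (0, 0)]
y = ["spam", "spam", "ham", "ham", "spam", "ham"]

totals = Counter(y)
prior = {c: n / len(y) for c, n in totals.items()}   # ML class priors

ones = defaultdict(lambda: [0, 0])   # class -> count of 1s per feature
for xi, c in zip(X, y):
    for j, v in enumerate(xi):
        ones[c][j] += v
# P(x_j = 1 | c) with add-one (Laplace) smoothing to avoid zero estimates
lik = {c: [(ones[c][j] + 1) / (totals[c] + 2) for j in range(2)] for c in prior}

def predict(x):
    """Return argmax_c P(c) * prod_j P(x_j | c)."""
    def score(c):
        p = prior[c]
        for j, v in enumerate(x):
            p *= lik[c][j] if v else 1 - lik[c][j]
        return p
    return max(prior, key=score)
```

With the smoothing terms this is strictly a MAP-flavored estimate; dropping the +1/+2 leaves the pure ML relative frequencies discussed in the lecture.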
    Lecture 8 - 05/03/2026 (11-13)

    Learning with hidden variables

    latent/hidden variable models; maximum likelihood learning with latent variables; Expectation-Maximization algorithm; exact learning in mixture models

    [BRML] Sect. 11.1 (learning with latent variables)

    [BRML] 20.1, 20.2.1, 20.3 (mixture models)
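The E/M alternation for mixture models fits in a few lines. The toy example below (synthetic 1-D data, two components, variances fixed to 1 for brevity; all names and constants are ours, not from [BRML]) fits a two-component Gaussian mixture by EM:

```python
import math
import random

random.seed(0)
# Synthetic 1-D data from two well-separated clusters (illustrative)
data = [random.gauss(0, 1) for _ in range(100)] + \
       [random.gauss(5, 1) for _ in range(100)]

pi = [0.5, 0.5]     # mixing weights
mu = [-1.0, 1.0]    # component means (variance fixed at 1 for simplicity)

def normal_pdf(x, m):
    return math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)

for _ in range(30):
    # E-step: responsibilities r[i][k] = P(z_i = k | x_i, current params)
    resp = []
    for x in data:
        w = [pi[k] * normal_pdf(x, mu[k]) for k in range(2)]
        s = sum(w)
        resp.append([wk / s for wk in w])
    # M-step: re-estimate weights and means from the soft counts
    Nk = [sum(r[k] for r in resp) for k in range(2)]
    pi = [Nk[k] / len(data) for k in range(2)]
    mu = [sum(r[k] * x for r, x in zip(resp, data)) / Nk[k] for k in range(2)]
```

Each iteration provably does not decrease the data log-likelihood, which is exactly the property the EM derivation in the lecture establishes.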

     
    Lecture 9 - 10/03/2026 (11-13)

    Hidden Markov Models - Part I

    generative models for sequential data; inference problems on sequential data; forward-backward algorithm

    [BRML] Sect. 23.1.0 (Markov Models) 

    [BRML] Sect. 23.2.0-23.2.4 (HMM and forward-backward) 

    Additional Readings
    [1]  A classical tutorial introduction to HMMs
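The filtering half of forward-backward can be sketched directly from its recursion. The 2-state HMM below is a made-up example, not one from the references:

```python
def forward(pi, A, B, obs):
    """Return alpha[t][i] = p(o_1..o_t, s_t = i) via the forward recursion."""
    n = len(pi)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n)]]
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append([B[j][o] * sum(prev[i] * A[i][j] for i in range(n))
                      for j in range(n)])
    return alpha

# Toy 2-state HMM with binary observations (all numbers illustrative)
pi = [0.6, 0.4]                  # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]     # transition matrix
B = [[0.5, 0.5], [0.1, 0.9]]     # emission matrix
alpha = forward(pi, A, B, [0, 1, 1])
likelihood = sum(alpha[-1])      # p(o_1..o_T), summing out the last state
```

The backward pass has the mirror-image recursion; combining the two gives the smoothed posteriors used later for EM learning.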
    Lecture 10 - 11/03/2026 (16-18)

    Hidden Markov Models - Part II

    EM learning in HMMs; Viterbi algorithm; advanced models

    [BRML] Sect. 23.3.1-23.3.4 (EM and learning)

    [BRML] Sect. 23.2.6 (Viterbi)

    Software

    HMMLearn - Scikit-like library for HMMs

    HMMS - discrete and continuous time HMMs
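Viterbi decoding replaces the sum in the forward recursion with a max, plus backpointers. A minimal sketch on a made-up 2-state HMM (all parameters illustrative):

```python
def viterbi(pi, A, B, obs):
    """Most probable hidden state path by max-product dynamic programming."""
    n = len(pi)
    delta = [pi[i] * B[i][obs[0]] for i in range(n)]
    back = []
    for o in obs[1:]:
        new_delta, ptr = [], []
        for j in range(n):
            best = max(range(n), key=lambda i: delta[i] * A[i][j])
            ptr.append(best)
            new_delta.append(delta[best] * A[best][j] * B[j][o])
        delta = new_delta
        back.append(ptr)
    # Backtrack from the best final state through the stored pointers.
    path = [max(range(n), key=lambda i: delta[i])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Toy 2-state HMM (numbers are illustrative)
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.5, 0.5], [0.1, 0.9]]
path = viterbi(pi, A, B, [0, 1, 1])
```

In practice long sequences use log-probabilities to avoid underflow; the structure of the recursion is unchanged.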

    Lecture 11 - 12/03/2026 (11-13)

    Variational Inference

    learning and inference in intractable latent variable models; evidence lower bound (ELBO); generalized expectation-maximization

    [BRML] Sect. 11.2.1 (Variational EM)
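As a one-formula summary of the lecture's starting point: for any distribution q(z) over the latent variables, the log-likelihood splits into a lower bound plus a KL gap (a standard identity; the notation here is ours, not copied from [BRML]):

```latex
\log p(x \mid \theta)
  = \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z \mid \theta)}{q(z)}\right]}_{\mathrm{ELBO}(q,\,\theta)}
  + \mathrm{KL}\!\left(q(z) \,\middle\|\, p(z \mid x, \theta)\right)
  \;\ge\; \mathrm{ELBO}(q,\theta)
```

Since the KL term is non-negative, alternately maximizing the ELBO in q (generalized E-step) and in θ (M-step) yields generalized EM, and recovering exact EM when q equals the true posterior.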

     
    Lecture 12 - 17/03/2026 (11-13)

    Latent Dirichlet Allocation (LDA)

    latent topic models; probabilities as random variables; Dirichlet distribution; LDA learning by variational inference; LDA applications

    [BRML] Sect. 20.4-20.6.1  (LDA)

    Additional Readings
    [2] LDA foundation paper 
    [3] A gentle introduction to latent topic models

    Software
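The generative process that LDA inverts can be sketched in a few lines. Everything below (vocabulary size, number of topics, hyperparameter values, helper names) is illustrative, not from the LDA paper:

```python
import random

random.seed(0)

def dirichlet(alpha):
    # Sample from a Dirichlet by normalizing independent Gamma draws.
    g = [random.gammavariate(a, 1) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def choice(p):
    # Draw an index from a discrete distribution p.
    u, c = random.random(), 0.0
    for k, pk in enumerate(p):
        c += pk
        if u < c:
            return k
    return len(p) - 1

K, V, doc_len = 3, 10, 50   # topics, vocabulary size, words per document (toy)
# Per-topic word distributions, themselves Dirichlet-distributed
beta = [dirichlet([0.1] * V) for _ in range(K)]

def generate_doc():
    theta = dirichlet([0.5] * K)            # per-document topic proportions
    # Each word: draw a topic from theta, then a word from that topic.
    return [choice(beta[choice(theta)]) for _ in range(doc_len)]

doc = generate_doc()
```

This makes the "probabilities as random variables" point concrete: both theta and the rows of beta are themselves draws from Dirichlet distributions, which is what learning must integrate over.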
    Lecture 13 - 18/03/2026 (16-18)

    Sampling methods

    sampling fundamentals; ancestral sampling; Gibbs sampling; approximate LDA parameter learning via sampling

    [BRML] Sect. 27.1 (sampling), Sect. 27.2 (ancestral sampling), Sect. 27.3 (Gibbs sampling)

    Additional Readings
    [4] A step-by-step derivation of collapsed Gibbs sampling for LDA
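Before the LDA-specific collapsed sampler, the core Gibbs idea fits in a dozen lines: repeatedly resample each variable from its full conditional. This toy chain (target distribution and all constants chosen by us) samples a correlated bivariate Gaussian from its two one-dimensional conditionals:

```python
import random

random.seed(1)
rho = 0.8                     # target correlation (illustrative)
sd = (1 - rho ** 2) ** 0.5    # conditional std dev for a unit bivariate Gaussian
x = y = 0.0
samples = []
for t in range(20000):
    # Gibbs sweep: each variable is drawn from its full conditional.
    # For a zero-mean, unit-variance bivariate Gaussian: x | y ~ N(rho*y, 1-rho^2).
    x = random.gauss(rho * y, sd)
    y = random.gauss(rho * x, sd)
    if t >= 1000:             # discard burn-in samples
        samples.append((x, y))

# Empirical correlation (variances are 1 by construction of the target)
emp_rho = sum(u * v for u, v in samples) / len(samples)
```

The empirical correlation converges to the target value, illustrating why sampling each coordinate conditionally is enough to explore the joint distribution.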
    Lecture 14 - 19/03/2026 (11-13)

    Markov Random Fields 

    learning in undirected graphical models; conditional random fields; restricted Boltzmann machines; contrastive divergence and Gibbs sampling in practice

    [BRML] Sect. 4.2.2, 4.2.5 (MRF)

    [BRML] Sect. 4.4 (Factor Graphs)

    [BRML] Sect. 5.1.1 (Variable Elimination and Inference on Chain) 

    [BRML] Sect. 9.6.0, 9.6.1, 9.6.4, 9.6.5 (Learning in MRF/CRF)

    Additional Readings
    [5] A clean and clear introduction to RBM from its author
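Contrastive divergence with one Gibbs step (CD-1) for a binary RBM can be sketched compactly. The sizes, learning rate, training patterns, and helper names below are all our toy choices, not from [5]:

```python
import math
import random

random.seed(0)
nv, nh = 4, 2                        # visible/hidden unit counts (toy sizes)
W = [[random.gauss(0, 0.1) for _ in range(nh)] for _ in range(nv)]
a, b = [0.0] * nv, [0.0] * nh        # visible and hidden biases

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def hidden_probs(v):
    return [sigmoid(b[j] + sum(v[i] * W[i][j] for i in range(nv)))
            for j in range(nh)]

def visible_probs(h):
    return [sigmoid(a[i] + sum(h[j] * W[i][j] for j in range(nh)))
            for i in range(nv)]

def sample(p):
    return [1 if random.random() < pj else 0 for pj in p]

data = [[1, 1, 0, 0], [0, 0, 1, 1]]  # two binary patterns to memorize
lr = 0.1
for _ in range(2000):
    for v0 in data:
        p0 = hidden_probs(v0)            # positive-phase statistics
        h0 = sample(p0)
        v1 = sample(visible_probs(h0))   # one Gibbs step (the "1" in CD-1)
        p1 = hidden_probs(v1)            # negative-phase statistics
        for i in range(nv):
            for j in range(nh):
                W[i][j] += lr * (v0[i] * p0[j] - v1[i] * p1[j])
        for i in range(nv):
            a[i] += lr * (v0[i] - v1[i])
        for j in range(nh):
            b[j] += lr * (p0[j] - p1[j])

def free_energy(v):
    # Lower free energy <=> higher unnormalized probability under the RBM.
    return (-sum(a[i] * v[i] for i in range(nv))
            - sum(math.log(1 + math.exp(b[j] + sum(v[i] * W[i][j]
                                                   for i in range(nv))))
                  for j in range(nh)))
```

After training, the model assigns lower free energy (higher probability) to the training patterns than to unseen ones, which is the effect the positive/negative phase difference is designed to produce.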