Section outline

• The module introduces probabilistic learning, causal models, generative modelling and Bayesian learning. We will discuss fundamental algorithms and concepts, including Expectation-Maximization, sampling and variational approximations, and we will study relevant models from the three fundamental paradigms of probabilistic learning, namely Bayesian networks, Markov networks and dynamic models. Models covered include: Bayesian networks, hidden Markov models, Markov random fields, Boltzmann machines and latent topic models.

Schedule (each entry lists the lecture number, date and time, topic, references, and additional material)

Lecture 5 (26/02/2025, 16-18)
Introduction to Generative Graphical Models I
Topics: probability refresher
References: [BRML] Ch. 1 and 2 (Refresher)
Lecture 6 (27/02/2025, 14-16)
Introduction to Generative Graphical Models II
Topics: graphical model representation; directed and undirected models
References: [BRML] Sect. 3.1, 3.2 and 3.3.1 (conditional independence)
Software
• Pyro - Python library based on PyTorch
• PyMC3 - Python library based on Theano
• Edward - Python library based on TensorFlow
• TensorFlow Probability - probabilistic models and deep learning in TensorFlow
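For a first feel of these tools, here is a minimal sketch of a two-variable directed model in Pyro; the rain/wet variables and all probability values are hypothetical, not course material:

    import torch
    import pyro
    import pyro.distributions as dist

    def wet_grass_model():
        # Directed factorization P(rain) P(wet | rain) over two binary variables.
        rain = pyro.sample("rain", dist.Bernoulli(0.2))
        p_wet = torch.where(rain.bool(), torch.tensor(0.9), torch.tensor(0.1))
        wet = pyro.sample("wet", dist.Bernoulli(p_wet))
        return rain, wet

    print(wet_grass_model())  # one joint draw, obtained by ancestral sampling

Each pyro.sample statement introduces a node of the graphical model, so running the function top to bottom samples the graph in topological order.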
Lecture 7 (04/03/2025, 11-13)
Conditional Independence: Representation and Learning - Part I
Topics: Bayesian networks; representing joint distributions; conditional independence
Guest lecture by Riccardo Massidda
References: [BRML] Sect. 3.3 (directed models and conditional independence)
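As a concrete illustration, the following NumPy sketch (CPT values are hypothetical) builds the joint distribution of the chain a -> b -> c from its directed factorization and numerically verifies the conditional independence a ⊥ c | b implied by the structure:

    import numpy as np

    p_a = np.array([0.6, 0.4])               # P(a)
    p_b_a = np.array([[0.7, 0.3],            # P(b | a), rows indexed by a
                      [0.2, 0.8]])
    p_c_b = np.array([[0.9, 0.1],            # P(c | b), rows indexed by b
                      [0.4, 0.6]])

    # Joint from the BN factorization P(a, b, c) = P(a) P(b | a) P(c | b).
    joint = np.einsum('a,ab,bc->abc', p_a, p_b_a, p_c_b)

    # a ⊥ c | b  <=>  P(a, c | b) = P(a | b) P(c | b).
    p_b = joint.sum(axis=(0, 2))
    p_ac_given_b = joint / p_b[None, :, None]
    p_a_given_b = joint.sum(axis=2) / p_b[None, :]
    p_c_given_b = joint.sum(axis=0) / p_b[:, None]
    assert np.allclose(p_ac_given_b,
                       p_a_given_b[:, :, None] * p_c_given_b[None, :, :])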
05/03/2025 (16-18): LECTURE CANCELLED DUE TO STUDENT ASSEMBLY
Lecture 8 (06/03/2025, 14-16)
Conditional Independence: Representation and Learning - Part II
Topics: d-separation; Markov properties; faithfulness; Markov models
Guest lecture by Riccardo Massidda
References:
[BRML] Sect. 4.1, 4.2.0-4.2.2 (undirected models and Markov properties)
[BRML] Sect. 4.5 (expressiveness)
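d-separation can be checked mechanically with the classical ancestral moral graph construction; here is an illustrative sketch using networkx (the helper function and the toy collider graph are mine, not from the course material):

    import networkx as nx

    def d_separated(dag, xs, ys, zs):
        """True iff xs and ys are d-separated by zs in the DAG."""
        xs, ys, zs = set(xs), set(ys), set(zs)
        # 1. Keep only the variables involved and their ancestors.
        keep = set()
        for node in xs | ys | zs:
            keep |= nx.ancestors(dag, node) | {node}
        sub = dag.subgraph(keep)
        # 2. Moralize: marry parents of a common child, drop directions.
        moral = nx.Graph(sub.to_undirected())
        for child in sub.nodes:
            parents = list(sub.predecessors(child))
            for i in range(len(parents)):
                for j in range(i + 1, len(parents)):
                    moral.add_edge(parents[i], parents[j])
        # 3. Delete the conditioning set; d-separation = no path left.
        moral.remove_nodes_from(zs)
        return all(not nx.has_path(moral, x, y)
                   for x in xs for y in ys if x in moral and y in moral)

    g = nx.DiGraph([("a", "c"), ("b", "c"), ("c", "d")])  # collider at c
    print(d_separated(g, {"a"}, {"b"}, set()))  # True: the collider blocks the path
    print(d_separated(g, {"a"}, {"b"}, {"d"}))  # False: conditioning on a descendant of c opens it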
     
Lecture 9 (11/03/2025, 11-13)
Graphical Causal Models
Topics: causation and correlation; causal Bayesian networks; structural causal models; causal inference
Guest lecture by Riccardo Massidda
References: Barber's book is minimal on causality (only Section 3.4). My suggestion is that you complement the content of the slides (which is sufficient for the exam) with readings from this book, namely:
- Chapters 2 & 3 (high-level introduction to causality)
- Sections 6.1-6.5 (more technical discussion of the lecture content)
If you are interested in deepening your knowledge of causality, this is an excellent book (also freely available online): Jonas Peters, Dominik Janzing, Bernhard Schölkopf, Elements of Causal Inference: Foundations and Learning Algorithms, MIT Press.
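To see the gap between conditioning and intervening that structural causal models make precise, here is a small NumPy simulation of a hypothetical linear SCM with a confounder z; conditioning on x picks up the backdoor path through z, whereas do(x) severs it:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # Hypothetical SCM:  z := u_z;  x := z + u_x;  y := 2x + 3z + u_y
    z = rng.normal(size=n)
    x_obs = z + rng.normal(size=n)
    y_obs = 2 * x_obs + 3 * z + rng.normal(size=n)

    # Observational conditioning: E[y | x ≈ 1] ≈ 3.5, confounded since E[z | x] = x/2.
    print(y_obs[np.abs(x_obs - 1) < 0.05].mean())

    # Intervention do(x = 1): replace the mechanism for x, keep everything else.
    y_do = 2 * np.ones(n) + 3 * z + rng.normal(size=n)
    print(y_do.mean())    # ≈ 2, the causal effect alone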
Lecture 10 (12/03/2025, 16-18)
Structure Learning and Causal Discovery
Topics: constraint-based methods; score-based methods; parametric assumptions
Guest lecture by Riccardo Massidda
References:
[BRML] Sect. 9.5.1 (PC algorithm)
[BRML] Sect. 9.5.2 (independence testing)
[BRML] Sect. 9.5.3 (structure scoring)
Additional Readings
[3] A short review of BN structure learning
[4] PC algorithm with consistent ordering for large-scale data
[5] MMHC - hybrid structure learning algorithm
Software
- A selection of BN structure learning libraries in Python: pgmpy, bnlearn, pomegranate
- bnlearn: the most consolidated and efficient library for BN structure learning (in R)
- Causal learner: a mixed R-Matlab package integrating over 26 BN structure learning algorithms
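As a small taste of score-based structure learning with the libraries above, the following sketch compares the BIC of two candidate DAGs on toy synthetic data; it assumes a recent pgmpy exposing BayesianNetwork and BicScore:

    import numpy as np
    import pandas as pd
    from pgmpy.models import BayesianNetwork
    from pgmpy.estimators import BicScore

    rng = np.random.default_rng(0)
    a = rng.integers(0, 2, 5000)
    b = (a ^ (rng.random(5000) < 0.2)).astype(int)   # b noisily copies a
    c = (b ^ (rng.random(5000) < 0.2)).astype(int)   # c noisily copies b
    data = pd.DataFrame({"a": a, "b": b, "c": c})

    bic = BicScore(data)
    chain = BayesianNetwork([("a", "b"), ("b", "c")])   # the true structure
    wrong = BayesianNetwork([("a", "c"), ("b", "c")])   # misses the a -> b edge
    print(bic.score(chain), bic.score(wrong))           # the chain should score higher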

Lecture 11 (13/03/2025, 14-16)
Hidden Markov Models - Part I
Topics: learning in directed graphical models; generative models for sequential data; hidden/latent variables; inference problems on sequential data
References: [BRML] Sect. 23.1.0 (Markov models)
Additional Readings
[6] A classical tutorial introduction to HMMs
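Since an HMM is a generative model, sampling from it is a natural first exercise; a minimal NumPy sketch with hypothetical transition, emission and initial distributions:

    import numpy as np

    rng = np.random.default_rng(0)
    A = np.array([[0.9, 0.1],           # P(h_t | h_{t-1}), 2 hidden states
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.2, 0.1],      # P(v_t | h_t), 3 observation symbols
                  [0.1, 0.3, 0.6]])
    pi = np.array([0.5, 0.5])           # P(h_1)

    def sample_hmm(T):
        # Ancestral sampling: walk the state chain, emitting a symbol at each step.
        states, obs = [], []
        h = rng.choice(2, p=pi)
        for _ in range(T):
            states.append(h)
            obs.append(rng.choice(3, p=B[h]))
            h = rng.choice(2, p=A[h])
        return states, obs

    print(sample_hmm(10))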
14/03/2025 (14-16): RECOVERY LECTURE CANCELLED DUE TO HYDROLOGICAL RISK
Lecture 12 (18/03/2025, 11-13)
Hidden Markov Models - Part II
Topics: forward-backward algorithm; learning as inference; EM algorithm
References:
[BRML] Sect. 23.2.0-23.2.4 (HMM and forward-backward)
[BRML] Sect. 23.3.1-23.3.4 (EM and learning)
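A compact NumPy sketch of the forward-backward recursions with per-step rescaling (all parameter values hypothetical); the smoothed posteriors it returns are exactly the statistics needed in the E-step of EM:

    import numpy as np

    def forward(obs, A, B, pi):
        # alpha[t] = P(h_t | v_{1:t}); the scaling constants accumulate
        # into the sequence log-likelihood log P(v_{1:T}).
        T, S = len(obs), len(pi)
        alpha, loglik = np.zeros((T, S)), 0.0
        for t in range(T):
            a = pi * B[:, obs[0]] if t == 0 else (alpha[t - 1] @ A) * B[:, obs[t]]
            c = a.sum()
            alpha[t], loglik = a / c, loglik + np.log(c)
        return alpha, loglik

    def backward(obs, A, B):
        T, S = len(obs), A.shape[0]
        beta = np.ones((T, S))
        for t in range(T - 2, -1, -1):
            b = A @ (B[:, obs[t + 1]] * beta[t + 1])
            beta[t] = b / b.sum()       # rescaled; only ratios matter below
        return beta

    def smooth(obs, A, B, pi):
        # gamma[t, i] = P(h_t = i | v_{1:T})
        g = forward(obs, A, B, pi)[0] * backward(obs, A, B)
        return g / g.sum(axis=1, keepdims=True)

    A = np.array([[0.9, 0.1], [0.2, 0.8]])
    B = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
    pi = np.array([0.5, 0.5])
    print(smooth([0, 0, 2, 2, 1], A, B, pi))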

     
Lecture 13 (19/03/2025, 16-18)
Hidden Markov Models - Part III
Topics: Viterbi algorithm; dynamic Bayesian networks
References: [BRML] Sect. 23.2.6 (Viterbi)
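The Viterbi recursion as a NumPy sketch, run in log space for stability (same hypothetical 2-state HMM as the sketches above):

    import numpy as np

    def viterbi(obs, A, B, pi):
        # Most likely state path argmax_h P(h_{1:T} | v_{1:T}).
        T, S = len(obs), len(pi)
        logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
        delta = np.zeros((T, S))             # best log-score ending in each state
        back = np.zeros((T, S), dtype=int)   # backpointers
        delta[0] = logpi + logB[:, obs[0]]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + logA   # scores[i, j]: path i -> j
            back[t] = scores.argmax(axis=0)
            delta[t] = scores.max(axis=0) + logB[:, obs[t]]
        path = [int(delta[-1].argmax())]
        for t in range(T - 1, 0, -1):               # follow backpointers home
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    A = np.array([[0.9, 0.1], [0.2, 0.8]])
    B = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
    pi = np.array([0.5, 0.5])
    print(viterbi([0, 0, 2, 2, 1], A, B, pi))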
Lecture 14 (20/03/2025, 14-16)
Markov Random Fields I
Topics: learning in undirected graphical models
References:
[BRML] Sect. 4.2.2, 4.2.5 (MRF)
[BRML] Sect. 4.4 (factor graphs)
[BRML] Sect. 5.1.1 (variable elimination and inference on a chain)
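On a chain, variable elimination reduces to matrix-vector products passed from one end to the other; a small sketch with one hypothetical pairwise potential shared by all edges, checked against brute-force enumeration:

    import numpy as np

    psi = np.array([[2.0, 1.0],          # pairwise potential psi(x_t, x_{t+1})
                    [1.0, 3.0]])         # over binary states (hypothetical)

    def eliminate_chain(psis):
        # Z = sum_x prod_t psi(x_t, x_{t+1}), eliminating the last variable first.
        m = np.ones(psis[0].shape[0])
        for p in reversed(psis):
            m = p @ m                    # m[x_t] = sum_{x_{t+1}} psi(x_t, x_{t+1}) m(x_{t+1})
        return m.sum()

    Z = eliminate_chain([psi, psi, psi])   # 3 edges -> a chain of 4 variables

    # Brute-force check over all 2^4 configurations.
    Z_brute = sum(psi[a, b] * psi[b, c] * psi[c, d]
                  for a in range(2) for b in range(2)
                  for c in range(2) for d in range(2))
    assert np.isclose(Z, Z_brute)
    print(Z)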

     
Lecture 15 (21/03/2025, 14-16) - RECOVERY LECTURE, ROOM L1
Markov Random Fields II
Topics: conditional random fields; pattern recognition applications
References: [BRML] Sect. 9.6.0, 9.6.1, 9.6.4, 9.6.5 (learning in MRF/CRF)
Additional Readings
[7, 8] Two comprehensive tutorials on CRF ([7] more introductory, [8] more focused on vision)
[9] A nice application of CRF to image segmentation
Lecture 16 (25/03/2025, 11-13)
Bayesian Learning I
Topics: principles of Bayesian learning; EM algorithm objective; principles of variational approximation; latent topic models
References: [BRML] Sect. 11.2.1 (variational EM)
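The variational identity behind EM, log p(v) = ELBO(q) + KL(q || p(h | v)), can be checked numerically on a tiny discrete model (all numbers hypothetical):

    import numpy as np

    p_h = np.array([0.3, 0.7])             # prior P(h) over a binary hidden variable
    p_v_h = np.array([0.9, 0.2])           # likelihood P(v | h) for the observed v
    joint = p_h * p_v_h                    # P(h, v)
    log_pv = np.log(joint.sum())           # exact log-evidence

    q = np.array([0.5, 0.5])               # an arbitrary variational distribution
    elbo = (q * (np.log(joint) - np.log(q))).sum()
    post = joint / joint.sum()             # exact posterior P(h | v)
    kl = (q * (np.log(q) - np.log(post))).sum()

    assert np.isclose(log_pv, elbo + kl)   # holds for any q; EM raises the ELBO
    print(log_pv, elbo, kl)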
Lecture 17 (26/03/2025, 16-18)
Bayesian Learning II
Topics: Latent Dirichlet Allocation (LDA); LDA learning; machine vision applications of latent topic models
References: [BRML] Sect. 20.4-20.6.1 (LDA)
Additional Readings
[10] LDA foundation paper
[11] A gentle introduction to latent topic models
[12] Foundations of bag-of-words image representation
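A minimal usage sketch of LDA with scikit-learn's variational implementation on a made-up toy corpus (gensim's LdaModel would be an equally common choice):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["dog barks at the cat", "cat chases the mouse",
            "stocks fell on monday", "markets rally as stocks rise"]
    X = CountVectorizer(stop_words="english").fit_transform(docs)  # bag of words

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topics = lda.fit_transform(X)   # per-document topic proportions (theta)
    print(doc_topics.round(2))
    print(lda.components_.shape)        # per-topic word statistics (unnormalized beta)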

Lecture 18 (27/03/2025, 14-16)
Bayesian Learning III
Topics: sampling methods; ancestral sampling; Gibbs sampling
References: [BRML] Sect. 27.1 (sampling), Sect. 27.2 (ancestral sampling), Sect. 27.3 (Gibbs sampling)
Additional Readings
[13] A step-by-step derivation of collapsed Gibbs sampling for LDA
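Gibbs sampling in its simplest setting: for a bivariate Gaussian with correlation rho both full conditionals are univariate Gaussians, so the sampler just alternates the two draws (rho and the sample counts are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)
    rho, n_samples = 0.8, 20_000
    x = y = 0.0
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        x = rng.normal(rho * y, np.sqrt(1 - rho**2))   # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))   # y | x ~ N(rho*x, 1 - rho^2)
        samples[i] = x, y

    burned = samples[1000:]                 # discard burn-in
    print(np.corrcoef(burned.T)[0, 1])      # should approach rho = 0.8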

     

01/04/2025 (11-13): NO LECTURE (instructor not available); will be recovered on April 11th, h. 14.00
Lecture 19 (02/04/2025, 16-18)
Boltzmann Machines
Topics: bridging neural networks and generative models; stochastic neurons; restricted Boltzmann machines; contrastive divergence and Gibbs sampling in use
Additional Readings
[14] A clean and clear introduction to RBM from its author
Software
Matlab code for Deep Belief Networks (i.e. stacked RBMs) and Deep Boltzmann Machines
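A minimal NumPy sketch of one CD-1 update for a binary RBM (hypothetical sizes, biases omitted for brevity): the weight gradient is approximated by the difference between data-driven statistics and those of a one-step Gibbs reconstruction:

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    n_vis, n_hid, lr = 6, 3, 0.1
    W = 0.01 * rng.normal(size=(n_vis, n_hid))

    def cd1_step(v0, W):
        # One Gibbs half-chain v0 -> h0 -> v1 -> h1 with stochastic hidden units.
        p_h0 = sigmoid(v0 @ W)
        h0 = (rng.random(p_h0.shape) < p_h0) * 1.0
        p_v1 = sigmoid(h0 @ W.T)                     # reconstruction
        v1 = (rng.random(p_v1.shape) < p_v1) * 1.0
        p_h1 = sigmoid(v1 @ W)
        grad = v0.T @ p_h0 - v1.T @ p_h1             # positive minus negative phase
        return W + lr * grad / len(v0)

    batch = (rng.random((16, n_vis)) < 0.5) * 1.0    # toy binary data batch
    W = cd1_step(batch, W)
    print(W.shape)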