Section outline

  • The module presents the fundamental concepts, challenges, architectures and methodologies of deep learning. We introduce the learning of neural representations from vectorial, sequential and image data, covering both supervised and unsupervised learning and touching on various forms of weak supervision. Models covered include: deep autoencoders, convolutional neural networks, long short-term memory, gated recurrent units, advanced recurrent architectures, sequence-to-sequence models, neural attention, Transformers, and neural Turing machines. Methodological lectures are complemented by introductory seminars on Keras/TensorFlow and PyTorch.

    Date | Topic | References | Additional Material
    20 03/04/2025
    Deep Autoencoders
    Sparse, denoising and contractive AE; deep RBM (code sketch below)
    [SD] Coverage of the Prince book on this lecture is inadequate, but you can use the lecture slides and complement them with the additional material if necessary (e.g. chapter 14 of the Deep Learning book).
    Additional Readings
    [15] DBN: the paper that started deep learning
    [16] Deep Boltzmann machines paper
    [17] Review paper on deep generative models
    [18] Long review paper on autoencoders from the perspective of representation learning
    [19] Paper discussing regularized autoencoders as approximations of the likelihood gradient
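
    As a pointer for self-study, a minimal denoising autoencoder sketch in PyTorch (not course code; layer sizes and the noise level are illustrative): the input is corrupted and the network is trained to reconstruct the clean version.

        import torch
        import torch.nn as nn

        class DenoisingAE(nn.Module):
            def __init__(self, in_dim=784, hidden_dim=64):  # illustrative sizes
                super().__init__()
                self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
                self.decoder = nn.Sequential(nn.Linear(hidden_dim, in_dim), nn.Sigmoid())

            def forward(self, x):
                return self.decoder(self.encoder(x))

        model = DenoisingAE()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        x = torch.rand(32, 784)                  # stand-in batch; use real data in practice
        x_noisy = x + 0.3 * torch.randn_like(x)  # corrupt the input
        loss = nn.functional.mse_loss(model(x_noisy), x)  # target is the clean input
        loss.backward()
        opt.step()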
    21 08/04/2025
    (11-13)
    Convolutional Neural Networks I
    Introduction to the deep learning module; introduction to CNNs; basic CNN elements (code sketch below)
    [SD] Chapter 10

    Additional Readings

    [20-24] Original papers for LeNet, AlexNet, VGGNet, GoogLeNet and ResNet.
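
    To make the basic CNN elements concrete, a minimal PyTorch sketch (channel counts and input size are illustrative, not taken from the lecture) stacking convolution, nonlinearity and pooling:

        import torch
        import torch.nn as nn

        net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3-channel input, 16 filters
            nn.ReLU(),
            nn.MaxPool2d(2),                             # halves the spatial resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                     # global average pooling
            nn.Flatten(),
            nn.Linear(32, 10),                           # class scores
        )
        print(net(torch.randn(1, 3, 32, 32)).shape)      # torch.Size([1, 10])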

    22 09/04/2025
    (16-18)
     Convolutional Neural Networks II
    CNN architectures for image recognition; convolution visualization; advanced topics (deconvolution, dense nets); applications and code (code sketch below)

    [SD] Chapter 10

    Additional Readings
    [25] Complete summary of convolution arithmetic
    [26] Seminal paper on batch normalization
    [27] CNN interpretation using deconvolutions
    [28] CNN interpretation with Grad-CAM
    [29] Seminal paper on dilated convolutions

    [30] Object detection by Faster RCNN
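
    A small probe of the convolution arithmetic summarized in [25], including the dilated convolutions of [29] (PyTorch; the input size and settings are illustrative). For a kernel of size k, the output side is floor((n + 2p - d(k-1) - 1)/s) + 1:

        import torch
        import torch.nn as nn

        x = torch.randn(1, 1, 32, 32)  # one 32x32 single-channel image
        for stride, padding, dilation in [(1, 1, 1), (2, 1, 1), (1, 2, 2)]:
            conv = nn.Conv2d(1, 1, kernel_size=3, stride=stride,
                             padding=padding, dilation=dilation)
            # floor((32 + 2*padding - dilation*(3-1) - 1) / stride) + 1
            print(stride, padding, dilation, tuple(conv(x).shape))
        # -> 32x32 (same), 16x16 (strided), 32x32 (dilated: same size, larger receptive field)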

    23 10/04/2025
    (14-16)
    Gated Recurrent Networks I
    Deep learning for sequence processing; gradient issues (code sketch below)
    [SD] Coverage of the Prince book on this lecture is inadequate. You can use the course slides for this topic and, if you like, integrate them with chapter 10 of the Deep Learning Book.
    Additional Readings
    [31] Paper describing gradient vanishing/explosion
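
    The gradient issues described in [31] can be probed directly; a small PyTorch sketch (sizes illustrative) backpropagating from the last output of a vanilla tanh RNN and inspecting the gradient norm at each input step:

        import torch
        import torch.nn as nn

        torch.manual_seed(0)
        rnn = nn.RNN(input_size=8, hidden_size=8, nonlinearity='tanh')
        x = torch.randn(100, 1, 8, requires_grad=True)  # 100 time steps, batch of 1
        out, _ = rnn(x)
        out[-1].sum().backward()               # backprop from the last step only
        g = x.grad.norm(dim=-1).squeeze()      # gradient norm per time step
        print(g[:3])   # early steps: norms typically shrink toward zero (vanishing)
        print(g[-3:])  # recent steps: norms stay sizeable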
    24 11/04/2025
    (14-16)
    ROOM D3
    Gated Recurrent Networks II
    Long short-term memory; gated recurrent units; generative use of RNNs (code sketch below)
    RECOVERY LECTURE
     

    Additional Readings
    [32] Original LSTM paper
    [33] A historical view on gated RNNs

    [34] Gated recurrent units paper
    [35] Seminal paper on dropout regularization
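
    A minimal sketch of the generative use of RNNs via next-step prediction with an LSTM (PyTorch; all sizes illustrative): the network reads each sequence and is trained to predict every element from its predecessors.

        import torch
        import torch.nn as nn

        lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
        head = nn.Linear(32, 16)        # maps hidden states back to input space
        x = torch.randn(4, 20, 16)      # stand-in batch of 20-step sequences
        h, _ = lstm(x[:, :-1])          # read steps 0..18
        pred = head(h)                  # predict steps 1..19
        loss = nn.functional.mse_loss(pred, x[:, 1:])
        loss.backward()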


    25 15/04/2025
    (11-13)
    Attention-based architectures
    Sequence-to-sequence; attention modules; transformers and vision transformers (code sketch below)
    [SD] Chapter 12
    Additional Readings
    [36,37] Models of sequence-to-sequence and image-to-sequence transduction with attention
    [38] Seminal paper on Transformers 
    [39] Transformers in vision
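
    The scaled dot-product attention at the core of the Transformer [38] reduces to a few lines; a sketch (shapes illustrative):

        import math
        import torch

        def attention(q, k, v):
            # softmax(Q K^T / sqrt(d)) V
            scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
            return torch.softmax(scores, dim=-1) @ v

        q = torch.randn(2, 5, 64)   # (batch, queries, dim)
        k = torch.randn(2, 7, 64)   # (batch, keys, dim)
        v = torch.randn(2, 7, 64)   # one value per key
        print(attention(q, k, v).shape)  # torch.Size([2, 5, 64])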
    26 16/04/2025
    (16-18)

    Coding practice I - Guest lecture by Riccardo Massidda

    PyTorch

       
    27 17/04/2025
    (14-16)

    Coding practice II - Guest lecture by Riccardo Massidda

    Keras/TensorFlow

       
      18/04/2025 - 25/04/2025

    Spring Break: No Lectures

       
    28 29/04/2025
    (11-13)

    Memory-based models
    Multiscale networks; hierarchical models; memory networks; neural Turing machines (code sketch below)
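
    A sketch of content-based addressing in the spirit of neural Turing machines (PyTorch; slot count, slot width and the sharpening factor beta are illustrative): a key is compared to every memory slot by cosine similarity and the read is the softmax-weighted sum of the slots.

        import torch
        import torch.nn.functional as F

        def content_read(memory, key, beta=5.0):
            # cosine similarity between the key and each memory slot, sharpened by beta
            sim = F.cosine_similarity(memory, key.unsqueeze(0), dim=-1)
            w = torch.softmax(beta * sim, dim=0)  # attention weights over slots
            return w @ memory                     # read vector

        memory = torch.randn(128, 20)  # 128 slots of width 20
        key = torch.randn(20)
        print(content_read(memory, key).shape)  # torch.Size([20])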