Section outline
We close the gap between neural networks and probabilistic learning by discussing generative deep learning models. We introduce a general taxonomy of the existing learning models and study in depth the relevant families of models for each element of the taxonomy: autoregressive generation, variational autoencoders, generative adversarial networks, diffusion models, and flow-based methods.
Lecture 29 - 30/04/2025 (16-18): Explicit Density Learning
Topics: explicit distribution models; neural ELBO; variational autoencoders
References: [SD] Chapter 14 (generative learning), Chapter 17 (VAE)
Additional Readings:
[40] PixelCNN - Explicit likelihood model
[41] Tutorial on VAE
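As a quick companion to the neural ELBO topic, here is a minimal PyTorch sketch of a VAE loss (negative ELBO with a Gaussian encoder and Bernoulli decoder). All dimensions, the architecture, and the random data are illustrative assumptions, not the course's reference code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    """Minimal VAE: Gaussian encoder q(z|x), Bernoulli decoder p(x|z)."""
    def __init__(self, x_dim=784, z_dim=16, h_dim=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(h_dim, z_dim)   # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))  # logits of p(x|z)

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def neg_elbo(x, logits, mu, logvar):
    # reconstruction term: -E_q[log p(x|z)] with a Bernoulli likelihood
    rec = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
    # KL(q(z|x) || N(0, I)) in closed form for a diagonal Gaussian posterior
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return (rec + kl) / x.shape[0]   # average over the batch

# usage on random data, just to show the shapes
model = TinyVAE()
x = torch.rand(32, 784)              # stand-in for binarized images
loss = neg_elbo(x, *model(x))
loss.backward()
```

The KL term has the usual closed form for a diagonal Gaussian posterior against a standard normal prior, and the reparameterization trick keeps the sampling step differentiable.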
Lecture 30 - 06/05/2025 (11-13): Implicit Models - Adversarial Learning
Topics: generative adversarial networks; Wasserstein GANs; conditional generation; notable GANs; adversarial autoencoders
References: [SD] Chapter 15
Additional Readings:
[42] Tutorial on GANs (here another online resource with GAN tips)
[43] Wasserstein GAN
[44] Tutorial on sampling neural networks
[45] Progressive GAN
[46] CycleGAN
[47] Seminal paper on adversarial autoencoders
Software:
- Official Wasserstein GAN code
- A (long) list of GAN models with (often) associated implementations
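To complement the adversarial learning lecture, a minimal sketch of one training step with the standard non-saturating GAN loss in PyTorch; network sizes, learning rates, and the data batch are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

z_dim, x_dim = 32, 784   # assumed latent and data dimensions
G = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(), nn.Linear(128, x_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(x_dim, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

real = torch.randn(64, x_dim)          # stand-in for a batch of real data

# discriminator step: push D(real) towards 1 and D(fake) towards 0
z = torch.randn(64, z_dim)
fake = G(z).detach()                   # do not backprop into G here
loss_d = F.binary_cross_entropy_with_logits(D(real), torch.ones(64, 1)) + \
         F.binary_cross_entropy_with_logits(D(fake), torch.zeros(64, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# generator step: non-saturating loss, push D(G(z)) towards 1
z = torch.randn(64, z_dim)
loss_g = F.binary_cross_entropy_with_logits(D(G(z)), torch.ones(64, 1))
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

For the Wasserstein variant ([43]), the discriminator becomes a critic with an unconstrained real-valued output, the two losses become D(fake).mean() - D(real).mean() and -D(fake).mean(), and the critic is kept approximately Lipschitz via weight clipping or a gradient penalty.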
Lecture 31 - 07/05/2025 (16-18): Diffusion Models I
Topics: noising-denoising processes; kernelized diffusion
References: [SD] Chapter 18
Additional Readings:
[48] Introductory and survey paper on diffusion models
[49] Seminal paper introducing diffusion models
[50] An interpretation of diffusion models as score matching
[51] Paper introducing the diffusion model reparameterization
[52] Diffusion beats GAN paper
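As an illustration of the noising process and of the reparameterization discussed in [51], a minimal DDPM-style sketch of the closed-form forward step and the noise-prediction training objective; the schedule values and the tiny network are placeholder assumptions:

```python
import torch
import torch.nn as nn

T = 1000
betas = torch.linspace(1e-4, 0.02, T)            # assumed linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)   # abar_t = prod_{s<=t} (1 - beta_s)

def q_sample(x0, t, noise):
    """Closed-form forward process: x_t = sqrt(abar_t) * x_0 + sqrt(1 - abar_t) * eps."""
    ab = alphas_bar[t].view(-1, 1)
    return ab.sqrt() * x0 + (1 - ab).sqrt() * noise

# tiny stand-in for the noise-prediction network eps_theta(x_t, t)
eps_model = nn.Sequential(nn.Linear(2 + 1, 64), nn.ReLU(), nn.Linear(64, 2))

x0 = torch.randn(128, 2)                         # stand-in data (2-D points)
t = torch.randint(0, T, (128,))
noise = torch.randn_like(x0)
x_t = q_sample(x0, t, noise)

# simple DDPM objective: predict the injected noise from (x_t, t)
inp = torch.cat([x_t, t.float().view(-1, 1) / T], dim=1)
loss = ((eps_model(inp) - noise) ** 2).mean()
loss.backward()
```

Because q(x_t | x_0) is Gaussian in closed form, x_t is sampled directly at any timestep and training never has to simulate the noising chain step by step.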
Lecture 32 - 08/05/2025 (14-16): Diffusion Models II
Topics: latent space diffusion; conditional diffusion models
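A short sketch of how conditional diffusion models are commonly sampled with classifier-free guidance; the guidance weight w and the eps_model interface below are assumptions for illustration, not a specific library API:

```python
import torch

def guided_noise(eps_model, x_t, t, cond, w=3.0):
    """Classifier-free guidance: blend conditional and unconditional noise predictions.
    Assumes eps_model(x_t, t, cond) accepts cond=None for the unconditional branch."""
    eps_cond = eps_model(x_t, t, cond)
    eps_uncond = eps_model(x_t, t, None)
    # w = 0: unconditional model; w = 1: plain conditional model; w > 1: stronger guidance
    return eps_uncond + w * (eps_cond - eps_uncond)

# dummy stand-in model, just to show the call
eps_model = lambda x, t, c: x * 0.1 if c is None else x * 0.2
print(guided_noise(eps_model, torch.randn(4, 2), t=10, cond="class 3").shape)
```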
Lecture 33 - 13/05/2025 (11-13): Normalizing Flow Models
Topics: probabilistic change of variables; forward/normalization pass; from 1D to multidimensional flows; survey of notable flow models; wrap-up of deep generative learning
References: [SD] Chapter 16
Additional Readings:
[53] Survey paper on normalizing flows
[54] RealNVP paper
[55] GLOW paper
[56] MADE autoregressive flow
Software:
- Normalizing flows are implemented natively in TensorFlow Probability
- Two PyTorch-based packages for normalizing flows: normflows (pure PyTorch) and FlowTorch (Pyro)
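As a companion to the change-of-variables topic, a minimal PyTorch sketch of a single affine (RealNVP-style) coupling layer and the exact log-likelihood it induces; the dimensions and the scale/shift network are placeholder assumptions:

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling: split x into (x1, x2), transform x2 conditioned on x1."""
    def __init__(self, dim=4, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(nn.Linear(self.d, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.d)))

    def forward(self, x):
        # normalization (forward) pass: x -> z, tracking log|det dz/dx|
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                    # keep scales bounded for stability
        z2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=1)               # triangular Jacobian: log|det| = sum(s)
        return torch.cat([x1, z2], dim=1), log_det

# change of variables: log p(x) = log N(z; 0, I) + log|det dz/dx|
flow = AffineCoupling(dim=4)
x = torch.randn(8, 4)
z, log_det = flow(x)
base = torch.distributions.Normal(0.0, 1.0)
log_px = base.log_prob(z).sum(dim=1) + log_det
print(log_px.shape)   # one log-density per sample
```

Stacking several such layers (permuting the split between them) gives RealNVP/GLOW-style models [54, 55]; the exact log-likelihood stays tractable because each layer's Jacobian determinant is cheap to compute.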