Lecture 15.2: (convex) nondifferentiable optimization, converging against all odds
Negative results for nondifferentiable optimization: a fixed stepsize just won't work. Wrap-up: why nondifferentiable optimization is (much) harder than differentiable optimization. Subgradient methods: basic ideas, DSS and Polyak stepsizes, efficiency estimates. Making the Polyak stepsize possible when the optimal value is unknown: the target-level stepsize. MATLAB implementation of the subgradient method and its behaviour in practice. Take-away: with the right parameter tuning you can make it work, but it is not fast and you never really know when to stop.
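The Polyak stepsize alpha_k = ( f(x_k) - f* ) / ||g_k||^2 needs the optimal value f*, which is normally unknown; the target-level variant replaces f* with the best value found so far minus a displacement delta. A minimal MATLAB sketch of this idea follows; it is only illustrative (the handle f_and_subgrad, the parameter delta and the stopping rule are assumptions, not the lecture's actual implementation):

```matlab
function [xbest, fbest] = subgradient_sketch(f_and_subgrad, x0, delta, maxit)
% Subgradient method with a target-level (Polyak-type) stepsize.
% f_and_subgrad(x) must return [ f(x) , one subgradient g of f at x ].
x = x0; xbest = x0; fbest = inf;
for k = 1 : maxit
    [fx, g] = f_and_subgrad(x);
    if fx < fbest               % keep the record value: f(x_k) need not decrease
        fbest = fx; xbest = x;
    end
    nrm2 = norm(g)^2;
    if nrm2 == 0                % 0 is a subgradient ==> x is optimal
        break;
    end
    flev  = fbest - delta;      % target level standing in for the unknown f*
    alpha = (fx - flev) / nrm2; % Polyak formula with f* replaced by flev
    x = x - alpha * g;          % subgradient step
end
end
```

For instance, for f(x) = ||x||_1 one subgradient is sign(x), so the sketch can be called with f_and_subgrad = @(x) deal(norm(x, 1), sign(x)). Note that, consistently with the take-away above, the loop has no reliable optimality test: it simply runs for maxit iterations and returns the record point.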