Lecture 15.1: the scary world of nondifferentiable optimization
Aggregation of criteria.
Towards less-than-gradient methods: incremental approaches. The other common source of nondifferentiability: Lasso regularization. MATLAB implementation of a (very simple) nondifferentiable function. Testing all the methods seen so far on it and seeing them all break: they all fail differently, but fail they do. Subgradients and subdifferentials (in \R^{n+1} and in \R^n). Complexity results for nondifferentiable optimization. Negative results for nondifferentiable optimization: a fixed stepsize just won't work. Wrap-up: why nondifferentiable optimization is (much) harder than differentiable optimization.
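Since the Lasso is named as a source of nondifferentiability, it may help to recall the standard l_1-regularized least-squares formulation (the notation used in the course may differ); the regularization term is nondifferentiable wherever any component of w is zero:

    min_{w \in \R^n}  (1/2) ||X w - y||_2^2 + \lambda ||w||_1,   with   ||w||_1 = \sum_i |w_i|.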
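The lecture's actual MATLAB file is not reproduced here; as a placeholder, a minimal sketch assuming the textbook example f(x) = ||x||_1 (the function used in class may differ), returning the value and one subgradient. The name simpleNonDiff is just illustrative.

    function [v, g] = simpleNonDiff(x)
    % f(x) = ||x||_1: convex, piecewise linear, nondifferentiable
    % wherever some component of x is zero (assumed example function).
    v = sum(abs(x));
    % sign(x) is a valid subgradient everywhere; sign(0) = 0 picks the
    % zero element of the subdifferential [-1, 1] of |.| at 0.
    g = sign(x);
    end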
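For reference, the standard definition behind "subgradients and subdifferentials": for a convex f, a vector g is a subgradient at x if the corresponding affine minorant supports f at x, and the subdifferential is the set of all such g,

    \partial f(x) = { g \in \R^n : f(y) \ge f(x) + g^T (y - x)  for all y \in \R^n }.

The \R^{n+1} version states the same condition as a non-vertical supporting hyperplane to the epigraph of f. Wherever f is differentiable, \partial f(x) = { \nabla f(x) }; at kinks it is a larger convex set (e.g., the subdifferential of |x| at 0 is [-1, 1]).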
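A tiny illustration of the negative result on fixed stepsizes (a hypothetical experiment, not necessarily the one run in the lecture): on f(x) = |x|, subgradient steps with a constant stepsize eventually just bounce around the minimizer instead of converging to it.

    f  = @(x) abs(x);          % the simplest nondifferentiable convex function
    dg = @(x) sign(x);         % a subgradient of |x| (0 at x = 0)
    alpha = 0.1;               % fixed stepsize
    x = 0.95;                  % starting point, not a multiple of alpha
    for k = 1 : 100
        x = x - alpha * dg(x); % subgradient step with fixed stepsize
    end
    % The iterates end up oscillating between about -alpha/2 and +alpha/2:
    % they never converge to the minimizer x = 0, no matter how many steps.
    fprintf('final x = %g, f(x) = %g\n', x, f(x));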