Lecture 5.2: sublinear convergence and where this leads us
Plotting the convergence in the positive semidefinite case and seeing sublinear convergence "in the flesh". Cautionary tales about stopping criteria: different tolerances for the input and output error, and what the norm of the gradient really measures. Why this matters: it is a sign of things to come. The same algorithm can behave very differently on different problems. You can construct cleverer algorithms, but there is a limit, so choose your foes (problems) wisely if you can.
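
A minimal sketch of this kind of experiment (a toy setup of my own, not the lecture's actual notebook): run the same fixed-step gradient descent on a strongly convex quadratic and on a merely convex quartic, and plot f(x_k) - f(x*) on a log scale. The quadratic gives a straight line (linear, i.e. geometric, convergence), the quartic a curve that flattens out (sublinear convergence); the final print shows that on the quartic the gradient norm is already tiny while the iterate is still far from the minimiser, which is why the two stopping tolerances are not interchangeable.

import numpy as np
import matplotlib.pyplot as plt

def gradient_descent(grad, x0, step, iters):
    # Plain fixed-step gradient descent; keep the whole iterate history.
    xs = [x0]
    for _ in range(iters):
        xs.append(xs[-1] - step * grad(xs[-1]))
    return np.array(xs)

# Problem A: strongly convex quadratic f(x) = x^2 / 2, minimiser x* = 0.
xs_quad = gradient_descent(lambda x: x, x0=1.0, step=0.1, iters=200)
# Problem B: convex but not strongly convex quartic f(x) = x^4 / 4, x* = 0.
xs_quart = gradient_descent(lambda x: x**3, x0=1.0, step=0.1, iters=200)

k = np.arange(201)
plt.semilogy(k, 0.5 * xs_quad**2, label="quadratic: linear convergence")
plt.semilogy(k, 0.25 * xs_quart**4, label="quartic: sublinear convergence")
plt.xlabel("iteration k")
plt.ylabel("f(x_k) - f(x*)")
plt.legend()
plt.show()

# Stopping criteria: on the quartic the gradient norm |x_k|^3 is orders of
# magnitude smaller than the distance |x_k - x*| to the minimiser.
print(f"gradient norm: {abs(xs_quart[-1])**3:.1e}, distance to x*: {abs(xs_quart[-1]):.1e}")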