Lecture 2.2: starting very very easy and very slowly ramping up
Aggregation of criteria
Issue: optimization is impossible / very hard unless the problem is very kind to you and/or very small. "Know your enemy" and "choose wisely the battles you fight". Starting (too) simple: optimizing univariate linear and quadratic functions, possibly on an interval. An excessively involved proof of the optimality conditions for the nonhomogeneous quadratic case (but a sign of things to come). Preliminaries to (unconstrained) multivariate optimization, the necessary concepts in \R^n: vector space, scalar product, norm, distance. Picturing functions in \R^n: (epi)graph, (sub)level sets and tomography. First examples: linear functions.
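
A minimal worked instance of the nonhomogeneous quadratic case over an interval (a sketch under assumed notation, not the lecture's own proof): writing the objective as f(x) = (1/2) a x^2 + b x with a, b \in \R and the feasible set as [l, u], the minimizer follows from a case analysis on the sign of a.

\[
  \min \;\{\, f(x) = \tfrac{1}{2}\, a x^{2} + b x \;:\; x \in [\,l, u\,] \,\}
\]
\[
  a > 0:\quad \bar{x} = \max\{\, l ,\, \min\{\, u ,\, -b/a \,\} \,\}
  \quad\text{(the unconstrained minimizer $-b/a$ projected onto $[l,u]$)},
\]
\[
  a = 0:\quad \bar{x} = l \ \text{if}\ b > 0, \qquad \bar{x} = u \ \text{if}\ b < 0, \qquad
  \text{any}\ \bar{x} \in [l,u] \ \text{if}\ b = 0,
\]
\[
  a < 0:\quad \bar{x} \in \{\, l, u \,\},\ \ f(\bar{x}) = \min\{\, f(l), f(u) \,\}
  \quad\text{(a concave parabola attains its minimum at an endpoint).}
\]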
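
For quick reference, the standard definitions behind the \R^n concepts listed above (standard notation, assumed rather than copied from the lecture; the name S(f,v) for the sublevel set is my own choice):

\[
  \langle x , y \rangle = \sum_{i=1}^{n} x_i y_i , \qquad
  \| x \| = \sqrt{\langle x , x \rangle} , \qquad
  d(x,y) = \| x - y \| ,
\]
\[
  \operatorname{epi}(f) = \{\, (x,v) \in \R^n \times \R \;:\; v \ge f(x) \,\} , \qquad
  S(f,v) = \{\, x \in \R^n \;:\; f(x) \le v \,\} .
\]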
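
And a tiny Python sketch (hypothetical, not course material) of the "tomography" idea: studying a function of several variables through its one-dimensional restrictions along a line, applied to the first example of a linear function and then to a quadratic one.

import numpy as np

def tomography(f, x0, d, ts):
    # univariate restriction phi(t) = f(x0 + t*d), sampled at the points in ts
    return np.array([f(x0 + t * d) for t in ts])

# first example: a linear function f(x) = <c, x>; every restriction is affine in t,
# phi(t) = <c, x0> + t*<c, d>, so all its "slices" are straight lines
c = np.array([1.0, -2.0])
f_lin = lambda x: float(c @ x)

x0 = np.array([0.0, 0.0])           # point the slice passes through
d  = np.array([1.0, 1.0])           # direction of the slice
ts = np.linspace(-2.0, 2.0, 5)      # sample points along the slice

print(tomography(f_lin, x0, d, ts))     # [ 2.  1.  0. -1. -2.]

# a quadratic f(x) = 1/2 x'Qx + q'x restricts to a univariate quadratic in t:
# phi(t) = 1/2 (d'Qd) t^2 + (q'd + x0'Qd) t + f(x0)
Q = np.array([[2.0, 0.0], [0.0, 1.0]])
q = np.array([-1.0, 0.0])
f_quad = lambda x: float(0.5 * x @ Q @ x + q @ x)

print(tomography(f_quad, x0, d, ts))    # [ 8.   2.5  0.   0.5  4. ]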