Lecture 11.2: the Armijo-Wolfe (A-W) line search in practice
Aggregation of the criteria
MATLAB implementation of the gradient method for nonlinear functions. Examples on different functions (Rosenbrock's), behaviour on relevant test cases, and the role of the (too many) different line-search parameters. Take away: the gradient method may be rather slow, but it has at least one pro in that it rarely gets trapped in local minima, because it is too "dumb" to see them. An extreme example with the many-minima Ackley function: if by chance the direction is pointing you at the very right spot, then a good (but not too good) LS will give you a great solution quickly. Take away: the A-W LS works pretty well as it is, as far as it can go, i.e., not far unless the direction happens to be very good.
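The lecture's code is in MATLAB; as a rough illustration of the ideas above, here is a minimal Python sketch of the gradient method with a bisection-based A-W line search, tried on the Rosenbrock function. All names (`armijo_wolfe_ls`, `gradient_method`) and the parameter defaults (`m1`, `m2`, the starting point `[-1.2, 1]`) are my own choices for illustration, not the course's actual implementation.

```python
import numpy as np

def rosenbrock(x):
    # Classic ill-conditioned test function; minimum at (1, 1) with value 0.
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def armijo_wolfe_ls(f, g, x, d, m1=1e-4, m2=0.9, a=1.0, max_it=50):
    # Find a step 'a' along direction d satisfying the Armijo (sufficient
    # decrease) and weak Wolfe (curvature) conditions, by expanding the
    # step while too short and bisecting once a bracket [lo, hi] is found.
    phi0 = f(x)
    dphi0 = g(x) @ d          # directional derivative at a = 0 (must be < 0)
    lo, hi = 0.0, np.inf
    for _ in range(max_it):
        if f(x + a * d) > phi0 + m1 * a * dphi0:
            hi = a            # Armijo violated: step too long
        elif g(x + a * d) @ d < m2 * dphi0:
            lo = a            # curvature violated: step too short
        else:
            return a          # both conditions hold
        a = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return a                  # give up after max_it refinements

def gradient_method(f, g, x0, eps=1e-5, max_it=50000):
    # Steepest descent: d = -grad; stop when the gradient is small.
    x = np.asarray(x0, dtype=float)
    for k in range(max_it):
        d = -g(x)
        if np.linalg.norm(d) <= eps:
            return x, k
        x = x + armijo_wolfe_ls(f, g, x, d) * d
    return x, max_it

if __name__ == "__main__":
    x, k = gradient_method(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
    print(f"reached {x} in {k} iterations")
```

Running this from the standard starting point `[-1.2, 1]` shows the "rather slow" behaviour mentioned above: the iteration count runs into the thousands on Rosenbrock's narrow curved valley, even though the line search itself is doing its job at every step.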