INF - e-learning - Dipartimento di Informatica

Computational Mathematics for Learning and Data Analysis - AA 2024/25


Lecture 11.2: the A-W LS in practice


MATLAB implementation of the gradient method for nonlinear functions. Examples on different functions (Rosenbrock's), behaviour on relevant test cases, the role of the (too many) different line search parameters. Take away: the gradient method may be rather slow, but it has at least one pro in that it rarely falls into local minima, because it is too "dumb" to see them. An extreme example with the many-minima Ackley function: if by chance the direction is pointing you at the very right spot, then a good (but not too good) LS will give you a great solution quickly. Take away: the A-W LS works pretty well as it is, as far as it can go, i.e., not much unless the direction happens to be very good.
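The lecture's experiments are in MATLAB; the same setup can be sketched in Python. Below is a minimal, hypothetical implementation of the gradient method with an Armijo-Wolfe line search, tried on the Rosenbrock function from the classic starting point (-1.2, 1). Parameter names (m1, m2) follow the usual convention for the Armijo and curvature constants; all values are illustrative, not the ones used in the lecture.

```python
import numpy as np

def rosenbrock(x):
    # f(x, y) = 100 (y - x^2)^2 + (1 - x)^2, minimum at (1, 1)
    return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

def rosenbrock_grad(x):
    return np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                     200 * (x[1] - x[0]**2)])

def armijo_wolfe(f, g, x, d, m1=1e-4, m2=0.9, a=1.0, max_iter=50):
    """Bisection-style search for a step a satisfying the (weak) A-W conditions:
    Armijo:    f(x + a d) <= f(x) + m1 a g(x)'d
    curvature: g(x + a d)'d >= m2 g(x)'d
    d is assumed to be a descent direction (g(x)'d < 0)."""
    phi0, dphi0 = f(x), g(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + a * d) > phi0 + m1 * a * dphi0:
            hi = a                    # Armijo fails: step too long
        elif g(x + a * d) @ d < m2 * dphi0:
            lo = a                    # curvature fails: step too short
        else:
            return a                  # both conditions hold
        a = 2 * lo if hi == np.inf else 0.5 * (lo + hi)
    return a                          # safeguard: give up after max_iter

def gradient_method(f, g, x0, eps=1e-5, max_iter=100000):
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        d = -g(x)                     # steepest descent direction
        if np.linalg.norm(d) <= eps:  # stop when the gradient is ~ zero
            break
        x = x + armijo_wolfe(f, g, x, d) * d
    return x

x_star = gradient_method(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

As the lecture notes, convergence is reached but is slow: the method takes thousands of iterations to crawl along Rosenbrock's curved valley, and the result is fairly insensitive to the exact m1, m2 values as long as they are sane.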

Previous: Lecture 11.1: convergence with the A-W LS, theory
Next: Lecture 12.1: "extremely inexact LS": fixed stepsize
