University of California San Diego

****************************

Center for Computational Mathematics Seminar

Valentin Duruisseaux - Graduate Student

UC San Diego

A Variational Approach to Accelerated Optimization

Abstract:

Efficient optimization has become one of the major concerns in data analysis. There has been a lot of focus on first-order optimization algorithms because of their low cost per iteration. In 1983, Nesterov's accelerated gradient method (NAG) was shown to converge at a rate of $O(1/k^2)$ to the minimum of a convex objective function $f(x)$, improving on the $O(1/k)$ convergence rate exhibited by standard gradient descent methods; this improvement is the phenomenon referred to as acceleration. It was later shown that NAG limits to a second-order ODE as the time step goes to 0, and that the objective function $f(x(t))$ converges to its optimal value at a rate of $O(1/t^2)$ along the trajectories of this ODE. In this talk, we will discuss how the convergence of $f(x(t))$ can be accelerated in continuous time to an arbitrary convergence rate $O(1/t^p)$ in normed spaces, by considering flow maps generated by a family of time-dependent Bregman Lagrangian and Hamiltonian systems which is closed under time rescaling. We will then discuss how this variational framework can be exploited, together with the time-invariance property of the family of Bregman Lagrangians, using adaptive geometric integrators to design efficient explicit algorithms for symplectic accelerated optimization. Finally, we will briefly discuss the generalization from normed spaces to Riemannian manifolds.
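For context, the objects mentioned in the abstract have standard forms in the optimization literature; the following is a brief sketch, assuming the talk builds on the frameworks of Su, Boyd, and Candès (2014) and Wibisono, Wilson, and Jordan (2016). The second-order ODE limit of NAG is

$$\ddot{x}(t) + \frac{3}{t}\,\dot{x}(t) + \nabla f(x(t)) = 0,$$

along whose trajectories $f(x(t)) - f(x^*) = O(1/t^2)$. The Bregman Lagrangian family takes the form

$$\mathcal{L}(x, v, t) = e^{\alpha_t + \gamma_t}\left( D_h\!\left(x + e^{-\alpha_t} v,\, x\right) - e^{\beta_t} f(x) \right),$$

where $D_h$ denotes the Bregman divergence of a convex distance-generating function $h$. Under the ideal scaling conditions $\dot{\beta}_t \le e^{\alpha_t}$ and $\dot{\gamma}_t = e^{\alpha_t}$, solutions of the associated Euler-Lagrange equations satisfy $f(x(t)) - f(x^*) \le O(e^{-\beta_t})$, and the polynomial choice $e^{\alpha_t} = p/t$, $e^{\beta_t} = C t^p$, $e^{\gamma_t} = t^p$ yields the arbitrary rate $O(1/t^p)$ referred to above.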

February 2, 2021

10:00 AM

Zoom Meeting ID: 950 6794 9984

****************************