University of California San Diego

****************************

Computational Geometric Mechanics Research Seminar

Valentin Duruisseaux

UCSD

Accelerated Optimization via Geometric Numerical Integration

Abstract:

Efficient optimization has become one of the major concerns in machine learning, and there has been a lot of focus on first-order optimization algorithms because of their low cost per iteration. In 1983, Nesterov's Accelerated Gradient method (NAG) was shown to converge at a rate of $\mathcal{O}(1/k^2)$ to the minimum of a convex objective function $f$, improving on the $\mathcal{O}(1/k)$ convergence rate exhibited by standard gradient descent methods; this improvement is the phenomenon referred to as acceleration. It was later shown that NAG limits to a second-order ODE as the time-step goes to 0, and that the objective function $f(x(t))$ converges to its optimal value at a rate of $\mathcal{O}(1/t^2)$ along the trajectories of this ODE. In this talk, we will discuss how the convergence of $f(x(t))$ can be accelerated in continuous time to an arbitrary convergence rate $\mathcal{O}(1/t^p)$ in normed spaces, by considering flow maps generated by a family of time-dependent Bregman Lagrangian and Hamiltonian systems which is closed under time rescaling. We will then discuss how this variational framework can be exploited, together with the time-invariance property of the family of Bregman dynamics, using adaptive geometric integrators to design efficient explicit algorithms for accelerated optimization. We will then discuss how these results and computational methods can be generalized from normed spaces to Riemannian manifolds. Finally, we will discuss some practical considerations which can be used to improve the performance of the algorithms.
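
For reference, the second-order ODE limit of NAG mentioned in the abstract is commonly written as

$$\ddot{X}(t) + \frac{3}{t}\dot{X}(t) + \nabla f(X(t)) = 0,$$

and one standard form of the time-dependent Bregman Lagrangian family referred to above (following Wibisono, Wilson, and Jordan) is

$$\mathcal{L}(x, v, t) = e^{\alpha_t + \gamma_t}\left( D_h\!\left(x + e^{-\alpha_t} v,\, x\right) - e^{\beta_t} f(x) \right),$$

where $D_h$ is the Bregman divergence of a convex distance-generating function $h$ and $\alpha_t, \beta_t, \gamma_t$ are time-dependent scaling functions; under the ideal scaling conditions, $f(x(t))$ converges to its minimum at the rate $\mathcal{O}(e^{-\beta_t})$, which yields $\mathcal{O}(1/t^p)$ for the polynomial choice $\beta_t = p \log t$. A minimal sketch of the NAG iteration itself, assuming only a user-supplied gradient oracle grad_f and a fixed step size (illustrative names, not taken from the announcement):

import numpy as np

def nag(grad_f, x0, step, num_iters):
    # Nesterov's accelerated gradient: gradient step at the look-ahead
    # point y, followed by a momentum extrapolation with the standard
    # (k - 1)/(k + 2) coefficient, whose continuous limit is the ODE above.
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for k in range(1, num_iters + 1):
        x_next = y - step * grad_f(y)                   # gradient step
        y = x_next + (k - 1) / (k + 2) * (x_next - x)   # momentum step
        x = x_next
    return x

# Example: minimize f(x) = ||x||^2 / 2, whose gradient is simply x.
x_min = nag(grad_f=lambda x: x, x0=[1.0, -2.0], step=0.1, num_iters=200)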

October 25, 2022

9:30 AM

APM 6402

****************************