University of California San Diego
****************************
Math 295 - Mathematics Colloquium
Michael Overton
Courant Institute of Mathematical Sciences
Nonsmooth, Nonconvex Optimization
Abstract:
There are many algorithms for minimization when the objective function is differentiable, convex, or has some other known structure, but few options when none of the above hold, particularly when the objective function is nonsmooth at minimizers, as is often the case in applications. We describe two simple algorithms for minimization of nonsmooth, nonconvex functions. Gradient sampling is a relatively new method that, although computationally intensive, has a nice convergence theory; the method is robust, and the convergence theory has recently been extended to constrained problems. BFGS is an old method, developed for smooth problems, for which we have very limited theoretical results, but some remarkable empirical observations, extensive success in applications, and a rather bold conjecture. Limited-memory BFGS is a popular extension for large problems, and it too is applicable to the nonsmooth case, although our experience with it is more mixed.
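For readers unfamiliar with gradient sampling, the following is a minimal Python sketch of one iteration in the spirit of the method the abstract describes: sample gradients in a small ball around the current point, find the minimum-norm vector in their convex hull, and take a descent step (shrinking the sampling radius when that vector is nearly zero). The function name grad_sampling_step, the sample count m, and the tolerances are illustrative choices, not details from the talk.

    import numpy as np
    from scipy.optimize import minimize

    def grad_sampling_step(f, grad, x, eps, m=20, rng=None):
        """One simplified gradient sampling iteration (illustrative sketch)."""
        rng = np.random.default_rng() if rng is None else rng
        n = x.size
        # Sample m points uniformly from the eps-ball around x
        U = rng.normal(size=(m, n))
        U /= np.linalg.norm(U, axis=1, keepdims=True)
        radii = eps * rng.random(m) ** (1.0 / n)
        pts = x + radii[:, None] * U
        # Gradients at x and at the sampled points
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        k = G.shape[0]
        # Minimum-norm element of the convex hull of the gradients:
        # minimize ||G^T w||^2 over the simplex, a small QP solved here via SLSQP
        obj = lambda w: w @ (G @ G.T) @ w
        cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
        res = minimize(obj, np.ones(k) / k, bounds=[(0, 1)] * k,
                       constraints=cons, method='SLSQP')
        g = G.T @ res.x
        if np.linalg.norm(g) < 1e-8:
            # Nearly stationary at this sampling scale: shrink eps, keep x
            return x, 0.5 * eps
        d = -g / np.linalg.norm(g)
        # Backtracking line search on f along d
        t = 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * np.linalg.norm(g) and t > 1e-12:
            t *= 0.5
        return x + t * d, eps

    # Example on a simple nonsmooth function, f(x) = |x0| + 2|x1|,
    # whose gradient exists almost everywhere (all gradient sampling needs)
    f = lambda x: abs(x[0]) + 2 * abs(x[1])
    grad = lambda x: np.array([np.sign(x[0]), 2 * np.sign(x[1])])
    x, eps = np.array([3.0, -2.0]), 1.0
    for _ in range(100):
        x, eps = grad_sampling_step(f, grad, x, eps)
    print(x)  # should approach the minimizer at the origin

The per-iteration QP over the convex hull of sampled gradients is what makes the method computationally intensive, as the abstract notes, but it is also what yields a usable descent direction where the ordinary gradient is unreliable.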
Hosts: Philip Gill, Bill Helton, and Jiawang Nie
February 4, 2010
3:00 PM
AP&M 6402
****************************