University of California San Diego
****************************
Math 278B - Mathematics of Information, Data, and Signals Seminar
Roberto Imbuzeiro Oliveira
IMPA, Rio de Janeiro
Sample Average Approximation with Heavier Tails
Abstract:
Consider an "ideal" optimization problem where the constraints and objective function are defined in terms of expectations over some distribution P. The sample average approximation (SAA) -- a fundamental idea in stochastic optimization -- consists of replacing the expectations by averages over a sample from P. A key question is how much the solutions of the SAA differ from those of the original problem. Results by Shapiro from many years ago consider what happens asymptotically as the sample size diverges, especially when the solution of the ideal problem lies on the boundary of the feasible set. In joint work with Philip Thompson (Purdue), we consider what happens with finite samples. As we will see, our results improve upon the nonasymptotic state of the art in several ways: we allow for heavier tails and unbounded feasible sets, and we obtain bounds that (in favorable cases) depend only on the geometry of the feasible set in a small neighborhood of the optimal solution. Our results combine "localization" and "fixed-point" type arguments inspired by the work of Mendelson with chaining-type inequalities. One of our contributions is showing what can be said when the SAA constraints themselves are random.
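(For readers new to the SAA, a minimal sketch of the setup; the notation $f$, $\xi$, $X$, and $n$ is chosen here for illustration and is not necessarily the speaker's: the ideal problem is $\min_{x \in X} \mathbb{E}_{\xi \sim P}[f(x,\xi)]$, and the SAA replaces it with $\min_{x \in X} \frac{1}{n}\sum_{i=1}^{n} f(x,\xi_i)$, where $\xi_1, \ldots, \xi_n$ are i.i.d. samples from $P$; constraints given by expectations are approximated by sample averages in the same way.)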
Host: Rayan Saab
March 18, 2021
11:30 AM
Zoom link: https://msu.zoom.us/j/96421373881 (passcode: first prime number > 100)
****************************