University of California San Diego
****************************
Math 278B - Mathematics of Information, Data, and Signals Seminar

Tino Ullrich
TU Chemnitz

A New Subsampling Technique for Random Points and Optimal Least Squares Approximation of High-Dimensional Functions

Abstract:
We provide a new general upper bound for the minimal L2-worst-case recovery error in the framework of reproducing kernel Hilbert spaces (RKHS), where only n function samples are allowed. This quantity can be bounded in terms of the singular numbers of the compact embedding into the space of square-integrable functions. It turns out that in many relevant situations this quantity is asymptotically worse only by a factor of sqrt(log(n)) compared to the singular numbers. The algorithm which realizes this behavior is a weighted least squares algorithm based on a specific set of sampling nodes, which works for the whole class of functions simultaneously. These points are constructed out of a random draw with respect to a distribution tailored to the spectral properties of the reproducing kernel (importance sampling), in combination with a subsampling procedure coming from the celebrated proof of Weaver's conjecture, which was shown to be equivalent to the Kadison-Singer problem. For the above multivariate setting, it is still a fundamental open problem whether sampling algorithms are as powerful as algorithms allowing general linear information such as Fourier or wavelet coefficients. However, the gap is now rather small. As a consequence, we may study well-known scenarios where it was widely believed that sparse grid sampling recovery methods perform optimally. It turns out that this is not the case for dimensions d greater than 2.

This is joint work with N. Nagel and M. Schaefer from TU Chemnitz.
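(Editorial note, not part of the abstract.) Schematically, the claim that the recovery error is "worse only by a factor of sqrt(log(n))" can be written, with g_n the minimal L2-worst-case error from n samples and sigma_k the singular numbers of the embedding, as

    g_n \lesssim \sqrt{\log n}\, \sigma_{cn}    (in many relevant situations, for some constant c > 0).

The following is a minimal, self-contained Python sketch of the weighted least squares step of such a scheme, under simplifying assumptions of our own (a 1-D periodic function, the trigonometric basis, and uniform sampling, for which the kernel-tailored density is constant); it is an illustration, not the speakers' implementation, and the Weaver-type subsampling step that thins the random draw is omitted.

import numpy as np

def basis(x, m):
    # First m functions of the real trigonometric basis, orthonormal in L2[0,1].
    cols = [np.ones_like(x)]
    for k in range(1, m // 2 + 1):
        cols.append(np.sqrt(2.0) * np.cos(2.0 * np.pi * k * x))
        cols.append(np.sqrt(2.0) * np.sin(2.0 * np.pi * k * x))
    return np.column_stack(cols)[:, :m]

def weighted_least_squares(f, m, n, rng):
    # Draw n random nodes from the sampling density rho (uniform here, since
    # the Fourier basis has a constant Christoffel-type function), weight rows
    # by 1/sqrt(rho), and solve the overdetermined system in the least squares sense.
    x = rng.random(n)
    rho = np.ones_like(x)                 # density values at the nodes
    w = 1.0 / np.sqrt(rho)                # importance-sampling weights
    A = w[:, None] * basis(x, m)          # weighted design matrix
    b = w * f(x)                          # weighted samples
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef

rng = np.random.default_rng(0)
f = lambda x: np.exp(np.sin(2.0 * np.pi * x))   # smooth periodic test function
coef = weighted_least_squares(f, m=15, n=200, rng=rng)
xs = np.linspace(0.0, 1.0, 1000)
print("max pointwise error:", np.max(np.abs(basis(xs, 15) @ coef - f(xs))))

Oversampling (n well above m) keeps the weighted design matrix well conditioned with high probability; the talk's subsampling technique is what brings the node count back down to the order of m while preserving this stability.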
Host: Rayan Saab
February 4, 2021
10:30 AM
Zoom link: https://msu.zoom.us/j/96421373881 (passcode: the first prime number greater than 100)
****************************