University of California San Diego
****************************
Math Colloquium
Tianhao Wang
Yale University
Algorithm Dynamics in Modern Statistical Learning: Universality and Implicit Regularization
Abstract:
Modern statistical learning is characterized by the high-dimensional nature of data and the over-parameterization of models. In this regime, analyzing the dynamics of the algorithms used is challenging but crucial for understanding the performance of learned models. This talk will present recent results on the dynamics of two pivotal algorithms: approximate message passing (AMP) and stochastic gradient descent (SGD). Specifically, AMP refers to a class of iterative algorithms for solving large-scale statistical problems, whose dynamics asymptotically admit a simple but exact description known as state evolution. We will demonstrate the universality of AMP's state evolution over large classes of random matrices and provide illustrative applications of our universality results. Second, for SGD, a workhorse for training deep neural networks, we will introduce a novel mathematical framework for analyzing its implicit regularization, which is essential for SGD's ability to find solutions with strong generalization performance, particularly under over-parameterization. Our framework offers a general method to characterize the implicit regularization induced by gradient noise. Finally, in the context of underdetermined linear regression, we will show that both AMP and SGD can provably achieve sparse recovery, yet they do so from markedly different perspectives.
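To make the sparse-recovery setting mentioned at the end concrete, here is a minimal sketch of a standard soft-thresholding AMP iteration for underdetermined linear regression, including the Onsager correction term that underlies the state-evolution description. This is a generic textbook-style toy example, not the speaker's construction; all dimensions, the noise level, and the threshold choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy underdetermined regression y = A x0 + noise with a sparse x0
# (all dimensions and the noise level are illustrative assumptions).
n, p, k = 250, 500, 25                      # measurements, dimension, sparsity
A = rng.normal(0.0, 1.0 / np.sqrt(n), (n, p))
x0 = np.zeros(p)
x0[rng.choice(p, k, replace=False)] = rng.normal(size=k)
y = A @ x0 + 0.01 * rng.normal(size=n)

def soft(v, t):
    """Soft-thresholding denoiser eta(v; t)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(p)
z = y.copy()
for _ in range(30):
    tau = np.sqrt(np.mean(z ** 2))          # empirical effective noise level
    x_new = soft(x + A.T @ z, tau)          # denoise the pseudo-data
    # Onsager correction: (p/n) times the average derivative of the
    # denoiser, applied to the previous residual. This memory term is
    # what makes the simple state-evolution description exact.
    onsager = (p / n) * np.mean(np.abs(x_new) > 0) * z
    z = y - A @ x_new + onsager
    x = x_new

rel_err = np.linalg.norm(x - x0) / np.linalg.norm(x0)
print(rel_err)
```

Dropping the `onsager` term turns this into plain iterative soft thresholding, whose per-iteration errors are no longer asymptotically Gaussian; the correction is precisely what state evolution (and its universality over matrix ensembles) tracks.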
Host: Ery Arias-Castro
December 7, 2023
3:00 PM
APM 6402
****************************