Probability for Machine Learning Seminar

Title:        CLTs in deep neural networks: quantitative bounds through coupling, Stein's method and entropy.

Abstract: I will discuss several recent results allowing one to assess the discrepancy between a randomly initialized neural network and its Gaussian counterpart in the infinite-width limit. In a functional framework, our approach relies on coupling techniques for Gaussian processes, revolving around some novel variations of the Powers-Størmer inequalities. In a finite-dimensional setting, our results yield optimal bounds in the 1-Wasserstein, 2-Wasserstein and total variation distances, either through Stein's method (dimension 1) or via information-theoretic tools (the Pinsker and Talagrand inequalities). If time permits, I will also discuss some consequences of our results in a Bayesian setting. Based on two joint works: (i) with S. Favaro, B. Hanin, D. Marinucci and I. Nourdin (PTRF, 2025), and (ii) with L. Celli (in preparation).
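
For intuition only (this is not taken from the works above), the short Python sketch below simulates the kind of statement the abstract refers to: the output of a wide, randomly initialized one-hidden-layer ReLU network at a fixed input is compared to its Gaussian limit via an empirical 1-Wasserstein distance. The network architecture, the 1/sqrt(width) scaling, the widths, the sample sizes and the use of SciPy's wasserstein_distance are illustrative assumptions, not the setting or the bounds of the papers.

# Minimal illustration (assumed setup): a one-hidden-layer ReLU network with i.i.d.
# standard Gaussian weights and 1/sqrt(width) output scaling, evaluated at a fixed
# input over many independent initializations; the empirical law of the output is
# compared to the limiting Gaussian predicted by the infinite-width CLT.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
d = 10                      # input dimension
x = rng.standard_normal(d)  # a fixed input point
n_samples = 20_000          # number of independent random initializations

def network_output(width):
    # f(x) = (1/sqrt(width)) * sum_j v_j * ReLU(w_j . x / sqrt(d)),
    # with all weights i.i.d. standard Gaussian.
    W = rng.standard_normal((n_samples, width, d))
    v = rng.standard_normal((n_samples, width))
    hidden = np.maximum(W @ x / np.sqrt(d), 0.0)    # ReLU activations
    return (v * hidden).sum(axis=1) / np.sqrt(width)

# Limiting Gaussian: mean 0 and variance E[ReLU(Z)^2] with Z ~ N(0, ||x||^2 / d),
# which equals ||x||^2 / (2 d) by symmetry of the Gaussian.
sigma2 = np.dot(x, x) / (2 * d)
gauss = rng.normal(0.0, np.sqrt(sigma2), size=n_samples)

for width in (4, 16, 64, 256, 1024):
    w1 = wasserstein_distance(network_output(width), gauss)
    print(f"width {width:5d}: empirical W1 distance to Gaussian limit ~ {w1:.4f}")

The printed distances shrink as the width grows, which is the qualitative phenomenon whose rate the quantitative bounds of the talk make precise.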

Short bio: Giovanni Peccati is Full Professor of Mathematics at the University of Luxembourg, where he is Head of the Department of Mathematics and leads the group "Revealing order in randomness". He is the President of the Luxembourg Mathematical Society. He was Assistant Professor at Sorbonne Université from 2003 to 2008 and Full Professor at Université Paris-Nanterre from 2008 to 2010, before joining the University of Luxembourg in 2010. He received the FNR Award for Outstanding Scientific Publication in 2015 and was named an IMS Fellow in 2018.