Gaussian random field approximation for wide neural networks

Date
Monday, November 13, 2023, 4:00 PM
Location
Sequoia 200
Speaker
Nathan Ross, University of Melbourne

It has been observed that wide neural networks (NNs) with randomly initialized weights are well-approximated by Gaussian fields indexed by the input space of the NN and taking values in its output space. A flurry of recent work has made this observation precise, since it sheds light on regimes where neural networks can perform effectively. In this talk, I will discuss recent work in which we derive bounds on the Gaussian random field approximation of wide random neural networks of any depth, assuming Lipschitz activation functions. The bounds are in a Wasserstein transport distance on function space equipped with a strong (supremum) metric, and are explicit in the widths of the layers and in natural parameters such as the moments of the weights. The result follows from a general approximation theorem based on Stein's method, combined with a novel Gaussian smoothing technique for random fields, which I will also describe.
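As a rough numerical illustration of the phenomenon (a sketch under standard assumptions, not taken from the work being presented): for a one-hidden-layer ReLU network with i.i.d. standard Gaussian weights and the usual 1/sqrt(width) output scaling, the outputs at a fixed finite set of inputs should be approximately a centered Gaussian vector whose covariance is the order-1 arc-cosine (NNGP) kernel. The scaling convention and kernel formula below are those standard assumptions.

```python
# Minimal sketch (assumptions, not from the talk): a width-m, one-hidden-layer
# ReLU network f(x) = sqrt(2/m) * v . relu(W x), with i.i.d. N(0,1) entries
# in v and W, evaluated at a fixed set of inputs. As m grows, the joint law
# of the outputs should approach a centered Gaussian vector with the order-1
# arc-cosine (NNGP) covariance.
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(X, width):
    """One draw of the network's outputs at inputs X of shape (n, d)."""
    W = rng.standard_normal((width, X.shape[1]))   # input-to-hidden weights
    v = rng.standard_normal(width)                 # hidden-to-output weights
    h = np.maximum(W @ X.T, 0.0)                   # (width, n) ReLU activations
    return np.sqrt(2.0 / width) * (v @ h)          # (n,) network outputs

def arccos_kernel(X):
    """Limiting covariance k(s, t) = |s||t| (sin a + (pi - a) cos a) / pi,
    where a is the angle between inputs s and t."""
    norms = np.linalg.norm(X, axis=1)
    cos = np.clip((X @ X.T) / np.outer(norms, norms), -1.0, 1.0)
    a = np.arccos(cos)
    return np.outer(norms, norms) * (np.sin(a) + (np.pi - a) * np.cos(a)) / np.pi

# Inputs on the unit circle, so covariances depend only on pairwise angles.
angles = np.linspace(0.0, np.pi, 4)
X = np.stack([np.cos(angles), np.sin(angles)], axis=1)

width, n_draws = 2048, 5000
samples = np.array([random_relu_net(X, width) for _ in range(n_draws)])

print("empirical covariance:\n", np.cov(samples.T).round(3))
print("limiting NNGP covariance:\n", arccos_kernel(X).round(3))
```

Rerunning the comparison at increasing widths gives a crude empirical counterpart to the explicit width-dependence in the bounds; quantifying the approximation in a supremum metric over a continuum of inputs, as in the talk, requires the function-space machinery described there.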

This talk is based on joint works with Krishnakumar Balasubramanian, Larry Goldstein, and Adil Salim, and with A.D. Barbour and Guangqu Zheng.