In many applications of statistical estimation via sampling, one may wish to sample from a high-dimensional target distribution that evolves adaptively with the samples already drawn. I will present an example of such dynamics in a Bayesian linear model: a Langevin diffusion that samples from a posterior distribution which adapts to implement empirical Bayes learning of the prior. In this talk, I hope to discuss a positive result on nonparametric consistency for this empirical Bayes learning task, a motivation for these dynamics from the perspective of Wasserstein gradient flows, and a precise characterization of the dynamics in a mean-field setting with i.i.d. regression design.
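As a toy illustration of this kind of adaptive dynamics (a minimal sketch, not the construction from the talk), the code below runs a discretized Langevin diffusion on the posterior of a Bayesian linear model with a Gaussian prior, while the prior variance is updated from the samples seen so far in an empirical Bayes fashion. All specifics here are illustrative assumptions: the step size, the running-average update rate, and the moment-matching rule for the prior variance `tau2`.

```python
import numpy as np

# Sketch: adaptive Langevin dynamics for a Bayesian linear model
#   y = X @ theta + noise,  theta_j ~ iid N(0, tau2),
# where tau2 is learned from the samples (empirical Bayes).
# Problem sizes and tuning constants are illustrative assumptions.

rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d)) / np.sqrt(n)   # i.i.d. regression design
theta_star = rng.standard_normal(d)
sigma2 = 0.25                                  # known noise variance
y = X @ theta_star + np.sqrt(sigma2) * rng.standard_normal(n)

theta = np.zeros(d)   # Langevin iterate (current sample)
tau2 = 1.0            # prior variance hyperparameter, adapted on the fly
step = 1e-3           # Langevin step size (assumed small enough for stability)

for t in range(20_000):
    # Gradient of the log posterior under the *current* prior N(0, tau2 I):
    #   grad log p(theta | y; tau2) = X^T (y - X theta) / sigma2 - theta / tau2
    grad = X.T @ (y - X @ theta) / sigma2 - theta / tau2
    # Unadjusted Langevin update targeting the current posterior.
    theta = theta + step * grad + np.sqrt(2 * step) * rng.standard_normal(d)
    # Empirical Bayes adaptation: slowly match the prior variance to the
    # second moment of the samples seen so far (running average).
    tau2 = 0.999 * tau2 + 0.001 * np.mean(theta ** 2)

print(f"learned prior variance tau2 ~ {tau2:.3f}")
```

Because the target posterior changes as `tau2` is updated, the iterate is sampling from a moving target; the slow adaptation rate is what lets the sampler approximately equilibrate between hyperparameter updates.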
This is based on joint work with Yandi Shen, Leying Guan, Justin Ko, Bruno Loureiro, Yue M. Lu, and Yihong Wu.