Equivalence of sketching and ridge regularization

Tuesday, August 15, 2023, 4:30pm
Sloan 380C
Daniel LeJeune, Stanford Statistics

One of the biggest successes in large-scale computation has been the use of randomized sketching to project data into low dimensions and enable efficient computation. Classical theory has established that as long as the sketch size is larger than the inherent problem dimensionality, the sketched solution deviates negligibly from the solution to the original problem. What happens when the sketch size is not sufficiently large? We show that in least squares problems, the sketching-based solution is equivalent in a first-order sense (biased) to a ridge-regularized solution to the original problem. We provide a general asymptotic framework for analyzing sketched matrix inversion that places minimal assumptions on the data and applies to a broad class of sketching operators.
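The implicit-regularization effect described in the abstract can be illustrated numerically. The sketch below (a hypothetical toy setup, not taken from the talk) sketches the feature dimension of a least squares problem with a Gaussian sketching matrix, averages many sketch-and-solve estimates, and then searches a grid of ridge penalties for the solution of the original problem that best matches the averaged sketched estimate. All sizes and the grid of penalties are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes (not from the talk): n samples, d features,
# with the feature dimension sketched down to p < d.
n, d, p = 300, 50, 25
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + rng.standard_normal(n)

# Ordinary least squares solution of the original (unsketched) problem.
x_ols = np.linalg.lstsq(A, b, rcond=None)[0]

# Average many sketch-and-solve estimates: solve in the sketched
# coordinates, then map back to the original coordinates with S^T.
trials = 500
x_sk = np.zeros(d)
for _ in range(trials):
    S = rng.standard_normal((p, d)) / np.sqrt(p)  # Gaussian sketch
    w = np.linalg.lstsq(A @ S.T, b, rcond=None)[0]
    x_sk += S.T @ w
x_sk /= trials

# Ridge solution of the original problem for a given penalty lam.
def ridge(lam):
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)

# Find the penalty whose ridge solution best matches the averaged
# sketched estimate; compare against the unregularized solution.
lams = np.logspace(-2, 4, 300)
best = min(lams, key=lambda lam: np.linalg.norm(ridge(lam) - x_sk))
err_ridge = np.linalg.norm(ridge(best) - x_sk)
err_ols = np.linalg.norm(x_ols - x_sk)
print(f"best lambda ~ {best:.2f}; ridge gap {err_ridge:.3f} vs OLS gap {err_ols:.3f}")
```

In this toy run, some positive ridge penalty approximates the averaged sketched solution much more closely than the unregularized solution does, consistent with the first-order equivalence the talk describes; the exact implicit penalty depends on the sketch size relative to the problem dimensions.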