Estimating model selection FDR and controlling local FDR

Tuesday, April 2, 2024, 4:30 pm
Sloan 380Y
Will Fithian, UC Berkeley

I will introduce novel methods for two problems closely related to false discovery rate (FDR) control. In the first part of the talk, I will present an FDR estimator for generic variable-selection procedures, including Lasso regression, forward-stepwise regression, and the graphical lasso. The estimator's bias is provably non-negative, and it can be used as a companion to cross-validation, assessing the FDR of variable selection alongside model fit at each point on the procedure's regularization path. This is joint work with Yixiang Luo and Lihua Lei.
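To illustrate the quantity being estimated (not the talk's estimator itself), the sketch below simulates a sparse regression with a known true support, runs forward-stepwise selection, one of the procedures named above, and records the oracle false discovery proportion of the selected set at each step along the path. All settings here (sample size, signal strength, number of steps) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 200, 50, 5               # samples, features, true signals (illustrative)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 2.0                      # first k features carry signal
y = X @ beta + rng.standard_normal(n)

# Forward-stepwise selection: greedily add the feature most correlated
# with the current residual, then refit least squares on the selected set.
selected, residual, fdp_path = [], y.copy(), []
for step in range(15):
    cors = np.abs(X.T @ residual)
    cors[selected] = -np.inf        # exclude already-selected features
    selected.append(int(np.argmax(cors)))
    Xs = X[:, selected]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    residual = y - Xs @ coef
    # Oracle false discovery proportion: fraction of selections outside the
    # true support (computable here only because the data are simulated).
    fdp_path.append(np.mean([j >= k for j in selected]))

print([round(f, 2) for f in fdp_path])
```

In practice the true support is unknown, which is exactly why an estimator of this FDP curve, reported alongside cross-validated fit, is useful for choosing a point on the path.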

In the second part, I will introduce a simple nonparametric method for local FDR (lfdr) control in the Bayesian two-groups model. Under a monotonicity assumption, our method provably controls the expectation of the maximum lfdr across all rejections; equivalently, it controls the probability that the rejection at the lfdr threshold is a false discovery. The method asymptotically implements the oracle Bayes procedure for weighted classification based on p-values, optimally trading off false positives against false negatives. This is joint work with Jake Soloff and Daniel Xiang.
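The oracle version of this setup can be sketched in a few lines. Below, p-values come from a two-groups mixture with a hypothetical null fraction and a Beta(a, 1) alternative (a < 1 makes the alternative density decreasing, so the lfdr is monotone in the p-value); the oracle lfdr is thresholded and, by monotonicity, the rejection region is an interval [0, t] whose maximum lfdr is attained at the threshold. This illustrates the target quantity, not the talk's nonparametric method, which does not assume the mixture is known.

```python
import numpy as np

rng = np.random.default_rng(1)
pi0, a, n = 0.8, 0.25, 10_000      # null fraction and Beta(a, 1) alternative (illustrative)

is_null = rng.random(n) < pi0
p = np.where(is_null, rng.random(n), rng.beta(a, 1.0, n))

def lfdr(pv):
    # Oracle local FDR in the two-groups model:
    #   lfdr(p) = pi0 * f0(p) / f(p), with f0 = 1 (uniform nulls)
    #   and f1(p) = a * p^(a-1), the Beta(a, 1) density.
    f1 = a * pv ** (a - 1.0)
    return pi0 / (pi0 + (1.0 - pi0) * f1)

alpha = 0.2
reject = lfdr(p) <= alpha
# Monotone lfdr => the rejection region is [0, t], and the maximum lfdr
# over all rejections is attained at the threshold t.
t = p[reject].max()
print(reject.sum(), round(float(lfdr(p[reject]).max()), 3))
```

Controlling the expected maximum lfdr over the rejection set is then the population analogue of the guarantee printed here at the threshold.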

Zoom Recording [SUNet/SSO authentication required]