Conditional calibration for false discovery rate control under dependence

Date: Tuesday, November 10, 2020, 4:30pm
Location: SBJC at Stanford
Speaker: Will Fithian, UC Berkeley

Abstract: We introduce a new class of methods for finite-sample false discovery rate (FDR) control in multiple testing problems with dependent test statistics, where the dependence is fully or partially known. Our approach separately calibrates a data-dependent p-value rejection threshold for each hypothesis, relaxing or tightening the threshold as appropriate to target exact FDR control. In addition to our general framework, we propose a concrete algorithm, the dependence-adjusted Benjamini–Hochberg (dBH) procedure, which adaptively thresholds the q-value for each hypothesis. Under positive regression dependence the dBH procedure uniformly dominates the standard BH procedure, and in general it uniformly dominates the Benjamini–Yekutieli (BY) procedure (also known as BH with log correction). Simulations and real-data examples illustrate power gains over competing approaches to FDR control under dependence.

This is joint work with Lihua Lei.
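For readers unfamiliar with the baselines named in the abstract, below is a minimal sketch of the standard BH step-up procedure and its BY (log-corrected) variant. This is background only, not the dBH calibration described in the talk; the function name `bh_rejections` and its arguments are illustrative.

```python
import numpy as np

def bh_rejections(pvals, alpha=0.05, log_correction=False):
    """Benjamini-Hochberg step-up procedure on a vector of p-values.

    With log_correction=True this becomes the Benjamini-Yekutieli variant,
    which divides alpha by the harmonic sum H_m = sum_{i=1}^m 1/i.
    Returns a boolean mask of rejected hypotheses.
    """
    pvals = np.asarray(pvals, dtype=float)
    m = pvals.size
    level = alpha / np.sum(1.0 / np.arange(1, m + 1)) if log_correction else alpha

    order = np.argsort(pvals)                      # indices of p-values, ascending
    thresholds = level * np.arange(1, m + 1) / m   # step-up thresholds k * level / m
    below = pvals[order] <= thresholds
    if not below.any():
        return np.zeros(m, dtype=bool)             # nothing rejected
    k = np.max(np.nonzero(below)[0])               # largest rank passing its threshold
    rejected = np.zeros(m, dtype=bool)
    rejected[order[: k + 1]] = True                # reject all hypotheses up to that rank
    return rejected

# Example: reject at FDR level 0.1, with and without the BY log correction.
p = np.array([0.001, 0.008, 0.039, 0.041, 0.27, 0.6])
print(bh_rejections(p, alpha=0.1))
print(bh_rejections(p, alpha=0.1, log_correction=True))
```

The dBH procedure discussed in the talk replaces the single level used here with a separately calibrated, data-dependent threshold for each hypothesis.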

Zoom Recording [SUNet/SSO authentication required]