How robust are meta-analyses to publication bias? Sensitivity analysis methods and empirical findings

Date
Tuesday, November 8, 2022, 4:30 pm
Location
Sloan 380Y
Speaker
Maya Mathur, Stanford Quantitative Sciences Unit

Publication bias can distort meta-analytic results, sometimes justifying considerable skepticism toward meta-analyses. This talk will discuss recently developed statistical sensitivity analyses for publication bias, which enable statements such as: "For publication bias to shift the observed point estimate to the null, 'significant' results would need to be at least 10-fold more likely to be published than negative or 'nonsignificant' results" or "No amount of publication bias could explain away the average effect." The methods are based on inverse-probability-weighted estimators and use robust estimation to accommodate non-normal population effects, small meta-analyses, and clustering. Additionally, a meta-analytic point estimate corrected for "worst-case" publication bias can be obtained simply by conducting a standard meta-analysis of only the negative and nonsignificant studies; this method sometimes indicates that no amount of such publication bias could explain away the results. I will describe the results of applying the methods to a systematic sample of 58 meta-analyses across multiple scientific disciplines. All methods are implemented in the R package PublicationBias.
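
As a rough illustration of the worst-case correction described in the abstract, the sketch below (in R, using the general-purpose metafor package rather than the speaker's own code) flags "affirmative" studies, meaning positive and statistically significant, in simulated toy data, then meta-analyzes only the remaining negative and nonsignificant studies. The data, column names, and significance rule are illustrative assumptions, not material from the talk; the commented PublicationBias call at the end follows the argument names of version 1.x of that package, which may differ in later releases.

    # Worst-case publication-bias correction (illustrative sketch).
    # Assumes a data frame `dat` with columns yi (point estimates) and
    # vi (sampling variances); the toy data below are made up.
    library(metafor)

    set.seed(1)
    k   <- 20
    vi  <- runif(k, 0.01, 0.2)
    yi  <- rnorm(k, mean = 0.3, sd = sqrt(vi))
    dat <- data.frame(yi = yi, vi = vi)

    # Flag "affirmative" studies: positive estimate with two-sided p < 0.05
    # (assuming selection favors positive, significant results).
    dat$pval        <- 2 * (1 - pnorm(abs(dat$yi) / sqrt(dat$vi)))
    dat$affirmative <- dat$pval < 0.05 & dat$yi > 0

    # Worst-case corrected estimate: a standard random-effects meta-analysis
    # restricted to the negative and nonsignificant ("nonaffirmative") studies.
    worst_case <- rma(yi, vi, data = dat, subset = !affirmative, method = "REML")
    summary(worst_case)

    # Sensitivity analysis under an assumed selection ratio (eta = 10 means
    # affirmative results are 10-fold more likely to be published); argument
    # names follow PublicationBias 1.x and are an assumption here:
    # library(PublicationBias)
    # corrected_meta(yi = dat$yi, vi = dat$vi, eta = 10,
    #                model = "robust", favor.positive = TRUE)

If the worst-case estimate itself remains clearly above the null, the stronger statement quoted in the abstract follows: no amount of this form of publication bias could explain away the average effect.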