Nonparametric sharp bounds and optimal design in police oversight
Analyses of police misconduct rely heavily on self-reported law-enforcement data that suffer from unobserved confounding, mismeasurement, and selection issues. I first show how these challenges have distorted prior policing research, including a high-profile retraction. Next, I demonstrate how nonparametric sharp bounds can help researchers address these challenges without relying on implausible assumptions, often invoked inadvertently or implicitly in past work. I introduce an algorithm for obtaining such bounds for any structured discrete system and essentially any estimand, set of assumptions, and arbitrarily incomplete dataset(s), flexibly accommodating the wide variety of oversight tasks and data environments across America's 18,000 law enforcement agencies. The algorithm outputs a fail-to-reject region capturing both fundamental lack of identification and sampling uncertainty. Finally, I propose an approach for targeting future data collection under budget constraints, such as expert review of body-worn camera footage, to optimally narrow the expected fail-to-reject region. Applications of the proposed techniques are illustrated through collaborations with national civil-rights organizations.
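To make the core idea of nonparametric sharp bounds concrete, the following is a minimal illustrative sketch, not the paper's algorithm: classic worst-case (Manski-style) bounds on an unrecorded binary outcome, computed as a linear program over the joint distribution of the outcome and a recording indicator. The numbers (`p_recorded`, `p_y1_given_recorded`) are hypothetical, and the general algorithm described above handles far richer systems, estimands, and assumption sets.

```python
# Illustrative sketch (assumed setup): sharp bounds on P(Y = 1) when the
# binary outcome Y is unrecorded for some incidents. We optimize over the
# joint distribution of (Y, R), where R = 1 means the outcome was recorded.
import numpy as np
from scipy.optimize import linprog

# Hypothetical observed quantities:
p_recorded = 0.8           # P(R = 1)
p_y1_given_recorded = 0.3  # P(Y = 1 | R = 1)

# Decision variables:
# q = [P(Y=0,R=0), P(Y=0,R=1), P(Y=1,R=0), P(Y=1,R=1)]
# Equality constraints pin down the observable cells; the unrecorded
# cells remain free, which is exactly what creates the bounds.
A_eq = np.array([
    [0, 1, 0, 1],  # P(R = 1) matches the data
    [0, 0, 0, 1],  # P(Y = 1, R = 1) matches the data
    [1, 1, 1, 1],  # probabilities sum to 1
])
b_eq = np.array([p_recorded, p_y1_given_recorded * p_recorded, 1.0])

c = np.array([0, 0, 1, 1])  # objective: the estimand P(Y = 1)
lo = linprog(c,  A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
hi = linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
print(lo.fun, -hi.fun)  # sharp bounds: [0.24, 0.44]
```

The bounds are sharp because every value in [0.24, 0.44] is attained by some joint distribution consistent with the observed cells; narrowing them requires either more data (e.g., reviewing footage for unrecorded incidents) or additional assumptions.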