M-statistics: Optimal exact statistical inference for a small sample

Tuesday, March 5, 2024, 4:30pm
Sloan 380Y
Eugene Demidenko, Dartmouth College

The talk introduces a new statistical inference in which neither the mean nor the variance plays a role. The current practice of statistical inference relies on asymptotic methods, such as maximum likelihood (ML); exact small-sample inference is available only in a few cases, primarily linear models. Our theory requires a statistic with a known cumulative distribution function that depends on an unknown parameter. Two parallel competing tracks of inference are offered under the umbrella of M-statistics: maximum concentration (MC) and mode (MO) statistics, which is why M = MC + MO. Given an optimal exact double-sided confidence interval (CI) and its dual test, the point estimator is derived as the limit point of the CI as the confidence level approaches zero. When the statistic is sufficient, the MO-estimator, obtained as the limit of the unbiased CI, coincides with the ML estimator. The theory extends to multiparameter statistical inference.
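The idea of building an exact interval from a statistic with a known parameter-dependent CDF can be sketched with the classical equal-tailed (Clopper-Pearson) interval for the binomial probability. This is a plain-Python illustration of inverting a CDF, not the optimized MC/MO intervals of the talk; the function names are mine.

```python
import math

def binom_cdf(x, n, p):
    """P(X <= x) for X ~ Binomial(n, p); decreasing in p for fixed x < n."""
    if x < 0:
        return 0.0
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(min(x, n) + 1))

def _solve(f, target, iters=200):
    """Bisection for f(p) = target on [0, 1], with f decreasing in p."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) > target:
            lo = mid   # root lies to the right of mid
        else:
            hi = mid
    return (lo + hi) / 2

def clopper_pearson(x, n, conf):
    """Equal-tailed exact CI for the binomial probability,
    obtained by inverting the binomial CDF at both tails."""
    alpha = 1 - conf
    lower = 0.0 if x == 0 else _solve(lambda p: binom_cdf(x - 1, n, p), 1 - alpha / 2)
    upper = 1.0 if x == n else _solve(lambda p: binom_cdf(x, n, p), alpha / 2)
    return lower, upper
```

For x = 7 successes in n = 10 trials the 95% interval is roughly (0.35, 0.93); lowering the confidence level shrinks the interval around the point estimate, mirroring the limit construction described above.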

Novel optimal short (MC statistics) and unbiased (MO statistics) confidence intervals, dual tests with their respective power functions, and point estimators are derived for major statistical parameters: the standard deviation, coefficient of variation, effect size, binomial probability, Poisson rate, and correlation coefficient, among others. The R code is readily available on GitHub.

The talk is illustrated with the estimation of the binomial probability and the long-forgotten Laplace law of succession estimator, which receives a new look in M-statistics theory.
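For reference, the classical Laplace law of succession estimates the binomial probability as (x + 1)/(n + 2), the posterior mean under a uniform prior. A one-line sketch of the classical formula (the M-statistics reinterpretation presented in the talk is not reproduced here):

```python
def laplace_succession(x, n):
    """Laplace's rule of succession: (x + 1) / (n + 2),
    the posterior mean of the binomial probability under a uniform prior."""
    return (x + 1) / (n + 2)
```

The estimator shrinks the ML estimate x/n toward 1/2: for 7 successes in 10 trials it gives 8/12 ≈ 0.667 rather than 0.7, and it remains defined even when n = 0.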