Honors Thesis Presentation: "Assessing the effects of bias correction and undersmoothing in nonparametric inference"

Speaker: Xinyu Guo, Washington University in St. Louis

Abstract: In kernel density estimation and other nonparametric function estimation problems, a vast literature weighs the relative merits of undersmoothing, studentization, and explicit bias correction for achieving good repeated-sampling performance of the resulting inferences. Recently, it has been suggested that the conventional wisdom, which argues in favor of undersmoothing, may be incorrect, and that robust bias correction may be the best route to good coverage accuracy for the resulting interval estimates. The bootstrap also plays a role, both in selecting tuning parameters and in estimating the sampling distribution of the nonparametric functional of interest. In this thesis, I examine recent claims about the superiority of robust bias correction through extensive Monte Carlo experiments based on a large set of estimation problems. My findings suggest that which approach is best in terms of coverage accuracy is a more nuanced question, likely depending heavily on the particulars of the estimation problem, and that none of the methods under consideration is universally as good as or better than the others.
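To make the comparison concrete, below is a minimal Monte Carlo sketch, not the thesis code, of the kind of coverage experiment the abstract describes: pointwise 95% confidence intervals for a density at a point, one built with an undersmoothed bandwidth and one with an explicit bias correction. The Gaussian kernel, the bandwidth rates, the sample size, and the standard-normal data-generating process are all illustrative assumptions, and the bias-corrected interval here uses the conventional standard error rather than a fully robust (RBC) variance adjustment.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

R_K = 0.5 / np.sqrt(np.pi)  # roughness of the Gaussian kernel, integral of K^2
MU2 = 1.0                   # second moment of the Gaussian kernel

def kde(x0, data, h):
    """Kernel density estimate at x0 with bandwidth h (Gaussian kernel)."""
    return norm.pdf((x0 - data) / h).mean() / h

def kde_second_deriv(x0, data, b):
    """Estimate of f''(x0) via the second derivative of the Gaussian kernel."""
    u = (x0 - data) / b
    return ((u**2 - 1.0) * norm.pdf(u)).mean() / b**3

def ci_undersmoothed(x0, data, alpha=0.05):
    """Ignore bias but shrink the bandwidth below the MSE-optimal n^(-1/5) rate."""
    n = data.size
    h = n ** (-1 / 3)
    f_hat = kde(x0, data, h)
    se = np.sqrt(f_hat * R_K / (n * h))
    z = norm.ppf(1 - alpha / 2)
    return f_hat - z * se, f_hat + z * se

def ci_bias_corrected(x0, data, alpha=0.05):
    """Use the conventional bandwidth rate but subtract an estimated bias."""
    n = data.size
    h = n ** (-1 / 5)  # conventional MSE-optimal rate
    b = n ** (-1 / 7)  # pilot bandwidth for the curvature estimate
    f_hat = kde(x0, data, h)
    bias_hat = 0.5 * h**2 * MU2 * kde_second_deriv(x0, data, b)
    se = np.sqrt(f_hat * R_K / (n * h))  # conventional SE, not the RBC variance
    z = norm.ppf(1 - alpha / 2)
    return f_hat - bias_hat - z * se, f_hat - bias_hat + z * se

n, reps, x0 = 500, 2000, 0.0
truth = norm.pdf(x0)  # true standard-normal density at x0
covered_us = covered_bc = 0
for _ in range(reps):
    data = rng.standard_normal(n)
    lo, hi = ci_undersmoothed(x0, data)
    covered_us += lo <= truth <= hi
    lo, hi = ci_bias_corrected(x0, data)
    covered_bc += lo <= truth <= hi

print(f"undersmoothing coverage: {covered_us / reps:.3f}")
print(f"bias correction coverage: {covered_bc / reps:.3f}")
```

Varying the data-generating density, the evaluation point, and the bandwidth constants, and swapping in a bootstrap or RBC standard error, are exactly the design dimensions along which the thesis's conclusions about coverage accuracy can differ.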

Host: Todd Kuffner

(Access Zoom Presentation HERE)