February 21, 2011

  • Speaker: Eric Laber
  • Host Department: Statistics
  • Date: 02/21/2011
  • Time: 9:30 AM

  • Location: 438 West Hall

  • Description:

    Title: Adaptive Confidence Intervals for Non-Regular Functionals
    Chair: Professor Susan Murphy
    Committee Members: Associate Professor Kerby Shedden, Associate Professor Ji Zhu, Professor Satinder Singh Baveja (EECS)

    Abstract: Many quantities of interest in modern statistical analysis are non-smooth functionals of the underlying generative distribution, the observed data, or both. Examples include the test error of a learned classifier, parameters indexing an estimated optimal dynamic treatment regime, and the coefficients in a regression model after model selection has been performed. This lack of smoothness can lead to non-regular asymptotics under many "real-life" scenarios and thus invalidate standard statistical procedures like the bootstrap and series approximations. Statistical procedures that either ignore or assume away this non-regularity can perform quite poorly, especially in small samples. The aim of this dissertation is (i) to illustrate the impact that non-regularity can have on the performance of statistical inference procedures, especially in small samples, and (ii) to develop tools for conducting theoretically valid statistical inference for non-smooth functionals. In particular, we aim to develop confidence intervals that deliver asymptotically correct coverage under both fixed and local alternatives. To construct confidence intervals, we first derive smooth, data-dependent upper and lower bounds on the functional of interest and then approximate the distribution of the bounds using standard techniques. We then use estimated distributional features, such as quantiles, to make inference for the original non-smooth functional. We leverage the smoothness of the bounds to obtain consistent inference under both fixed and local alternatives. This consistency is instrumental in ensuring good performance in both large and small samples. An important feature of these bounds is that they are adaptive to the underlying non-smoothness of the functional. That is, they are asymptotically tight when the generative distribution happens to induce sufficient smoothness.
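The non-regularity the abstract describes can be seen in a toy example (illustrative only, not taken from the dissertation): the functional f(mu) = max(mu, 0) is non-smooth at mu = 0, and there the scaled plug-in estimator sqrt(n) * max(xbar, 0) converges to max(Z, 0) for standard normal Z rather than to a normal limit, placing probability mass 1/2 at exactly zero. A short Monte Carlo check:

```python
import numpy as np

# Toy illustration of a non-regular functional: f(mu) = max(mu, 0),
# which is non-differentiable at the "non-regular" point mu = 0.
rng = np.random.default_rng(0)
n, reps = 400, 2000

# Simulate 'reps' datasets of size n from the non-regular point mu = 0.
draws = rng.normal(loc=0.0, scale=1.0, size=(reps, n))

# Scaled plug-in estimates sqrt(n) * max(xbar, 0). The limit is
# max(Z, 0) with Z ~ N(0, 1), which is not normal: it has an atom
# of probability 1/2 at zero, so normal-theory intervals misbehave.
estimates = np.sqrt(n) * np.maximum(draws.mean(axis=1), 0.0)

mass_at_zero = np.mean(estimates == 0.0)
print(f"P(sqrt(n) * max(xbar, 0) == 0) ~ {mass_at_zero:.3f}")  # near 0.5
```

It is this kind of atom in the limiting distribution that breaks the standard bootstrap and motivates the smooth upper and lower bounds described in the abstract.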