Eric Laber (Mon, 21 Feb 2011), 9:30 AM, 438 West Hall
<p><b>Title: </b>Adaptive Confidence Intervals for Non-Regular Functionals<br>
<b>Chair: </b>Professor Susan Murphy<br>
<b>Committee Members: </b>Associate Professor Kerby Shedden, Associate Professor Ji Zhu, Professor Satinder Singh Baveja (EECS)</p>
<p><b>Abstract: </b>Many quantities of interest in modern statistical analysis are non-smooth functionals of the underlying generative distribution, the observed data, or both. Examples include the test error of a learned classifier, parameters indexing an estimated optimal dynamic treatment regime, and the coefficients in a regression model after model selection has been performed. This lack of smoothness can lead to non-regular asymptotics under many real-life scenarios and thus invalidate standard statistical procedures like the bootstrap and series approximations. Statistical procedures that either ignore or assume away this non-regularity can perform quite poorly, especially in small samples. The aim of this dissertation is (i) to illustrate the impact that non-regularity can have on the performance of statistical inference procedures, especially in small samples, and (ii) to develop tools for conducting theoretically valid statistical inference for non-smooth functionals. In particular, we aim to develop confidence intervals that deliver asymptotically correct coverage under both fixed and local alternatives. To construct confidence intervals, we first derive smooth, data-dependent, upper and lower bounds on the functional of interest and then approximate the distribution of the bounds using standard techniques. We then use estimated distributional features, such as the quantiles, to make inference for the original non-smooth functional. We leverage the smoothness of the bounds to obtain consistent inference under both fixed and local alternatives. This consistency is instrumental in ensuring good performance in both large and small samples. An important feature of these bounds is that they are adaptive to the underlying non-smoothness of the functional. That is, they are asymptotically tight when the generative distribution happens to induce sufficient smoothness.
</p>
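The recipe sketched in the abstract, bound a non-smooth functional by smooth data-dependent quantities, bootstrap the bounds, and read off quantiles, can be illustrated on the textbook non-regular functional &theta; = max(&mu;, 0). The sketch below is hypothetical and not the dissertation's actual procedure: the pretest threshold <code>c*sqrt(log n)</code>, the specific bounds, and the function name <code>adaptive_ci</code> are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_ci(x, alpha=0.05, n_boot=2000, c=1.0):
    """Illustrative sketch (not the dissertation's exact method):
    CI for the non-smooth functional theta = max(mean(x), 0).

    Bootstrapping the plug-in estimator directly is invalid near the
    kink (mean ~ 0). Instead we bootstrap upper/lower bounds that
    adapt via a pretest: far from the kink they coincide with the
    centered plug-in statistic (so the CI is asymptotically tight);
    near the kink they bracket it conservatively.
    """
    n = len(x)
    mu_hat = x.mean()
    se = x.std(ddof=1) / np.sqrt(n)
    theta_hat = max(mu_hat, 0.0)

    uppers, lowers = [], []
    smooth_regime = abs(mu_hat / se) > c * np.sqrt(np.log(n))  # pretest
    for _ in range(n_boot):
        xb = rng.choice(x, size=n, replace=True)
        mu_b = xb.mean()
        stat = max(mu_b, 0.0) - theta_hat  # centered plug-in statistic
        if smooth_regime:
            # Functional is effectively smooth here: bounds collapse
            uppers.append(stat)
            lowers.append(stat)
        else:
            # Non-regular regime: smooth bounds bracketing the statistic
            uppers.append(max(mu_b - mu_hat, 0.0))
            lowers.append(min(mu_b - mu_hat, 0.0))

    # Invert the bound quantiles to get a confidence interval
    lo = theta_hat - np.quantile(uppers, 1 - alpha / 2)
    hi = theta_hat - np.quantile(lowers, alpha / 2)
    return lo, hi

x = rng.normal(loc=1.0, scale=1.0, size=200)
lo, hi = adaptive_ci(x)
```

When the mean is far from zero the pretest selects the smooth regime, the bounds coincide, and the interval matches the ordinary bootstrap CI; near zero the interval widens, which is the adaptivity the abstract describes.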