Dissertations and Oral Preliminary Examinations

Speaker: Seyoung Park
Date: Tuesday, 10 September 2013, 1:00 PM
Location: 438 WH
Title: Interquantile regression with high dimensional covariates

Abstract: Much work has been done on quantile regression. Ordinary quantile regression estimation is asymptotically consistent when the number of predictors, p, is small, but it can be inconsistent if p grows with the sample size n. This motivates the use of penalization methods for model selection in quantile regression. Belloni and Chernozhukov (2011) consider l1-penalized quantile regression in high-dimensional sparse models and obtain non-asymptotic results. Jiang (2012) introduces an estimator that fits several quantiles simultaneously while penalizing inter-quantile differences as well as individual quantile coefficients, but provides no theoretical results for high-dimensional predictors. In this work, we consider joint quantile regression in high-dimensional sparse models, allowing both the number of quantiles, K, and the number of predictors, p, to grow with n. We provide asymptotic and non-asymptotic results on the penalized model selection and estimation procedures in the spirit of Jiang (2012), and study the oracle properties of adaptive penalization. We also examine model selection consistency and stability across quantiles, and compare adaptive shrinkage with thresholding in the quantile regression framework.

Flyer: /UMICH/stats/Home/News & Events/Dissertations and Oral Preliminary Examinations/Seyoung Park PreLim Flyer.pdf
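For readers unfamiliar with the penalties mentioned in the abstract, the display below is an illustrative sketch, not the exact formulation used in the talk or in Jiang (2012), of a joint quantile regression objective that penalizes individual quantile coefficients and inter-quantile differences. The quantile levels tau_1 < ... < tau_K, the tuning parameters lambda_1 and lambda_2, and the check loss rho_tau are assumed notation introduced here for illustration.

```latex
% Illustrative joint penalized quantile regression objective (assumed form):
% check-loss fits at K quantile levels, an l1 penalty on each coefficient
% vector, and a fused l1 penalty on differences between adjacent quantiles.
\min_{\beta^{(1)},\dots,\beta^{(K)}}
\sum_{k=1}^{K}\sum_{i=1}^{n}\rho_{\tau_k}\!\bigl(y_i - x_i^{\top}\beta^{(k)}\bigr)
\;+\;\lambda_1\sum_{k=1}^{K}\bigl\lVert\beta^{(k)}\bigr\rVert_1
\;+\;\lambda_2\sum_{k=2}^{K}\bigl\lVert\beta^{(k)}-\beta^{(k-1)}\bigr\rVert_1,
\qquad
\rho_{\tau}(u)=u\bigl(\tau-\mathbf{1}\{u<0\}\bigr).
```

In a formulation of this kind, a larger lambda_2 pushes neighboring coefficient vectors toward each other, which is one way to read the abstract's interest in stability across quantiles.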