Title: Quantile function regression and variable selection for sparse models
Abstract: This article considers linear quantile regression and variable selection for high-dimensional data. In general, an ordinary quantile regression estimator is obtained for a single, fixed quantile level. The estimated coefficient therefore lacks continuity with respect to the quantile level, and the behaviour of the estimator and the estimated active variable set can change rapidly between different but sufficiently close quantile levels. To obtain a stable estimator for a given quantile level, this study proposes a new quantile regression method that estimates the coefficient as a function of the quantile level over a given region of interest, which we call quantile function regression. In quantile function regression, the coefficient function of the quantile level is approximated by a B-spline model, so the estimated conditional quantile is continuous because it is a B-spline curve. For variable selection, a group lasso-type sparse penalty is used to estimate the non-zero coefficient functions of the quantile level, which yields an estimated active set that remains unchanged over the given region. Quantile function regression can therefore achieve global variable selection. The proposed estimator exhibits an asymptotic rate of convergence and consistency in variable selection. Simulation studies and applications to real data further show that the proposed method performs well.
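The abstract describes the estimator as a check-loss fit in which each coefficient is a B-spline function of the quantile level, penalized by a group lasso that groups all spline coefficients of one covariate. Below is a minimal sketch of such a penalized objective, assuming a grid of quantile levels over the region of interest; the names (`bspline_basis`, `qfr_objective`, `Gamma`, `lam`) and the exact penalty weighting are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a quantile-function-regression-style objective (illustrative only).
import numpy as np
from scipy.interpolate import BSpline


def bspline_basis(taus, n_basis=6, degree=3):
    """Evaluate a clamped B-spline basis of the quantile level on a grid."""
    inner = np.linspace(taus.min(), taus.max(), n_basis - degree + 1)
    knots = np.r_[[inner[0]] * degree, inner, [inner[-1]] * degree]
    return np.column_stack(
        [BSpline(knots, np.eye(n_basis)[k], degree)(taus) for k in range(n_basis)]
    )  # shape (n_taus, n_basis)


def check_loss(u, tau):
    """Quantile (pinball) loss rho_tau(u)."""
    return np.maximum(tau * u, (tau - 1.0) * u)


def qfr_objective(Gamma, X, y, taus, B, lam):
    """Averaged check loss over the quantile grid plus a group-lasso penalty.

    Gamma : (p, n_basis) spline coefficients; row j defines beta_j(tau).
    Zeroing an entire row removes covariate j at every tau in the region,
    which is what gives the global variable selection described above.
    """
    loss = 0.0
    for m, tau in enumerate(taus):
        beta_m = Gamma @ B[m]                          # coefficient vector at tau
        loss += check_loss(y - X @ beta_m, tau).mean()
    penalty = lam * np.linalg.norm(Gamma, axis=1).sum()  # sum_j ||Gamma_j||_2
    return loss / len(taus) + penalty


# Toy usage: random design with only the first two covariates active.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -0.5, 0.0, 0.0, 0.0]) + rng.normal(size=n)
taus = np.linspace(0.1, 0.9, 17)
B = bspline_basis(taus)
Gamma0 = np.zeros((p, B.shape[1]))
print(qfr_objective(Gamma0, X, y, taus, B, lam=0.1))
```

Minimizing this objective over `Gamma` (e.g. by a proximal-gradient or ADMM routine) would give coefficient functions of the quantile level with whole covariates either retained or dropped across the entire region, mirroring the global variable selection the abstract describes.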
Publication Year: 2021
Publication Date: 2021-04-23
Language: en
Type: article
Indexed In: Crossref
Access and Citation
Cited By Count: 3