Abstract: This chapter discusses regularization methods, which can feasibly lead to simplified models even in the presence of an extremely large number of potential predictors. It discusses how these methods can be viewed as providing a tradeoff between accuracy of estimation of the expected response given the predictors and precision of this estimation. This bias-variance tradeoff is fundamentally connected to the tradeoff of fit versus complexity. Regularization methods proceed through shrinkage, in which estimates of regression slopes are shrunk towards or to zero. One such method, the lasso, along with its variants, provides model simplification by forcing estimated coefficients to be exactly zero. The chapter describes how model selection criteria can be used to control the sparsity that will be imposed on the fitted model. It also discusses forward stepwise regression and ridge regression.
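To make the shrinkage contrast in the abstract concrete, here is a minimal sketch (the chapter does not specify software; Python with scikit-learn, and all variable names and penalty values below, are illustrative assumptions). It shows that the lasso sets many estimated coefficients to exactly zero, while ridge regression shrinks coefficients towards zero without zeroing them out.

```python
# Illustrative sketch only: lasso vs. ridge shrinkage on simulated data.
# The data-generating setup and alpha values are assumptions, not taken
# from the chapter.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, p = 100, 20                       # many potential predictors
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]          # only 3 predictors truly matter
y = X @ beta + rng.normal(size=n)

# alpha controls the strength of the penalty, and hence the sparsity
# imposed on the fitted model (larger alpha -> more shrinkage).
lasso = Lasso(alpha=0.5).fit(X, y)
ridge = Ridge(alpha=0.5).fit(X, y)

print("lasso coefficients set exactly to zero:", np.sum(lasso.coef_ == 0.0))
print("ridge coefficients set exactly to zero:", np.sum(ridge.coef_ == 0.0))
```

In practice, the penalty strength would be chosen by a model selection criterion or cross-validation rather than fixed in advance, in line with the chapter's discussion of using such criteria to control sparsity.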
Publication Year: 2020
Publication Date: 2020-09-01
Language: English
Type: Other
Indexed In: Crossref