Pros and cons
The advantages of the LARS method are:
# It is computationally just as fast as forward selection.
# It produces a full piecewise linear solution path, which is useful in cross-validation or similar attempts to tune the model (illustrated in the sketch below).
# If two variables are almost equally correlated with the response, then their coefficients should increase at approximately the same rate. The algorithm thus behaves as intuition would expect, and is also more stable.
# It is easily modified to produce efficient algorithms for other methods producing similar results, like the lasso and forward stagewise regression.
# It is effective in contexts where ''p'' >> ''n'' (i.e., when the number of predictors ''p'' is significantly greater than the number of points ''n'').
The disadvantages of the LARS method include:
# With any amount of noise in the dependent variable and with high dimensional multicollinear independent variables, there is no reason to believe that the selected variables will have a high probability of being the actual underlying causal variables. This problem is not unique to LARS, as it is a general problem with variable selection approaches that seek to find underlying deterministic components. Yet, because LARS is based upon an iterative refitting of the residuals, it would appear to be especially sensitive to the effects of noise. This problem is discussed in detail by Weisberg in the discussion section of the Efron et al. (2004) Annals of Statistics article. Weisberg provides an empirical example, based upon re-analysis of data originally used to validate LARS, showing that the variable selection appears to have problems with highly correlated variables.
# Since almost all high dimensional data in the real world will, just by chance, exhibit some degree of collinearity across at least some variables, the problem that LARS has with correlated variables may limit its application to high dimensional data.
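The second advantage can be seen directly: a single fit yields every breakpoint of the piecewise linear path, and each candidate model along the path can then be scored on held-out data. Below is a minimal sketch using scikit-learn's lars_path function; the simulated data, the train/validation split, and the error measure are illustrative assumptions rather than part of the method's description.
<syntaxhighlight lang="python">
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = X[:, 1] + 0.5 * X[:, 4] + 0.1 * rng.normal(size=200)

# Hold out part of the data and compute the full path once on the rest.
X_tr, X_val, y_tr, y_val = X[:150], X[150:], y[:150], y[150:]
alphas, active, coefs = lars_path(X_tr, y_tr, method="lar")

# Every breakpoint of the piecewise linear path comes from this single fit,
# so each candidate coefficient vector can be scored on the held-out data.
val_errors = [np.mean((y_val - X_val @ coefs[:, k]) ** 2)
              for k in range(coefs.shape[1])]
print(np.argmin(val_errors))  # index of the best-scoring breakpoint
</syntaxhighlight>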
Algorithm
The basic steps of the Least-angle regression algorithm are:
* Start with all coefficients equal to zero.
* Find the predictor <math>x_j</math> most correlated with <math>y</math>.
* Increase the coefficient <math>\beta_j</math> in the direction of the sign of its correlation with <math>y</math>. Take residuals <math>r = y - \hat{y}</math> along the way. Stop when some other predictor <math>x_k</math> has as much correlation with <math>r</math> as <math>x_j</math> has.
* Increase (<math>\beta_j</math>, <math>\beta_k</math>) in their joint least squares direction, until some other predictor <math>x_m</math> has as much correlation with the residual <math>r</math>.
* Increase (<math>\beta_j</math>, <math>\beta_k</math>, <math>\beta_m</math>) in their joint least squares direction, until some other predictor has as much correlation with the residual <math>r</math>.
* Continue until all predictors are in the model (a code sketch of these steps follows).
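A minimal sketch of these steps in Python/NumPy is given below. The function name lars_sketch, the small simulated example, and the assumption of centered, unit-norm columns are illustrative choices; ties, numerical safeguards, and the lasso modification are omitted, so this is a sketch rather than a reference implementation.
<syntaxhighlight lang="python">
import numpy as np

def lars_sketch(X, y):
    """Sketch of the LARS steps above, assuming X has centered, unit-norm
    columns and y is centered. Returns the coefficients after each step."""
    n, p = X.shape
    beta = np.zeros(p)
    active = []                                  # predictors currently in the model
    path = [beta.copy()]
    for _ in range(p):
        r = y - X @ beta                         # current residual
        c = X.T @ r                              # correlations with the residual
        inactive = [j for j in range(p) if j not in active]
        j = max(inactive, key=lambda k: abs(c[k]))  # most correlated predictor enters
        active.append(j)
        Xa = X[:, active]
        # joint least squares direction for the active coefficients
        delta = np.linalg.lstsq(Xa, r, rcond=None)[0]
        a = X.T @ (Xa @ delta)                   # rate at which each correlation changes
        C = abs(c[j])                            # common correlation of the active set
        # smallest step at which some inactive predictor catches up;
        # gamma = 1 is the full least squares step (used once all predictors are in)
        gamma = 1.0
        for k in range(p):
            if k in active:
                continue
            for g in ((C - c[k]) / (C - a[k]), (C + c[k]) / (C + a[k])):
                if 1e-12 < g < gamma:
                    gamma = g
        beta[active] += gamma * delta
        path.append(beta.copy())
    return np.array(path)

# Example on a small random design with standardized columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5)); X -= X.mean(0); X /= np.linalg.norm(X, axis=0)
y = X[:, 0] + 0.5 * X[:, 2] + 0.01 * rng.normal(size=50); y -= y.mean()
print(lars_sketch(X, y)[-1])  # final step equals the ordinary least squares fit
</syntaxhighlight>
At each pass the active set grows by one predictor, and the coefficients move toward the joint least squares fit of the active set only as far as needed for the next predictor to catch up in correlation with the residual; the final full-length step reaches the ordinary least squares fit.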
Software implementation
Least-angle regression is implemented in R via the lars package, in Python with the scikit-learn package, and in SAS via the GLMSELECT procedure.
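A brief usage sketch of the scikit-learn implementation is shown below; the simulated data and the n_nonzero_coefs setting are arbitrary illustrative choices.
<syntaxhighlight lang="python">
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = X[:, 0] - 2 * X[:, 3] + 0.1 * rng.normal(size=100)

# Fit LARS, stopping once five predictors have entered the model.
model = Lars(n_nonzero_coefs=5).fit(X, y)
print(model.coef_)       # final coefficients
print(model.coef_path_)  # piecewise linear coefficient path visited by the algorithm
</syntaxhighlight>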
See also
References
{{DEFAULTSORT:Least-Angle Regression}}
[[Category:Estimation theory]]
[[Category:Parametric statistics]]
[[Category:Regression variable selection]]
[[Category:Single-equation methods (econometrics)]]