In general, I’m wondering whether it is ever better not to use orthogonal polynomials when fitting a regression with higher-order terms. In particular, I’m wondering about the use of R:
If poly() with raw = FALSE produces the same fitted values as poly() with raw = TRUE, and poly() with raw = FALSE solves some of the problems associated with polynomial regressions, then should poly() with raw = FALSE always be used for fitting polynomial regressions? In what circumstances would it be better not to use poly()?
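As a quick check of the fitted-values claim, here is a minimal sketch (the simulated data are purely illustrative):

```r
set.seed(1)
x <- runif(100)
y <- 1 + 2 * x - 3 * x^2 + rnorm(100)

fit_raw  <- lm(y ~ poly(x, 2, raw = TRUE))  # raw basis: x, x^2
fit_orth <- lm(y ~ poly(x, 2))              # orthogonal basis (raw = FALSE is the default)

all.equal(fitted(fit_raw), fitted(fit_orth))  # TRUE: the fitted values coincide
```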
Answer
Ever a reason? Sure; likely several.
Consider, for example, a case where I am interested in the values of the raw coefficients (say, to compare them with hypothesized values), and collinearity isn’t a particular problem. It’s much the same reason why I often don’t mean-center in ordinary linear regression (which is the linear orthogonal polynomial).
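For instance, if theory predicts a quadratic coefficient near -3, the raw fit reports a coefficient directly comparable to that value, while the orthogonal-basis coefficients live on a transformed scale. A minimal sketch, again with simulated data purely for illustration:

```r
set.seed(1)
x <- runif(100)
y <- 1 + 2 * x - 3 * x^2 + rnorm(100)  # true coefficients: 1, 2, -3

coef(lm(y ~ poly(x, 2, raw = TRUE)))  # on the original scale; comparable to (1, 2, -3)
coef(lm(y ~ poly(x, 2)))              # on the orthogonal basis; not directly comparable
```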
These aren’t things you can’t deal with via orthogonal polynomials; it’s more a matter of convenience, but convenience is a big reason why I do a lot of things.
That said, I lean toward orthogonal polynomials in many cases when fitting polynomials, since they do have some distinct benefits.
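One such benefit is numerical: when x sits away from zero, the raw powers are nearly collinear, while the orthogonal basis columns are uncorrelated by construction. A small sketch (the range of x here is an arbitrary choice):

```r
x <- seq(10, 20, length.out = 100)
cor(poly(x, 3, raw = TRUE))  # x, x^2, x^3 are almost perfectly correlated here
round(cor(poly(x, 3)), 10)   # orthogonal columns: identity correlation matrix
```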
Attribution
Source: Link, Question Author: user2374133, Answer Author: Glen_b