In general, I’m wondering if it is ever better not to use orthogonal polynomials when fitting a regression with higher-order terms. In particular, I’m wondering about the use of R:

If `poly()` with `raw = FALSE` produces the same fitted values as with `raw = TRUE`, and `raw = FALSE` solves some of the problems associated with polynomial regressions (such as collinearity among the powers of x), then should `raw = FALSE` always be used for fitting polynomial regressions? In what circumstances would it be better not to use `raw = FALSE`?
Ever a reason? Sure; likely several.
Consider, for example, a case where I am interested in the values of the raw coefficients themselves (say, to compare them with hypothesized values), and collinearity isn’t a particular problem. It’s much the same reason why I often don’t mean-center in ordinary linear regression (mean-centering being the degree-one analogue of using orthogonal polynomials).
These aren’t things you can’t deal with via orthogonal polynomials; it’s more a matter of convenience, but convenience is a big reason why I do a lot of things.
That said, I lean toward orthogonal polynomials in many cases when fitting polynomials, since they do have some distinct benefits.
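To see why the choice is about parameterization rather than fit, here is a minimal sketch (in Python with numpy, as an analogue of R's `poly(x, 2, raw = TRUE)` vs. `raw = FALSE`): the raw powers and an orthogonalized basis span the same column space, so least squares gives identical fitted values, while the coefficients themselves differ. The QR decomposition here stands in for R's orthogonal-polynomial construction; the specific data are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 0.5 * x - 0.2 * x**2 + rng.normal(scale=0.5, size=x.size)

# Raw basis [1, x, x^2]: columns are highly collinear for x > 0.
X_raw = np.column_stack([np.ones_like(x), x, x**2])

# Orthonormal basis spanning the same space (in the spirit of raw = FALSE).
Q, _ = np.linalg.qr(X_raw)

beta_raw, *_ = np.linalg.lstsq(X_raw, y, rcond=None)
beta_orth, *_ = np.linalg.lstsq(Q, y, rcond=None)

fitted_raw = X_raw @ beta_raw
fitted_orth = Q @ beta_orth

# Same fitted values, different (and differently interpretable) coefficients.
print(np.allclose(fitted_raw, fitted_orth))
print(np.allclose(beta_raw, beta_orth))
```

The raw coefficients directly estimate the intercept, linear, and quadratic terms, which is what makes them convenient to compare against hypothesized values; the orthogonal-basis coefficients do not, even though the fit is identical.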