Very different scale parameter estimates in Poisson regression

The background: I’m analysing survival data using a Poisson model. I’ve split the data on two time scales (attained age and calendar year). Attained age is modelled using flexible parametric functions, calendar year is a categorical variable, and I’ve also included the interaction terms between attained age and calendar year.
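For concreteness, here is a minimal sketch of this kind of fit in Python with statsmodels (not the actual analysis: the column names `events`, `pt`, `age` and `year`, and the B-spline basis for age, are assumptions made purely for illustration):

```python
# Hypothetical sketch of a Poisson model for split survival data.
# Assumed columns in the (already split) data set:
#   events - number of events in each record
#   pt     - person-time at risk in the record
#   age    - attained age for the record
#   year   - calendar period (categorical)
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("split_data.csv")  # data already split on both time scales

# Flexible parametric function of attained age (here a B-spline),
# calendar year as a factor, plus their interaction;
# log person-time enters as the offset.
model = smf.glm(
    "events ~ bs(age, df=4) * C(year)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["pt"]),
)
res = model.fit()
print(res.summary())
```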

The problem: looking at the goodness-of-fit criteria, I see that the deviance divided by the degrees of freedom (df) is 0.03, while Pearson’s χ² divided by the df is 5.84. The former being <1 and the latter >1 implies that the (unscaled) standard errors associated with my parameter estimates are deflated if I estimate the scale parameter using Pearson’s χ²/df, or inflated if I estimate it using deviance/df. This obviously leads to completely different results.
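To make the comparison concrete, continuing the hypothetical statsmodels sketch above (reusing `model` and `res` from it), the two scale estimates and the correspondingly rescaled standard errors could be computed like this:

```python
# Two competing estimates of the scale (dispersion) parameter.
dev_scale = res.deviance / res.df_resid          # deviance / df   (~0.03 in the question)
pearson_scale = res.pearson_chi2 / res.df_resid  # Pearson chi2/df (~5.84 in the question)
print(dev_scale, pearson_scale)

# statsmodels can rescale the covariance matrix by either estimate;
# the quasi-likelihood SEs are the nominal SEs times sqrt(scale).
res_pearson = model.fit(scale="X2")  # Pearson chi2 / df -> SEs inflated by sqrt(5.84)
res_dev = model.fit(scale="dev")     # deviance / df     -> SEs shrunk by sqrt(0.03)
print(res_pearson.bse / res.bse)     # ratio ~ sqrt(pearson_scale)
print(res_dev.bse / res.bse)         # ratio ~ sqrt(dev_scale)
```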

What could be wrong here?

Edit (04 Jul 2012): I think this question can also be reformulated as: with clustered data (i.e., dependent observations within each cluster), are the goodness-of-fit criteria for overdispersion (namely Pearson’s χ² divided by the residual degrees of freedom, and the deviance divided by the residual degrees of freedom) still interpretable? (With iid data, a value >1 indicates overdispersion, while a value <1 indicates underdispersion.)

Edit (19 Aug 2013): After delving into this topic a little more, I no longer think that the use of those goodness-of-fit criteria makes sense in this particular scenario. In fact, we are only exploiting the GLM framework to fit a survival model characterized by a non-constant baseline incidence rate (see also: piecewise exponential model). We are not assuming that the outcome has a Poisson distribution. For this reason, the question as reformulated in my last edit (04 Jul 2012) is also wrong, as we don’t actually have clustered data here. See this answer for more information.
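For readers who haven’t seen it, the standard argument behind this “Poisson trick” (assuming a constant rate $\lambda$ within each split interval) goes as follows: a split record with event indicator $d$ and person-time $t$ contributes $d\log\lambda - \lambda t$ to the survival log-likelihood, and the Poisson log-likelihood of a “count” $d$ with mean $\lambda t$ differs from this only by terms that do not involve $\lambda$:

$$ d\log(\lambda t) - \lambda t - \log d! \;=\; \underbrace{d\log\lambda - \lambda t}_{\text{survival contribution}} \;+\; \underbrace{d\log t - \log d!}_{\text{constant in }\lambda}. $$

The two likelihoods therefore give identical estimates and likelihood-based inference, even though the outcome need not be a genuine Poisson count, so there is no particular reason to expect moment-based dispersion diagnostics to sit near 1.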

Answer

Attribution
Source: Link, Question Author: boscovich, Answer Author: Community