# Assumptions of generalized linear models

On page 232 of “An R companion to applied regression” Fox and Weisberg note

> Only the Gaussian family has constant variance, and in all other GLMs the conditional variance of $y$ at $\mathbf{x}$ depends on $\mu(\mathbf{x})$

Earlier, they note that the conditional variance of the Poisson is $\mu$ and that of the binomial is $\frac{\mu(1-\mu)}{N}$.
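To make these variance functions concrete, here is a minimal numerical illustration (in Python, purely for exposition); the value of $N$ and the grid of means are arbitrary choices, not from the book:

```python
# Variance functions V(mu) for the three families named above,
# evaluated at a few fitted means. N is the binomial number of
# trials (10 here, chosen arbitrarily for illustration).
import numpy as np

mu = np.array([0.1, 0.3, 0.5])        # fitted means (probabilities for binomial)
N = 10

var_gaussian = np.full_like(mu, 1.0)  # constant (sigma^2, taken as 1 here)
var_poisson = mu                      # Var(y | x) = mu
var_binomial = mu * (1 - mu) / N      # Var(y_bar | x) = mu(1 - mu)/N

print(var_poisson)   # [0.1 0.3 0.5]
print(var_binomial)  # [0.009 0.021 0.025]
```

Note that only the Gaussian row is flat: for the other two, the conditional variance moves with the mean.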

For the Gaussian, this is a familiar and often-checked assumption (homoscedasticity). Similarly, I often see the conditional variance of the Poisson discussed as an assumption of Poisson regression, together with remedies for cases when it is violated (e.g., negative binomial, zero-inflated models, etc.). Yet I never see the conditional variance for the binomial discussed as an assumption of logistic regression. A little Googling did not turn up any mention of it.

What am I missing here?

EDIT subsequent to @whuber's comment:

As suggested I am looking through Hosmer & Lemeshow. It is interesting and I think it shows why I (and perhaps others) are confused. For example, the word “assumption” is not in the index to the book. In addition, we have this (p. 175)

> In logistic regression we have to rely primarily on visual assessment, as the distribution of the diagnostics under the hypothesis that the model fits is known only in certain limited settings

They show quite a few plots, but concentrate on scatterplots of various residuals vs. the estimated probability. These plots (even for a good model) do not have the "blobby" pattern characteristic of similar plots in OLS regression, and so are harder to judge. Further, they show nothing akin to quantile plots.

In R, `plot.lm` offers a nice default set of plots for assessing linear models; I do not know of an equivalent for logistic regression, although one may exist in some package. This may be because different plots would be needed for each type of model. SAS does offer some plots in PROC LOGISTIC.

This certainly seems to be an area of potential confusion!