I have seen it claimed in Hosmer & Lemeshow (and elsewhere) that least squares parameter estimation in logistic regression is suboptimal (does not lead to a minimum variance unbiased estimator). Does anyone know other books/articles that show/discuss this explicitly? Google and my personal library have not helped me here…

**Answer**

It is a well-known fact that if the model is parametric (that is, specified completely up to a finite number of unknown parameters) and certain regularity conditions hold, then maximum likelihood estimation is asymptotically optimal within the class of regular estimators. I have doubts about the UMVUE framing, though, since the MLE is rarely unbiased in finite samples.
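The classic illustration of MLE bias is the variance of a normal distribution: the MLE divides by $n$ and has expectation $\frac{n-1}{n}\sigma^2$. A minimal simulation sketch (the sample size, true variance, and replication count are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# For N(mu, sigma^2), the MLE of sigma^2 divides by n and is biased:
# E[sigma_hat^2] = (n - 1) / n * sigma^2.
n, sigma2, reps = 5, 4.0, 200_000
samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

mle_var = samples.var(axis=1, ddof=0).mean()       # MLE: divide by n
unbiased_var = samples.var(axis=1, ddof=1).mean()  # divide by n - 1

print(mle_var)       # close to (n - 1) / n * sigma2 = 3.2
print(unbiased_var)  # close to sigma2 = 4.0
```

With $n = 5$ the downward bias is a full 20%, even though the MLE is still consistent and asymptotically efficient.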

The question of *why* the MLE is optimal is rather deep; see, for example, Chapter 8 of van der Vaart's *Asymptotic Statistics*.

Now, it is known that least squares coincides with MLE if and only if the error terms in the regression are normally distributed (see the ordinary least squares article on Wikipedia). In logistic regression the response is Bernoulli, so the errors are *not normal*, and least squares will therefore be less efficient than MLE.
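The efficiency gap can be seen in a small Monte Carlo sketch: fit the same logistic model by MLE (Newton-Raphson/IRLS) and by nonlinear least squares on the fitted probabilities (Gauss-Newton), and compare the empirical variance of the slope estimates. The coefficients, sample size, and replication count below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def fit_mle(X, y, iters=25):
    """Logistic regression MLE via Newton-Raphson (IRLS)."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ b)
        W = p * (1 - p)  # Fisher weights
        b += np.linalg.solve(X.T @ (W[:, None] * X) + 1e-8 * np.eye(X.shape[1]),
                             X.T @ (y - p))
    return b

def fit_nls(X, y, iters=50):
    """Least squares fit of the logistic curve via Gauss-Newton."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ b)
        J = (p * (1 - p))[:, None] * X  # Jacobian of fitted probabilities
        b += np.linalg.solve(J.T @ J + 1e-8 * np.eye(X.shape[1]),
                             J.T @ (y - p))
    return b

true_b = np.array([0.5, 2.0])
n, reps = 300, 500
mle, nls = [], []
for _ in range(reps):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = rng.binomial(1, sigmoid(X @ true_b))
    mle.append(fit_mle(X, y))
    nls.append(fit_nls(X, y))
mle, nls = np.array(mle), np.array(nls)

print("empirical variance of slope, MLE:", mle[:, 1].var())
print("empirical variance of slope, LS :", nls[:, 1].var())
```

Both estimators are consistent here, but the least squares fit weights all observations equally while the MLE's IRLS weights are the inverse Bernoulli variances $p_i(1-p_i)$, so the least squares slope shows visibly larger sampling variance.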

**Attribution**
*Source: Link, Question Author: user9968, Answer Author: stpasha*